AWS Lambda scraping jobs
...referenced link to develop an AWS CDK (Cloud Development Kit) application using TypeScript. This application will deploy a monitoring solution for stopped ECS (Amazon Elastic Container Service) tasks. Our main focus is capturing and querying exit codes in CloudWatch Logs, so the solution must provide comprehensive details about each task, including the exit code and stop reason. Deliverables: - TypeScript codebase for the AWS CDK application. - Deployment documentation with step-by-step instructions (README file). - Guidelines for querying CloudWatch Logs to retrieve ECS stopped-task information. - Optional: recommendations for future improvements or optimizations. Completion Criteria: - The CDK application is successfully ...
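As a sketch of the querying guideline this posting asks for — assuming stopped-task events (e.g. ECS Task State Change events forwarded via an EventBridge rule) are already landing in a log group, which is an assumption, not something the posting confirms — a CloudWatch Logs Insights query along these lines surfaces the exit code and stop reason:

```
fields @timestamp, detail.taskArn, detail.stoppedReason, detail.containers.0.exitCode
| filter detail.lastStatus = "STOPPED"
| sort @timestamp desc
| limit 50
```

The `detail.containers.0.exitCode` path assumes a single-container task; multi-container tasks would need each index inspected.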
I am on the lookout for a seasoned EKS Kubernetes expert. The primary tasks I need help with include deployment and scaling, configuration and management, troubleshooting, debugging, and key upgrades. The EKS upgrades should ideally give our system improved performance, as well as apply important bug fixes and patches. The Kubernetes platform I'm utilizing runs in the cloud, specifically on AWS, Azure, and GCP. Ideal Skills & Experience: - Proficiency in deploying, scaling, and managing Kubernetes clusters - Expertise in troubleshooting and debugging on Kubernetes - Familiarity with the AWS, Azure, and GCP platforms - Proven track record of successful Kubernetes upgrades improving pe...
We are looking for a DevOps Engineer to join our team of technical consultants. You would be involved in solving technical problems and providing daily support. The task is primarily based on: 1. AWS Transit Gateway 2. Fortinet Firewall 3. AWS Solution Architect Role Also, you will be responsible for designing, building, and deploying cloud-based infrastructure and ensuring that the development and deployment process runs smoothly. You will also be responsible for maintaining and improving the security, reliability, and scalability of our systems. If you think you are the right candidate for this position, please submit a proposal describing how you can help with this project.
I need an expert in web scraping, either in Python or via a Google Chrome extension, to extract text data from a particular website. It should be delivered within 3 hours. The scraping is a one-time task, not requiring routine updates, and the extracted data needs to be saved in an Excel file (or CSV). The data is hidden behind a login wall, so I suggest that the Python script navigate to the website and then wait one minute, during which I will manually log in. Ideal Skills and Experience: - Proficiency in Python for web scraping - Experience with scraping text specifically - Ability to deliver the required output in CSV format
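The manual-login flow described above can be sketched as follows. This is a minimal outline, not a full implementation: the browser automation itself (e.g. Selenium) is stood in for by a `fetch_rows` callable, and all names here are illustrative, not from the posting.

```python
import csv
import time

def scrape_after_manual_login(fetch_rows, login_wait_seconds, out_path):
    """Pause so the operator can log in by hand, then save scraped rows to CSV.

    fetch_rows is a callable returning a list of dicts; the actual page
    scraping (e.g. via Selenium) happens inside it and is out of scope here.
    """
    time.sleep(login_wait_seconds)  # the operator logs in during this window
    rows = fetch_rows()
    if not rows:
        return 0
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

In practice `login_wait_seconds` would be the one-minute window the posting mentions; it is a parameter here so the flow is testable.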
I require the expertise of a skilled data scraper who can quickly retrieve text data from a specified website. I need the following specific information delivered in Microsoft Excel format: - Country Name - Grand Lodge Name - Website address - Email address - Phone - Address - Grand Master name (if available) The ideal freelancer for this project would have significant experience in data scraping, particularly of text data, and an understanding of the website's structure to retrieve the desired information effectively. Speed and efficiency are of the essence. Your ability to deliver this project promptly without compromising accuracy will win you this bid.
I am in need of a Python script that can extract Amazon's official keywords and search terms, and update this information on a daily basis. This data includes keyword search volume, keyword competition, and top-ranking keywords. Key tasks: - Extracting Amazon's official keywords and search terms - Ensuring the script is capable of updating this data on a daily basis - Retrieving keyword search volume, keyword competition, and top-ranking keywords - Aiming for improved product visibility, enhanced product listings, and optimized PPC campaigns The ideal freelancer for this project should have: - Proficiency in Python - Familiarity with web scraping and data extraction - Experience with Amazon's API or web scraping tools - Understanding of keyword research and e-commerce optimization Please only bid if you can provide a high-quality and efficient Python script ...
I'm in need of a proficient developer who can build me a data grabber. Ideally, you should have experience in creating software that retrieves specific text data from various websites. The tool should be capable of grabbing data from one page per site, no need for multiple page crawling. Your experience in web scraping and data mining would be advantageous, as well as having: - Profound knowledge in programming languages suitable for web scraping (Python, Java, etc.) - Familiarity with websites structure and HTML - Experience with data extraction tools and software - Proven background in creating custom data grabbers Your role will not only involve the creation of the tool but also ensuring it functions well without crashing or skipping important data. If you can del...
I need an experienced Android developer to create an IoT app. - The primary function of the app will be asset tracking within a distributed control system. Ideal Skills: - Java based development for Android application - Able to use the resources/SDK at Expected output: - Very basic app to demo one message publish to AWS topic and receive one message from subscribed topic - Demonstrate the mobile app connecting to AWS IoT and Sub-Pub on a topic - Use TLS certs (client, private and root) Looking forward to your bids.
I'm seeking a proficient DevOps developer who can skilfully set up a Linux-based Virtual Private Server (VPS) on Amazon Web Services (AWS), configure Nginx, and successfully run WebSocket code written in PHP. Your responsibilities will cover: - Configuration of Nginx to ensure optimal performance Given that the expected daily user traffic is low (fewer than 10K users), the setup should be optimized for this range. Ideal Skills and Experience: - Extensive experience with VPS and Linux - Profound knowledge of Nginx configuration - Experience with handling low-traffic servers - Handling Ratchet WebSockets is important Your expertise in these areas will be invaluable to ensuring the seamless running and maintenance of the server. I am coun...
Description: ...development and deployment on the AWS cloud. Requirements: Integration with Video Meeting Platforms: The program should connect to popular video meeting platforms including Zoom, Microsoft Teams, and BigBlueButton. It should support joining meetings as a participant and capturing the audio stream from these meetings. Audio Capture: The program must capture audio from the meetings in real-time. Streaming to Transcription Engine: Once the audio is captured, the program should stream it to the transcription engine. Integration for audio streaming should be through a WebSocket connection. Concurrent Handling: The program should be capable of handling at least 15 concurrent audio streams from different meetings. AWS Deployment: The program should be deployed ...
...skilled and experienced Freelance DevOps Engineer to join our dynamic team. This role is ideal for someone passionate about automating and improving development and production environments, and freeing up developers from the complexities of DevOps tasks. Your expertise in AWS, containerization, orchestration, continuous integration, and continuous deployment will be pivotal in our operations. **Key Responsibilities:** - Design, implement, and manage CI/CD processes and pipelines. - Deploy, automate, maintain, and manage AWS cloud-based production systems to ensure availability, performance, scalability, and security. - Problem solve and troubleshoot production issues, ensuring swift resolution. - Manage Docker container setups and Kubernetes orchestrating the deployment, s...
Conduct market research on AI applications in a chosen niche. Select any niche and pose relevant research questions. Use skills like data scraping, APIs, and data cleansing to gather information. Generate a comprehensive report including three graphs.
I'm on the lookout for an experienced no-code specialist who can efficiently design a robust solution for my project. The primary task entails developing a Chat-GPT system that can promptly respond to emails. Key Requirements: - Based on a SaaS solution ( ...significant experience dealing with no-code solutions for overwhelming workload management Skills & Experience: - Proficiency in no-code programming - Previous experience with email automation - Extensive understanding of GPT models - Strong problem-solving aptitude - Demonstrable project completion records Your role will be pivotal in providing a detailed process: - domain name creation (from AWS or Google) - DNS email settings - no-code platform: detailed step-by-step setup, with screenshots. Any good idea...
I need an expert to plan, scale, and execute the setup of a Windows Terminal Server to accommodate 50-70 users. The task is expected to be completed within 1 - 3 months. Key Responsibilities: - Plan and execute the setup of a Windows Terminal Server for 50-70 users. - Ensure the server is scalable for potential future growth - Implement be...setting up Windows Terminal Servers for medium to large user bases - Strong understanding of Windows Server and Terminal Services - Proficiency in planning and executing Windows Server setups - Ability to optimize system performance and scalability - Familiarity with security best practices in Terminal Server environments - Deep understanding of AWS services relevant to hosting Windows servers, such as EC2, RDS, IAM, and VPC, as it will be hos...
...flexible schedule to accommodate part-time work hours. The ideal candidate will possess a strong background in QuickBooks Desktop integration, with proficient coding skills in C# and experience working with Salesforce, XML, and AWS. This role requires approximately 10-15 hours of commitment per month. Key Requirements: Proven experience in integrating QuickBooks Desktop with various platforms. Strong programming skills in C#. Experience working with Salesforce. Proficient in XML for data manipulation and integration. Familiarity with the Amazon Web Services (AWS) environment. Ability to work independently and manage time effectively. Comfortable committing 15-20 hours per month. This part-time QuickBooks Integration Specialist role offers flexibility in terms of wo...
I'm looking for a Linux server administrator with experience in AWS to help maintain and update our servers. Key Responsibilities: - Regular maintenance: We need someone to ensure all the servers are running smoothly. This includes troubleshooting any performance issues and addressing them promptly. - Updates: Keeping all software, including the Linux systems and other server software, up to date is crucial for security and performance. We are looking for a Linux server administrator with expertise in AWS to manage our server infrastructure. The responsibilities include ensuring the smooth operation of the servers and addressing any issues that may arise within our custom-built system. We will provide comprehensive knowledge and training about our system. The workload is expected ...
I need someone to scrape data from a webpage. This is a simple data scraping project that requires extracting specific information from a webpage. The details of the specific data are not provided in this form, but I can provide them directly to interested freelancers. Ideal skills and experience: - Proficient in web scraping techniques - Experience with data extraction from websites - Attention to detail to ensure accuracy of scraped data - Ability to follow direction and deliver according to requirements Please note: I didn't specify the data that needs to be scraped in this form, but will provide it to the freelancer directly.
Need to deploy a simple Flask and Next.js project using the AWS EC2 free tier. I will provide you with dev access to my AWS account and will share the code with you. Then write step-by-step documentation, from SSHing into the instance all the way to confirming it is working, like the example provided in the attachment.
I'm searching for an experienced AWS CDK Developer with strong TypeScript skills. - Your main task will be to set up a CDK infrastructure that builds a CodePipeline. - The infrastructure must be set up in a way that is efficient, scalable, and easy to maintain. Your expertise in TypeScript will be crucial as you will need to write the CDK code using this language. In addition, you will need to integrate the CodePipeline with our GitLab repository. Ideal Skills and Experience: - Proficiency in TypeScript - Prior experience with AWS CDK - Strong understanding of CodePipeline - Experience with GitLab integration - Strong problem-solving skills to ensure the infrastructure is scalable and efficient. Please provide examples of similar projects you've completed. ...
...looking to automate a series of web scraping tasks using Puppeteer, and wish to build an API to manage and trigger these tasks. I plan to use AWS Lambda to execute these scripts and API Gateway to provide access to the functions. Key requirements for the project include: - Setting up AWS Lambda to run Puppeteer scripts for web scraping. - Configuring those scripts to automate data gathering and downloading. - Creating an API on API Gateway to manage the tasks, trigger the scripts, and return results. - It should separate out (bifurcate) any social media links on the website. - It should return JSON with the website text, links, and social media link text. The ideal candidate would be a developer with: - Experience in AWS Lambda and API Gateway. - Prior wo...
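The "bifurcate social links and return JSON" requirement above can be sketched as pure response-shaping logic, independent of Puppeteer. This is a hedged sketch: the domain list and field names are illustrative assumptions, and the actual page scraping is assumed to have happened upstream.

```python
import json
from urllib.parse import urlparse

# Illustrative set of social platforms; the posting does not specify which ones.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "x.com", "instagram.com",
                  "linkedin.com", "youtube.com", "tiktok.com"}

def bifurcate_links(page_text, links):
    """Split scraped links into social vs. other and wrap everything in a
    Lambda-proxy-style JSON response."""
    social, other = [], []
    for link in links:
        host = urlparse(link).netloc.lower().removeprefix("www.")
        (social if host in SOCIAL_DOMAINS else other).append(link)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "website_text": page_text,
            "links": other,
            "social_media_links": social,
        }),
    }
```

The `statusCode`/`body` envelope matches the shape API Gateway proxy integrations expect from a Lambda handler.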
I need support for a day-to-day Python project. Skills required: Python, AWS cloud, Pandas. Long-term support needed.
It's simple AWS EC2-related work. You must have knowledge of AWS and EC2 deployment.
I'm on the hunt for a skillful coder proficient in web scraping, specifically for Website A. The goal of this project is for obtaining valuable and significant data for analysis, competitive intelligence, and price comparison. While there's no explicit mention of the type of data to be scraped, I anticipate that a capable offering will cater to a variety of needs such as: - Text content - Image content - Metadata Ideal Experience and Skills: - Proven experience with web scraping - High proficiency in handling large data sets - Comfort in navigating Website A - Ability to collect comprehensive data categories Looking forward to a potential collaboration to achieve this goal!
...Learning Engineer who can migrate our models from RStudio to AWS SageMaker. Your task will be to: - Understand the RStudio models I have running locally. It is one logistic regression model, plus several other scripts for the data transformation part, all in R. - Migrate the RStudio models to AWS SageMaker. Data will be on S3. - Prepare the models to run entirely on SageMaker, so that we can do training and testing 100% on SageMaker. The models are already running on a local computer, but I need to move them to SageMaker completely; the data is on S3 already. - You need to configure and prepare SageMaker end to end, and teach me how you did it, since I need to replicate it in another system. - I will give you the data and access to AWS. Ideal Skills and Experience:...
...Level: Expert, experienced people only. I seek an experienced developer to build a server-based application for automatically securing delivery blocks offered through the Amazon Flex platform. The goal is to programmatically monitor for and secure time slots before other users are able to reserve them manually. Drawing from examples like Compinche, Flexbee, and Thunderflex, the solution should incorporate web scraping and notification capabilities to effectively grab blocks in a timely manner. Professional proficiency working with similar programs that automate processes through APIs or web monitoring is required. Strong Python or similar scripting skills are necessary to develop the network-level monitoring and response automation required to outpace manual reservation. Solid website develo...
I'm looking for a skilled developer to create a Firefox extension (add-on) that will use both simple scraping and Chat GPT API in order to send customized prompts (based on the scraped elements) and receive responses. The workflow will be very simple: 1. The user will come to a website and start the extension, which will extract 5 different text elements (these will be pre-selected). 2. After that, the extension will automatically modify the prompt (input the extracted data to the template prompt) and then send it to Chat GPT API. 3. It will display the response from Chat GPT API.
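The prompt-templating step in the workflow above (insert the five scraped elements into a template, then send it to the chat API) can be sketched as follows. The element names, template wording, and model name are placeholders I've assumed, not part of the posting; the request body follows the common chat-completions format.

```python
import json

# Hypothetical template; the five field names stand in for the
# pre-selected page elements mentioned in the posting.
PROMPT_TEMPLATE = (
    "Summarise this page. Title: {title}. Heading: {heading}. "
    "Price: {price}. Description: {description}. Footer: {footer}."
)

def build_chat_request(elements):
    """Fill the template with scraped elements and build the JSON body
    for a chat-completion request."""
    prompt = PROMPT_TEMPLATE.format(**elements)
    return json.dumps({
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    })
```

In the extension, the returned string would be the POST body of the fetch call to the API; displaying the response is then just rendering `choices[0].message.content` from the reply.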
Hi, I want somebody to build an alerting solution that will send me an email instantly each time a crypto project creates contracts on more than one of the following chains. With the same contract address. BnB ETH Sol Arb Avax Base Optimism I also want the data to output to a spreadsheet also in real time. In addition I want alerts and emails sent when; 1. liquidity is added to one of the contract addresses that meets the above criteria 2. Liquidity is removed from one of the contract addresses that meets the above criteria. I would like the routine to run regularly.
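The core matching rule above (same contract address appearing on more than one of the listed chains) can be sketched as a small aggregation. Note this is a hedged sketch of the detection logic only: feeding it live deployment events, sending email, and writing the spreadsheet are separate concerns, and in practice a literally identical address across EVM chains and Solana is unusual, since Solana addresses use a different format.

```python
from collections import defaultdict

# Chains named in the posting.
CHAINS = {"BnB", "ETH", "Sol", "Arb", "Avax", "Base", "Optimism"}

def find_multichain_contracts(deployments):
    """deployments: iterable of (chain, contract_address) pairs.

    Returns {address: set_of_chains} for every address seen on more
    than one tracked chain -- the condition that should fire an alert.
    """
    seen = defaultdict(set)
    for chain, address in deployments:
        if chain in CHAINS:
            seen[address.lower()].add(chain)
    return {addr: chains for addr, chains in seen.items() if len(chains) > 1}
```

The liquidity-added/removed alerts would then be a second pass restricted to the addresses this function returns.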
In view of capturing a comprehensive B2B database, I am seeking an experienced web scraper who can compile a list of Building Mate...Louisiana, and Mississippi. Task Essentials: - Collate business data including the Business Name, Business Address, Phone Number, and Email. - Scraping must be performed on specific websites that I will provide. Ideal Skills & Experience: - Expertise in web scraping - Exposure to B2B data collection - Familiarity with the Building Material Supply/Lumber Sales industry - Knowledge of ethical web scraping practices. Your sound abilities coupled with your effective communication skills will be the beacon for this project's success. I look forward to interacting with insightful professionals that are keen to display their skills...
...with Python (Flask), Vue3, Docker and an AWS account, then this is the job you've been looking for. If you're associated with an agency, this is not the job you've been looking for - we are only interested in independent freelancers. No one associated with an agency will be considered. We have a rare opening for a full-time, long-term position working on a variety of projects both in-house and for clients. Our agile approach means that you can work pretty much whenever you like since getting things done is more important than simply showing up. You'll be working with and on our standard architecture, which is Vue3 + pinia on the frontend and Python + Flask + Docker on the backend. Please note that this position requires that you have your own AWS ac...
As a client, I'm on the hunt for a professional Cloud Engineer who excels in AWS Sagemaker and RStudio. Responsibilities: - Primarily, your responsibility will rest on deploying a R project to AWS. Skills and Experience: - A concrete understanding of data preprocessing and model training is necessary, although the principal task is model deployment. - Sound comprehension of deploying a model that utilizes clustering, in the realm of unsupervised learning, will be advantageous. - Prior experience with AWS Sagemaker and RStudio is of the essence. The perfect candidate will be well versed in the advanced use of AWS Sagemaker and RStudio to successfully deploy a supervised learning model.
In this project, I am seeking an expert in data mining who is proficient in using Python libraries such as BeautifulSoup. Your task is to scrape names and contact details from numerous business directories. Ideally, you will have: - Experience using Python for web scraping, especially with BeautifulSoup. - An understanding of different business directory structures and the ability to adapt your scraping methods as necessary. - Exceptional attention to detail to ensure all data scraped is accurate and relevant. - Respect for the privacy and ethical standards associated with data extraction. Your role will involve: - Identifying key data on business directories: primarily names and contact details. - Creating efficient, reusable, and reliable Python code to scrape this inf...
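The posting asks for BeautifulSoup; as a self-contained illustration of the same extraction idea, here is a standard-library sketch using `html.parser`. The `class="name"` hook is a hypothetical directory layout, since real directories vary, and the email regex is a basic shape check, not a full RFC validator.

```python
import re
from html.parser import HTMLParser

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

class DirectoryParser(HTMLParser):
    """Collect text from elements with class 'name' (an assumed layout)
    plus any email addresses found anywhere in the page text."""

    def __init__(self):
        super().__init__()
        self.names, self.emails = [], []
        self._in_name = False

    def handle_starttag(self, tag, attrs):
        if ("class", "name") in attrs:
            self._in_name = True

    def handle_endtag(self, tag):
        self._in_name = False

    def handle_data(self, data):
        if self._in_name and data.strip():
            self.names.append(data.strip())
        self.emails.extend(EMAIL_RE.findall(data))
```

With BeautifulSoup the same extraction would be `soup.select(".name")` plus the regex over `soup.get_text()`; the adaptation the posting asks for is swapping the selector per directory.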
...management, data analytics, and financial or capital markets applications. - Experience with third-party service and API development and integration. - Strong problem-solving skills and an eye for detail. - Excellent communication and project management abilities. - Preferable knowledge of Angular, C#, and Python for the backend, and MySQL. - The SaaS must be easily managed to run on cloud providers like Azure or AWS. If you think you're the right person for this job, I'd love to hear from you. Figma: Sample of one of the 3 profiles:
I immediately need an experienced freelancer to scrape specific numerical data from a website, deliver it in JSON or XML files, and update it every 500 ms. The project needs to be completed as soon as possible. Key requirements for the job are: - Proven experience in JavaScript - A strong understanding of and prior experience with data scraping, specifically numeric data. - Availability to start the project right away and ensure its swift completion. These skills will be paramount to successfully getting the wanted information quickly and effectively.
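The 500 ms update cadence above is mostly a scheduling problem. The posting asks for JavaScript; as a language-neutral sketch of the timing logic (sleep only for the remainder of each interval so drift stays bounded), here is a Python outline with the actual fetch stubbed out as a callable:

```python
import json
import time

def poll(fetch_value, out_path, interval_s=0.5, iterations=3):
    """Fetch a value on a fixed cadence and persist the history as JSON.

    fetch_value stands in for the real scrape; interval_s=0.5 matches
    the 500 ms requirement in the posting.
    """
    results = []
    for _ in range(iterations):
        start = time.monotonic()
        results.append({"ts": time.time(), "value": fetch_value()})
        with open(out_path, "w", encoding="utf-8") as f:
            json.dump(results, f)
        # Sleep only for what's left of the interval, never a negative amount.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, interval_s - elapsed))
    return results
```

In JavaScript the same bounded-drift pattern would use `setTimeout` rescheduled against a monotonic timestamp rather than a bare `setInterval`.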
...looking for a proficient developer with expertise in React.js and Node.js to design and implement an innovative web application with a dual focus. The primary functionality should be searching and aggregating information, united with web scraping and text extraction features. The application should: - Incorporate a connection to search APIs: Google, Bing and others. - Generate search parameters primarily based on keywords, plus an additional layer incorporating combination match words such as 'fraud', "charge" etc. - Develop an intelligent AI web scraping feature to extract text and info based on search results. - Consider additional features such as search parameters that can be controlled via data sources, sorting options and language filters. Ideal...
I need software that can provide me with daily updates of opening and closing odds from a similar site. Ideal Skills & Experience: - Proficient in web scraping - Expertise in data extraction and manipulation - Prior experience with similar odds comparison websites - Strong software development skills - Ability to deliver a detailed project proposal The software should be capable of extracting and presenting the opening and closing odds from these three specified betting companies in a format that I can load onto my website. Please include a detailed proposal in your application, outlining how you plan to achieve this.
I. Project Description I'm looking for a freelancer to create an automated newsletter system that works with PDF files. This system will: 1. Data Scraping: Extract data from the given PDFs, including information in tables and XBRL format. 2. Keyword Analysis: Analyse the extracted data based on keywords I provide and remove PDFs that don’t meet the criteria. 3. AI Summarisation: Use AI to get summaries of relevant sections identified through keyword analysis after connecting it to an API. 4. Newsletter Generation: Put together a newsletter that includes: Summaries of the data points with references to the original PDF files along with the original table and URL links of the PDF scraped II. Requirements • Experience with PDF processing libraries like PyPDF2 or Apa...
Currently, I'm working with Python Selenium, attempting to create a script that can consistently log in to websites like vfsglobal, such as "", every minute without triggering their IP blocking mechanism (which kicks in after about 5 attempts from the same IP). I've purchased proxies from and integrated them into my script, but unfortunately, they only seem to work for HTTP, not HTTPS. I'm curious about a solution to this problem. Is it feasible to set up a server (like Google Cloud or AWS) that utilizes a unique IP address for each request? Any insights you could provide would be greatly appreciated.
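On the question above: the usual pattern is rotating a pool of proxies per request rather than one server IP. A minimal standard-library sketch of the rotation (the proxy URLs are placeholders, and no network call is made here):

```python
from itertools import cycle
from urllib.request import ProxyHandler, build_opener

def make_rotating_opener_factory(proxies):
    """Return a callable that yields (proxy, urllib opener) per request,
    cycling through the proxy pool.

    Note: for https:// URLs the proxy must support the CONNECT method;
    an HTTP-only proxy will fail on HTTPS targets, which is likely the
    symptom described in the posting.
    """
    pool = cycle(proxies)

    def next_opener():
        proxy = next(pool)
        return proxy, build_opener(ProxyHandler({"http": proxy, "https": proxy}))

    return next_opener
```

With Selenium the equivalent is launching the browser with a `--proxy-server` argument chosen from the same cycling pool, one proxy per session.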
I need to upload on-premises log files to an OpenSearch Service domain in AWS.
I’m building a new app via: - VS Code + Github - AWS: Amplify, S3, DynamoDB, Cloudfront I don’t need someone to build, but to explain to me how I can build myself.
We are seeking a skilled web scraping expert to scrape html pages and convert them into a pdf file. The selected candidate will be responsible for creating a web scraper script that can automatically extract and organize data from various websites. The data should be neatly formatted into a pdf document for easy accessibility. The website is in which there are various links - section - 1, section - 2, etc. The text which is present when a link is opened is to be scraped. There may be an odd number as well, like section - 5A. Also, there are pages which will give more links/sections (there are upto 93 pages). When the section is opened, there may be footnotes. The text when the footnote is opened is to be scraped if possible.
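The section-link traversal described above hinges on recognising labels like "section - 2" and the odd "section - 5A". A small sketch of that piece, with the regex as an assumption inferred from the examples in the posting (the crawling, footnote fetching, and PDF generation are separate steps):

```python
import re

# Matches "section - 1", "Section - 2", and letter-suffixed ones like "section - 5A".
SECTION_RE = re.compile(r"section\s*-\s*(\d+[A-Z]?)", re.IGNORECASE)

def extract_section_labels(link_texts):
    """Pull section identifiers from a list of link texts,
    preserving first-seen order and dropping duplicates."""
    seen, labels = set(), []
    for text in link_texts:
        for label in SECTION_RE.findall(text):
            if label not in seen:
                seen.add(label)
                labels.append(label)
    return labels
```

The crawler would run this over each of the (up to 93) listing pages, fetch each section's text and footnotes, and hand the collected text to a PDF library for the final document.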
Description: I'm seeking a skilled Python developer proficient in web automation and scraping to assist with a project involving logging into vfsglobal links, such as , an infinite number of times without triggering website blocking mechanisms. The freelancer should possess the expertise necessary to circumvent the website's restrictions, as repeated logins (typically 4-10 times per day) currently result in login prevention. Requirements: Proficiency in Python web automation and scraping Experience with handling website login restrictions and bypassing measures Ability to integrate proxies into the script (proxies will be provided) Attention to detail and commitment to delivering high-quality, functional code Willingness to undergo testing and revisions
..."Cricket Exchange." Budget: Our budget for this project is approximately 1.5 lacs INR, and the inclusion of the mandatory floating widget is non-negotiable. Technology Stack: Mobile App Development: Native (iOS and Android) or Cross-platform (React Native, Flutter). Backend: Node.js, Django, Ruby on Rails, or any preferred backend technology. Database: MongoDB, MySQL, or PostgreSQL. Cloud Services: AWS, Google Cloud, or Azure. Monetization Strategies: Advertisements: Banner ads and interstitial ads. Premium Subscription: Offer an ad-free experience and additional features. In-App Purchases: Introduce exclusive content or features for users. We invite qualified developers to submit detailed proposals by 28th Feb 2024. We look forward to reviewing your proposals and em...
I'm looking for a skilled professional who can copy the complete content of a website and save it in a zip file to be deployed on a new server. The website is a small WordPress site with about 41 pages (as per the sitemap). Content must not be saved as WordPress, but as static HTML to be deployed on another server. Key Requirements: - Copy all website content including subpages. - Save the copied content in a compressed zip file. - The purpose of this copy is to save the content for future use. - All files must be included, especially but not limited to images, JS, CSS, ... - Inner paths must be relative paths. If you're confident in your ability to deliver this and can ensure all the content is correctly saved, please bid accordingly. IMPORTANT READ THIS: Auto-generated bids wil...
I have the project with me; it just needs to be deployed on AWS.
I'm looking for a professional with advanced skills in data mining and Excel to extract specific text information from a website into an Excel spreadsheet. This job requires an understanding of website data scraping. The successful candidate should have: - Proven experience with web scraping. - Expert knowledge of data mining and Excel. - The ability to handle large sets of information efficiently and effectively. - Accuracy and attention to detail. - Knowledge of web scraping ethics. The task primarily involves text data mining, here are the specifics: - Extract exact information from a website. - The exact text data to be mined will be communicated to the successful candidate. This opportunity is ideal for someone with a knack for managing large amounts...
...for further details: > Python (or a python package) is to be used everywhere it possibly can! **Data Collection** Data can be collected (legally) from anywhere. You may use data that you already have; or from sites that allow you to download the data, for example, [UCI Machine Learning Repository]() or [Kaggle Datasets](); or via web scraping; or via an API. We can restrict ourselves to data that would fit nicely into a spreadsheet. The content and amount of data are not the main consideration, as long as the data has: - at least 10 variables - three or more data types - two or more problems: missing data, inconsistencies, errors, categorical data that needs to be converted to numeric, entries like text that need to be converted into
I'm in need of a professional who can scrape specific data from a website onto an Excel spreadsheet. The requirement is to list information in a table format. Key Requirements: - Scraping of content descriptions from the website - Transferring the extracted data into an Excel database The data should be organized in a clear and structured manner, following the format I will provide.
I'm seeking an experienced data scraper to collect and compile contact information, specifically email addresses and fax machine numbers, for doctor's offices in Hamilton, Ontario. The desired freelancer will have: - Proven experience in data scraping and data entry. - Ability to independently source from multiple platforms or websites. - Knowledge and capability to deliver the data in an Excel spreadsheet format. Primary Tasks: - Data scraping for email addresses and fax numbers across various sources. - Ensure accurate, complete data entry into the Excel spreadsheet. - Verification of compiled data to ensure its accuracy. This project is ideal for freelancers with a keen attention to detail and a dedication to data accuracy and integrity. If the quality of...
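The verification step above (ensure compiled contacts are accurate before they go into the spreadsheet) can be partially automated with shape checks and de-duplication. A hedged sketch, with deliberately loose regexes, since a basic email/fax shape check is a filter, not proof a contact is live:

```python
import re

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")
# Loose fax/phone shape: optional +, then at least 7 digits/spacing chars.
FAX_RE = re.compile(r"^\+?[\d\s().-]{7,}$")

def clean_contacts(records):
    """Keep records whose email and fax pass basic shape checks,
    de-duplicated by email (case-insensitive)."""
    seen, cleaned = set(), []
    for rec in records:
        email = rec.get("email", "").strip().lower()
        fax = rec.get("fax", "").strip()
        if EMAIL_RE.match(email) and FAX_RE.match(fax) and email not in seen:
            seen.add(email)
            cleaned.append({"email": email, "fax": fax})
    return cleaned
```

The cleaned list can then be written out with `csv.DictWriter` (or an Excel writer) in the spreadsheet format the client specifies.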
I'm looking for an adept professional who can create an Android virtualization solution. This solution should be flexible enough to be used on micromachines on AWS Amazon, or on any other system, via Docker or Kubernetes. The end goal is to enable running multiple simultaneous Android cell phones. Key Aspects: - It must support Google Play, Google Drive, and specifically, WhatsApp. - The main purpose of this project is virtual phone farming. - The solution should automate the process of sending and receiving WhatsApp messages. - It should allow editing the version, resolution, m...