
Closed
Paid on delivery
Backend Engineer – Batch Ingestion Pipeline (CSV → Database)

We are building a large-scale book content platform. We are NOT looking for a full-stack developer, UI, or website. We need a backend engineer to build a simple, reliable batch ingestion pipeline.

SCOPE
Your task is to build a CLI-based ingestion script that:
- Reads a CSV file with book metadata
- Inserts records into a PostgreSQL database (Supabase)
- Handles 10,000–20,000 records efficiently
- Uses batch inserts (no per-row inserts)
- Runs outside HTTP (terminal / cron / worker)
No frontend. No dashboard. No UI.

TECH STACK (Flexible)
- Node.js or Python
- PostgreSQL
- CSV input
- Environment variables for DB connection

OUT OF SCOPE
- Real-time processing
- Web UI
- File upload interfaces
- EPUB/PDF conversion
- SEO, frontend, or design

DELIVERABLES
- Clean, documented script
- Instructions to run locally or on a server
- Example CSV format
- Proof that batch insertion works at scale

IDEAL PROFILE
- Experience with data pipelines or batch jobs
- Comfortable handling large datasets
- Understands database performance
- Focused on reliability, not UI
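For illustration, a minimal example of the CSV layout the deliverables ask for; the column set below is an assumption, since the posting does not specify a schema:

    isbn,title,author,publisher,published_year,language
    9780000000001,Example Title One,Jane Doe,Example House,2021,en
    9780000000002,Example Title Two,John Roe,Example House,2019,en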
Project ID: 40213606
54 proposals
Remote project
Active 7 days ago
54 freelancers are bidding on average $150 USD for this job

Hi there, I’ve built similar batch ingestion pipelines that process large volumes of data efficiently. For example, I developed a product that ingested 1 million records from Amazon and eBay, using a combination of cron jobs and serverless functions to handle the workload without impacting server performance. I can use either Node.js or Python for this task, depending on which language you prefer; I’m equally proficient in both. With my extensive experience in backend development, I can deliver a robust solution that meets your requirements. Let’s schedule a quick 10-minute call to discuss your project in more detail and ensure I fully understand your needs. I’m looking forward to hearing more about this exciting project. Best, Adil
$115.41 USD in 7 days
7.2

Hi, I can build a CLI-based batch ingestion pipeline that reliably reads large CSV files and inserts 10k–20k+ book records into PostgreSQL (Supabase) using efficient batch inserts, with no per-row operations, no HTTP, and no UI.

What I’ll deliver:
- Clean Node.js or Python CLI script (your choice)
- Optimized batch inserts for high-volume CSV ingestion
- Robust error handling & logging
- Environment-based DB configuration
- Sample CSV + clear run instructions
- Proof of performance at scale

Why I’m a good fit:
- Strong experience with backend batch jobs & data pipelines
- Deep understanding of PostgreSQL performance
- Focused on reliability, simplicity, and maintainability

Quick questions:
- Do you have a finalized DB schema already in Supabase?
- Any special constraints (upserts, deduplication, validations)?

Ready to start immediately.
$250 USD in 7 days
6.7
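As a concrete illustration of the environment-based DB configuration this bid lists, here is a minimal Python sketch; the single DATABASE_URL variable name is an assumption, not something specified in the posting:

    import os
    import psycopg2  # common PostgreSQL driver; other clients work too

    # Read the connection string from the environment, never hard-code it.
    # DATABASE_URL is an assumed name, e.g.
    # postgresql://user:pass@db.<project>.supabase.co:5432/postgres
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    conn.autocommit = False  # commit once per batch, not per row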

Hi, I can build the code for this. I have 8 years of experience in software engineering and have worked with more than 112 clients. Let's connect. Looking forward.
$100 USD in 2 days
6.4

⭐Hello [ClientFirstName], I’m ready to assist you right away!⭐ I believe I'd be a great fit for your project since I have extensive experience in backend development and data pipelines. My technical skills in Node.js, Python, and PostgreSQL align perfectly with the requirements of building a reliable batch ingestion pipeline efficiently. I have successfully handled large datasets and optimized database performance to ensure scalability and reliability. Furthermore, I have a proven track record in developing clean and documented scripts that meet the project's specifications. My focus on backend development and database management makes me well-suited to deliver a solution that excels in batch insertion, providing proof of scalability and efficiency. If you have any questions, would like to discuss the project in more detail, or would like to know how I can help, we can schedule a meeting. Thank you. Maxim
$30 USD in 4 days
5.5

As a dedicated and experienced web development professional, I understand the incredible value of a solid, reliable data pipeline when it comes to managing large-scale platforms. My expertise in languages like JavaScript and Node.js, and my long-standing familiarity with database management (especially PostgreSQL), make me the perfect candidate for your Batch Ingestion Pipeline project. In my 9+ years in the field, I have worked on projects that required processing huge datasets efficiently, and I have consistently delivered clean, well-documented codebases that prioritize reliability and performance - essential attributes for a task of this nature. My proficiency with CSV input files and with running scripts independently, such as via terminal or cron, will also ensure seamless integration of the pipeline into your workflow. I believe in delivering value to my clients, both in terms of sound technology solutions and cost-effectiveness; my offer of ongoing support after project completion is just one example of that commitment.
$140 USD in 7 days
5.4

I can build a fast CLI-based CSV → PostgreSQL (Supabase) ingestion pipeline with true batch inserts, handling 20k+ records efficiently, using env-based DB config, clean logging, and full run instructions plus sample CSV + scale proof.
$140 USD in 1 day
4.9

With deep proficiency in Python and extensive experience in backend development, specifically in managing large datasets and building data pipelines, I believe I am the ideal fit for your Batch Ingestion Pipeline project. My expertise in Node.js and PostgreSQL, along with a firm grasp of DB performance optimization, will ensure seamless batch insertion of 10,000–20,000 records from CSV into your Supabase database. Having worked on numerous CLI-based scripts and cron jobs, I'm well versed in designing reliable systems outside HTTP, which aligns perfectly with one of the core constraints of this project. Furthermore, my focus on reliability rather than UI matches exactly what you require. I guarantee a clean, fully documented script with clear instructions for easy local or server setup, and I'll even provide an example CSV format to ensure interoperability. Overall, my skill set covers all the crucial aspects you need for this project: efficient handling of large datasets, a deep understanding of database performance, and an unwavering commitment to building a highly reliable backend system tailored to your needs. Sit back and relax as I turn your vision of a simple but potent batch ingestion pipeline into reality!
$30 USD in 3 days
4.6

I have built many ingestion pipelines like this using Python. For 20k records, simple looping is too slow, so I will implement a bulk insert approach (using Postgres COPY or batch execute) to get this done in seconds, not minutes. I can include a validation step to check the CSV structure before it touches your Supabase database. Does the script need to update existing records if a book ISBN already exists (upsert), or should it just skip duplicates?
$120 USD in 1 day
4.6
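For reference, a minimal sketch of the batched-insert approach this bid describes, using psycopg2's execute_values; the books table, its columns, and the isbn conflict target are assumptions for illustration:

    import csv
    import os

    import psycopg2
    from psycopg2.extras import execute_values

    BATCH_SIZE = 1000  # rows per multi-row INSERT round-trip

    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    with conn, conn.cursor() as cur, open("books.csv", newline="", encoding="utf-8") as f:
        rows = [(r["isbn"], r["title"], r["author"]) for r in csv.DictReader(f)]
        # One multi-row INSERT per BATCH_SIZE rows; duplicate ISBNs are skipped.
        execute_values(
            cur,
            "INSERT INTO books (isbn, title, author) VALUES %s "
            "ON CONFLICT (isbn) DO NOTHING",
            rows,
            page_size=BATCH_SIZE,
        )
    conn.close()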

Hello, I can build a clean, reliable CLI-based batch ingestion script that reads large CSV files and inserts records efficiently into PostgreSQL using proper batch inserts. I have experience working on data pipelines and background jobs that handle tens of thousands of records with a focus on performance, reliability, and clear error handling. The script will run outside HTTP, use environment variables for configuration, and include simple documentation, an example CSV format, and proof of batch performance at scale. I’m comfortable using either Node.js or Python based on your preference. Are you available to proceed and confirm the stack choice?
$250 USD in 7 days
4.4

Hello! I’m a backend engineer with hands-on experience building batch ingestion pipelines and data-processing scripts that run reliably outside of HTTP (CLI, cron, workers). I’ve handled CSV-to-PostgreSQL pipelines at the 10k–100k record range, focusing on batch inserts, transaction safety, and predictable performance. For this task, I’d build a simple CLI script (Node.js or Python) that validates and streams CSV input, groups records into configurable batches, and inserts them efficiently into Supabase/Postgres using prepared statements or COPY where appropriate. The script will be idempotent-friendly, well-documented, and easy to run locally or on a server via env vars. No UI, no fluff—just a reliable ingestion job with clear instructions and proof it performs at scale. Best Regards!
$180 USD in 5 days
4.0
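The streaming-and-batching pattern this bid describes could look like the following Python sketch; function and parameter names are illustrative, and only one batch is held in memory at a time:

    import csv
    from itertools import islice

    def read_batches(path, batch_size=1000):
        """Yield lists of CSV rows, batch_size at a time,
        without loading the whole file into memory."""
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            while True:
                batch = list(islice(reader, batch_size))
                if not batch:
                    return
                yield batch

    # Usage: wrap each batch in its own transaction so a failure
    # rolls back at batch granularity (insert_batch is hypothetical).
    # for batch in read_batches("books.csv", batch_size=2000):
    #     insert_batch(conn, batch)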

Hello, I can build your batch ingestion pipeline that reliably processes large datasets from source to destination with proper scheduling, error handling, logging, and performance optimization. I’ll ensure clean data workflows, scalability, and maintainable code so your backend can handle high-volume ingestion smoothly. Regards, Bharti
$140 USD in 7 days
4.0

With over 10 years of experience under my belt, I am confident that I have the necessary skills, expertise, and knowledge to execute this project effectively and efficiently. My proficiency in Node.js and Python is particularly well suited to your project needs. Having created numerous batch jobs and data pipelines in the past, I am comfortable dealing with large datasets and possess a deep understanding of database performance. In addition to my technical capabilities, my emphasis on reliability meshes nicely with your project requirements. Although there may not be a UI for this particular assignment, it is important to have a strong underlying infrastructure that will perform reliably as you scale your platform. My extensive exposure to cloud services such as AWS will help ensure seamless deployments and optimized performance. Given my proven track record in designing agentic AI solutions that act autonomously while addressing specific business needs, I am both willing and eager to optimize content ingestion for your platform. Choose me, Rishi Kumar, because as a seasoned professional I can deliver on all fronts: excellent technical skills, extreme attention to detail, and the ability to create future-proof solutions.
$140 USD in 7 days
4.0

Hi, I reviewed your project about "Batch Ingestion Pipeline Development" and noticed that you're working with batch inserts into a PostgreSQL database via a CLI script. That tells me the main challenge here is ensuring efficient and reliable batch processing for large datasets without per-row insertion overhead. I’ve worked on similar backend data pipeline projects where I:
- designed scalable backend APIs,
- implemented secure authentication and data models,
- and delivered production-ready web/mobile features.
For your project, I’d suggest starting with a Node.js or Python script that reads CSV data into memory and performs PostgreSQL batch inserts using parameterized queries. This approach minimizes latency and avoids locking issues, keeping the ingestion process performant at scale. Before moving forward, I have one quick question: could you clarify the average row size and structure of your CSV files so I can optimize batch sizes and memory usage? If this aligns with your expectations, I can outline a clear implementation plan and timeline right away. Best regards, Nilo
$160 USD in 10 days
3.2

Hello, I’m a backend-focused engineer with strong experience building reliable data ingestion and batch processing systems, and your project scope is a great fit for my skill set.

How I’ll approach this: I will build a CLI-based ingestion script that:
- Reads large CSV files containing book metadata
- Validates and parses data safely
- Performs efficient batch inserts into PostgreSQL (Supabase)
- Handles 10,000–20,000+ records without performance issues
- Runs independently via terminal, cron job, or worker process

No UI, no unnecessary layers, just a clean, reliable backend pipeline.
$165 USD in 2 days
2.4

Hi, I came across your project and would like to offer my services. From my understanding, you need a terminal/command-line based solution to import CSV data into your database table. I can help you create a script - bash/R/Python - to import the CSV data into the database table. I believe you should have a robust data ingestion solution that checks key constraints and performs an upsert with audit trails, instead of inserting data directly. Kindly contact me to talk more about your requirements.
$140 USD in 7 days
2.6

Hello, thanks for posting this project. I've read your requirements and this batch ingestion pipeline is a great fit for my skills. My experience is centered around designing robust backend data pipelines and optimizing large-scale insertions in PostgreSQL, particularly for metadata or catalog platforms. I'm fully comfortable building CLI-based scripts in Node.js or Python, using best practices for batch operations, and ensuring the solution is both efficient and easy to maintain. Clean documentation and actionable setup instructions will be provided, as well as clear evidence of the pipeline's performance at scale. Could you share an example schema or sample CSV headers so I can better align the ingestion logic with your data structure? Looking forward to hearing from you. Warm regards, Vitalii.
$140 USD in 1 day
2.2

Hi there, I understand you need a backend engineer to build a CSV-to-PostgreSQL batch ingestion pipeline where reliability and efficiency are key. Here's my approach:
* CLI-based Node.js script.
* Efficient batch inserts into Supabase using PostgreSQL's optimized features.
* Robust error handling & logging.
* Detailed documentation, example CSV, and proof of scale.
* First draft/prototype within 2 days.
* All source files included.
* Unlimited revisions.
Why choose me: I have specific expertise in backend systems and database engineering with Node.js and PostgreSQL, plus experience building data pipelines.
Quick questions:
1. Is the CSV format fixed?
2. What are the key performance metrics?
I can start immediately. Let's discuss the details. Best regards, Team Mactix - Custom Software Dev
$30 USD in 7 days
2.4

With a dedicated focus on backend engineering and over 10 years of professional experience, I bring a comprehensive skill set that aligns excellently with your project needs. I have a deep understanding of Node.js and JavaScript, which will be indispensable in building your CLI-based ingestion script. My proficiency in handling large datasets and optimizing database performance will ensure an efficient transfer of 10,000–20,000 records into the PostgreSQL database. In addition to my technical skills, I pride myself on my commitment to reliability and clean documentation. I fully appreciate the nuanced nature of your project - prioritizing batch inserts and eschewing per-row inserts - and can assure you that my scripts are always well documented and easily understandable for seamless integration into your existing infrastructure. Furthermore, my team believes in delivering not just code but also viable solutions to the specific challenges faced by businesses. We understand the importance of deadlines and are adept at agile project management, guaranteeing timely completion without compromising on quality. In making your decision, I urge you to consider my proven technical capabilities, paired with my cohesive approach to problem-solving, as an investment in a truly successful pipeline development experience.
$140 USD in 7 days
2.0

Hello, I can build a CLI-based batch ingestion script exactly as described, focused purely on backend data processing, not UI.

What I’ll deliver:
- Script (Python or Node.js) to read CSV book metadata
- Efficient batch inserts into PostgreSQL (Supabase)
- Handles 10,000–20,000+ records reliably
- Runs via terminal / cron / worker
- Uses environment variables for DB config

Approach:
- Chunked CSV reading (memory-safe)
- Batched inserts (COPY / execute_batch)
- Proper transactions & error handling

Deliverables:
- Clean, documented script
- Example CSV format
- Run instructions (local & server)
- Proof of batch insertion at scale

No frontend, no UI, no real-time processing: exactly within scope. Happy to start with a quick POC or discuss Python vs Node.js. Best regards, Shubham
$200 USD in 2 days
2.0
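The COPY path this bid mentions can be sketched with psycopg2's copy_expert, which streams the file into Postgres and is usually the fastest option for plain inserts (table and column names are assumptions; note that COPY by itself cannot express upserts):

    import os
    import psycopg2

    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    with conn, conn.cursor() as cur, open("books.csv", encoding="utf-8") as f:
        # HEADER true tells Postgres to skip the CSV's header line.
        cur.copy_expert(
            "COPY books (isbn, title, author) "
            "FROM STDIN WITH (FORMAT csv, HEADER true)",
            f,
        )
    conn.close()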

Hi, I’m a backend engineer experienced in building reliable data pipelines and batch ingestion scripts. I can create a CLI-based script in Python or Node.js that reads CSV files with book metadata and performs efficient batch inserts into PostgreSQL (Supabase) for 10,000–20,000 records, without per-row operations. The script will be clean, documented, configurable via environment variables, and easily runnable from terminal, cron, or a worker process. I’ll provide instructions, example CSVs, and proof of large-scale insertion. My focus is on performance, reliability, and maintainability, and I’m comfortable handling large datasets while ensuring database efficiency. Best Regards, Kainat
$40 USD in 7 days
1.7
