Screens:
- Firebase login
- file upload
- validation by schema
- advanced relationship validation
- export to SQL tables

Frontend: Angular 6 + Ant Design. Backend: Node + Express, with AWS S3 to persist the CSVs. Database: MongoDB, with CSV import via mongoimport.
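The "validate by schema" step above can be sketched in plain Python before wiring it into the upload flow. This is a minimal sketch, not the project's real schema: the column names (`id`, `email`, `amount`) and per-column rules are placeholder assumptions.

```python
import csv
import io

# Hypothetical schema: column name -> validator. These names and rules
# are illustrative placeholders, not the project's actual schema.
SCHEMA = {
    "id": lambda v: v.isdigit(),
    "email": lambda v: "@" in v,
    "amount": lambda v: v.replace(".", "", 1).isdigit(),
}

def validate_csv(text):
    """Return a list of (line_number, column, bad_value) for failing cells."""
    errors = []
    reader = csv.DictReader(io.StringIO(text))
    missing = set(SCHEMA) - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    for n, row in enumerate(reader, start=2):  # line 1 is the header
        for col, check in SCHEMA.items():
            if not check(row[col]):
                errors.append((n, col, row[col]))
    return errors

sample = "id,email,amount\n1,a@b.com,9.99\n2,not-an-email,3\n"
print(validate_csv(sample))  # -> [(3, 'email', 'not-an-email')]
```

Reporting line numbers alongside the failing column lets the upload screen show users exactly which rows to fix before anything reaches the database.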
...working on our website, which provides a highly scalable SaaS platform that helps clients run marketing campaigns. It contains three primary components: - a large, searchable database of influential media profiles (140,000,000 and growing), with detailed information on each; - campaign management (comparable to a CRM combined with a more complicated
...as in the customers’ data centres. The position runs for 1 year with a possible extension, and the candidate will be on probation for the first 3 months. Solid experience with MS SQL Server 2008 to 2014 (5+ years) and a proven data integration background are required....
400$ per month, 8 hrs per day (12 months of work). Looking for a full-time developer; a good opportunity for a startup company or a pure freelancer. We will prefer someone from Mumbai, who can get extra cash and work in our office. Start immediately. Add 400$ as the first word while applying so I know you have read the
...video with more than 500 products to be listed, described, titled, priced, and imaged, all collected in a database for uploading to a website. Can you do the conversion from a normal video recording to a complete listing? I need the database in Excel, not uploaded online; I will do the upload. None of the images will be taken from the video
...EETewVPzg7N75 Database • To an existing webpage, set up the linked database structure: [log in to see URL] • The requirement on the database is that we can mass-administer it (insert/delete/update) to hold hundreds of videos (loading with a CSV file or similar, and using SQL statements to update/delete
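The mass-administration requirement above (bulk CSV load, then plain SQL for updates and deletes) can be sketched as follows. SQLite stands in for whatever database the site actually uses, and the `videos` table with `id`/`title`/`url` columns is an assumed schema, not the real one.

```python
import csv
import io
import sqlite3

# SQLite as a stand-in for the site's database; the table and column
# names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE videos (id INTEGER PRIMARY KEY, title TEXT, url TEXT)")

# Bulk load from a CSV file (here an inline string for self-containment).
csv_data = "id,title,url\n1,Intro,https://example.com/1\n2,Demo,https://example.com/2\n"
rows = csv.DictReader(io.StringIO(csv_data))
conn.executemany(
    "INSERT INTO videos (id, title, url) VALUES (:id, :title, :url)", rows
)

# Updates and deletes are then ordinary SQL statements.
conn.execute("UPDATE videos SET title = 'Updated demo' WHERE id = 2")
conn.execute("DELETE FROM videos WHERE id = 1")
print(conn.execute("SELECT id, title FROM videos").fetchall())  # -> [(2, 'Updated demo')]
```

Because `csv.DictReader` yields dicts, the rows feed straight into `executemany` with named placeholders, so scaling the load from two rows to hundreds of videos is just a bigger CSV file.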
I have a system where users evaluate records against different criteria and save/modify them on demand. The data set they evaluate against (stored in SQL Server tables) is 100K+ records. The evaluation is time-intensive unless I evaluate first and save the results against their criteria. Only bid if you have a solution or have worked on something similar
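The precompute-and-save idea described above can be sketched as: run the expensive evaluation once per criteria set, persist the verdicts keyed by a criteria id, and serve later requests from the saved table. SQLite stands in for SQL Server here, and every table, column, and criteria name is an illustrative assumption.

```python
import sqlite3

# SQLite as a stand-in for the SQL Server tables; schema names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, score INTEGER)")
conn.execute("CREATE TABLE results (criteria_id TEXT, record_id INTEGER, passed INTEGER)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, i) for i in range(1, 101)])

def evaluate_and_save(criteria_id, min_score):
    """Run the expensive evaluation once and persist the per-record verdicts."""
    conn.execute("DELETE FROM results WHERE criteria_id = ?", (criteria_id,))
    conn.execute(
        "INSERT INTO results SELECT ?, id, score >= ? FROM records",
        (criteria_id, min_score),
    )

def saved_passes(criteria_id):
    """Cheap lookup against the precomputed results, no re-evaluation."""
    return conn.execute(
        "SELECT COUNT(*) FROM results WHERE criteria_id = ? AND passed = 1",
        (criteria_id,),
    ).fetchone()[0]

evaluate_and_save("c1", min_score=50)
print(saved_passes("c1"))  # -> 51 (records with score 50..100)
```

When a user modifies their criteria, only `evaluate_and_save` needs to rerun for that `criteria_id`; all other reads stay cheap lookups on the `results` table.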
We have built a custom portal that uses a database and has some integrations with [log in to see URL], a MySQL database, etc. There are many files that our web developer used, but we cannot figure out how to make changes to them because he has since left the organization.
The solution I am after requires data engineering work: querying a PostgreSQL database of 20 million records (2 crores). In the ideal scenario, a query should return 100,000 records in under 5 seconds. I am looking for someone experienced in Apache Spark who is able to show a proof of concept before developing a full
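A runnable Spark-on-PostgreSQL snippet needs a cluster and a live database, so the sketch below illustrates only the core principle behind the 5-second target: the selective predicate must hit an index rather than scan all 20 million rows. SQLite stands in for PostgreSQL, and the `events` table, `region` column, and index name are all made-up assumptions. In Spark itself, the analogous levers are the JDBC reader's `partitionColumn`/`numPartitions` options plus predicate pushdown to the database.

```python
import sqlite3

# SQLite stand-in: show that a selective query uses the index instead of
# a full table scan. Table, column, and index names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, region TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events (region, payload) VALUES (?, ?)",
    [("r%d" % (i % 200), "x") for i in range(10_000)],
)
conn.execute("CREATE INDEX idx_events_region ON events(region)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE region = 'r7'"
).fetchall()
print(plan)  # the plan detail should mention idx_events_region, not a full scan
```

The same check exists in PostgreSQL as `EXPLAIN ANALYZE`; verifying an index (or partition) scan on the filter column is a sensible first step of any proof of concept before layering Spark on top.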
We have a data science project we are working on and are looking for a database of as many UK addresses as we can find. They can be residential or commercial; we just need valid address/zip combinations that all pass Google's autofill API check (just make sure they are real). Looking for as much data as we can get. Millions of records