BigQuery Jobs
...strict requirements for the delivery of the project. Architecture & Stack Mobile App (B2C): cross-platform development (Flutter or React Native) for iOS/Android. Web Platform (B2B): React.js or Vue.js web application with an analytics dashboard. Backend & Data: Python (Django/FastAPI) for the algorithmic logic. Database: "Data Warehouse"-style architecture (PostgreSQL + BigQuery/Snowflake) able to handle millions of transaction rows. Critical Features to Build The ETL (Extract-Transform-Load) Pipeline: an automated script that cleans and anonymises raw data from the B2C app into the B2B data warehouse. Imperative: irreversible...
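The anonymisation imperative above can be sketched in a few lines. This is a minimal illustration, assuming a keyed one-way hash counts as "irreversible enough" for the B2B warehouse; the field names (`email`, `phone`, `full_name`) and the salt are invented, since the real B2C schema is not given:

```python
import hashlib
import hmac

# Hypothetical PII columns; replace with the real B2C export schema.
PII_FIELDS = {"email", "phone", "full_name"}
SALT = b"rotate-me-per-deployment"  # placeholder secret, keep out of source control

def anonymise(record: dict) -> dict:
    """Drop or one-way-hash PII before a row leaves the B2C app.

    A keyed HMAC (rather than the raw value) lets the warehouse still join
    on a stable pseudonymous ID while the original value stays unrecoverable
    without the key.
    """
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hmac.new(SALT, str(value).encode(), hashlib.sha256)
            out[key + "_hash"] = digest.hexdigest()
        else:
            out[key] = value
    return out

row = {"email": "a@b.c", "amount": 12.5}
clean = anonymise(row)
```

Note that truly irreversible deletion (as opposed to pseudonymisation) would simply drop the PII fields instead of hashing them.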
...Google BigQuery. My current priority is data querying and retrieval, specifically: • Basic SELECT statements • Join operations • Aggregate functions with GROUP BY I’d like a mentor who can demonstrate each concept directly in BigQuery, assign short practice tasks, and review my work so I see where to improve. Explanations should stay clear and practical, gradually adding optimisation tips and BigQuery-specific features once the core techniques sink in. By the end of our time together I should be able to write clean SELECTs, build reliable joins, summarise data confidently, and understand how query plans reveal efficiency issues. A small set of completed exercises stored in a shared repo will serve as proof of progress. If you have expe...
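The three concepts this post prioritises (SELECT, JOIN, aggregates with GROUP BY) can be practised offline before a mentor session. The sketch below uses Python's built-in sqlite3 with toy tables; these particular statements run essentially unchanged as BigQuery Standard SQL, though BigQuery-specific features would differ:

```python
import sqlite3

# Toy tables standing in for BigQuery tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);
""")

# SELECT + JOIN + aggregate with GROUP BY, all in one statement.
rows = con.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()
# → [('EU', 2, 12.5), ('US', 1, 3.0)]
```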
...re-gridded to 1 km, stored in GCS & BigQuery partitioned by date. Training pipeline (Vertex AI custom job) Model: Temporal Fusion Transformer (PyTorch Lightning) – encoder 12 days, decoder 60 days. Physics-aware loss (optionally embed ERA5 precipitation as residual). Hyper-parameter search to beat ECMWF HRES RMSE by ≥ 15 % on held-out 2023 global data. Target skill: ≥ 80 % relative MAE reduction vs. baseline climatology (we will verify with an independent rain-gauge split). Serving layer FastAPI container, autoscaling Vertex AI Endpoint (GPU T4, min 0). /predict – JSON in {"cube_id": int, "past_seq": [[float]]} → {"precip_mm_day": [60 values]}, latency < 200 ms p99. /health + automatic CI/CD via Cloud Build. Infrastructur...
...can refresh without manual steps. On the Alpaca side I need clean, version-controlled code that signs, submits, and monitors orders through their REST and streaming APIs, then routes any relevant events back into the app. Key deliverables I will be checking against: • A working code sample or module that authenticates to Google Cloud, proves access (for example, pulling a test dataset from BigQuery or writing to Cloud Storage), and can be dropped into my app. • A parallel module that logs in to Alpaca, places a dummy trade in paper mode, streams order updates, and surfaces the response to the UI. • Clear, step-by-step documentation covering credential creation, environment variables, and any third-party libraries, so I can reproduce the setup on another mach...
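As a rough sketch of the Alpaca module's first step, the snippet below builds (but does not send) the authenticated POST that would place a paper-mode market order. The host, path, and header names follow Alpaca's public REST documentation; the key values are placeholders, and sending the request would be a single `urllib.request.urlopen(req)` call:

```python
import json
import urllib.request

PAPER_BASE = "https://paper-api.alpaca.markets"  # Alpaca's paper-trading host

def build_order_request(key_id: str, secret: str, symbol: str, qty: int):
    """Build (but do not send) the POST for a paper market order."""
    body = json.dumps({
        "symbol": symbol,
        "qty": qty,
        "side": "buy",
        "type": "market",
        "time_in_force": "day",
    }).encode()
    return urllib.request.Request(
        PAPER_BASE + "/v2/orders",
        data=body,
        method="POST",
        headers={
            "APCA-API-KEY-ID": key_id,
            "APCA-API-SECRET-KEY": secret,
            "Content-Type": "application/json",
        },
    )

req = build_order_request("demo-key", "demo-secret", "AAPL", 1)
```

In the real module this would sit behind proper secret management and the streaming (order-update) side would use Alpaca's websocket feed rather than polling.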
...into user behaviour, not just traffic counts or SEO metrics. The goal is to understand exactly how visitors move from page to page—where they start, the routes they take, where they hesitate, and where they drop off. In short, the focus is purely on user-flow analysis of our website data. You are free to work in Python (pandas, numpy, scikit-learn), R, or even directly inside Google Analytics / BigQuery if that speeds things up. Feel free to visualise flows with tools such as Tableau, Power BI, Looker Studio, or custom D3/Sankey charts—whatever makes the navigation paths crystal-clear. Deliverables • A cleaned, documented dataset ready for future reuse • Clear visualisations of the navigation flow (e.g., Sankey or funnel diagrams) • A concise re...
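The page-to-page movement described above reduces to counting transitions between consecutive page views within a session. A minimal sketch with invented paths; the resulting (source, target, weight) triples are exactly the input format Sankey tools (D3, Plotly, Looker Studio community charts) expect:

```python
from collections import Counter

# Each session is an ordered list of page views; hypothetical example data.
sessions = [
    ["/home", "/pricing", "/signup"],
    ["/home", "/blog"],
    ["/home", "/pricing", "/blog"],
]

def transition_counts(sessions):
    """Count every page-to-page transition across all sessions."""
    counts = Counter()
    for path in sessions:
        for src, dst in zip(path, path[1:]):  # consecutive pairs
            counts[(src, dst)] += 1
    return counts

flows = transition_counts(sessions)
# e.g. two sessions moved /home -> /pricing, so flows[("/home", "/pricing")] == 2
```

Drop-off points then fall out naturally: pages with many inbound transitions but few outbound ones are where visitors stop.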
I need a Looker Studio (formerly Data Studio) specialist who can devote 30 consecutive business days to my account, working 6:00 AM–4:00 PM CST. During that window you’ll be hands-on inside the platform, unifying data from our CRM system, paid and organic social media channels, sales database, Google properties stored in BigQuery, and several custom APIs. Once the connections are stable and refreshing on schedule, the focus shifts to insight: you’ll design clear, role-based dashboards that highlight the metrics we run the business on—conversion rate, monthly recurring revenue, campaign-level marketing analytics, performance scorecards, and sales pipeline metrics. These views must update automatically and remain intuitive for executives and marketers who wo...
...through our approved dealer network or HomeCars Charities. ⸻ Step 3: Close & Drive Forward Finalize your purchase, receive your credit, and drive away with value already built in. No lenders. No approvals. Just transparency and opportunity. ⸻ Technical Requirements for Developer Backend • Scalable cloud infrastructure (AWS / GCP / Azure) • Database optimized for large datasets (PostgreSQL, BigQuery, or similar) • API-first architecture • Role-based authentication • High-availability and redundancy Frontend • Clean, modern UI • Fast loading • SEO-friendly • Responsive (mobile + desktop) • Clear dashboards for each user type Automation • Automated credit calculation engine • Automated document gener...
...Console, Metricool, Office 365, and BigQuery) and will consolidate the information into an executive dashboard in Stratio with key indicators for traffic, behaviour, ROI, and ROAS. PROJECT SCOPE Design and implementation of the DataLayer: technical configuration in Google Tag Manager; definition of custom variables and events (up to 5 funnels). Funnel configuration and external traceability (Office 365 / links): measurement of external forms and redirects outside the main domain; traceability across sources and logging of external events. Integration of analytics and advertising platforms: connection and technical validation of GA4, Clarity, Search Console and ...
...Builder internals and limitations Tool calling and function schemas (advanced patterns) File, browser, code, retrieval and custom tool integration Prompt layering: system → developer → agent memory Guardrails, refusal handling, and safety controls Cost optimization and latency trade-offs 3. Gemini Agent Builder / Vertex AI Agents Gemini agent architecture vs OpenAI agents Native connectors (BigQuery, GCS, Google Drive, APIs) Tool invocation and grounding with enterprise data Differences in memory, reasoning, and orchestration Strengths/weaknesses vs OpenAI agents 4. Connectors & Data Integration (Critical) Designing scalable connectors (APIs, databases, SaaS tools) Retrieval-augmented agents vs tool-based agents Sync vs async data flows Security, per...
...read those three parameters on the ESP32 at a sensible sampling rate, push them securely over Wi-Fi, and have them land in my Google Cloud account for long-term storage and future dashboarding. Although my original note mentioned AWS, I’ve decided to proceed with Google Cloud instead, so please use the tools you feel most comfortable with there (IoT Core, Pub/Sub, Cloud Functions, Firestore, BigQuery—whatever keeps the solution simple and maintainable). TLS encryption and device authentication are mandatory. I’ll provide: • The hardware (ESP32, energy-meter IC, power supply) • Wi-Fi credentials for bench testing • Access to a fresh Google Cloud project What I need back: • Fully commented ESP32 firmware (Arduino or ESP-IDF) that cap...
...using Dataflow, BigQuery, and Cloud Storage to feed data into search indices. - Fine-tune ranking, indexing, and relevance using features like Vector Search (Matching Engine), embeddings, and semantic understanding. - Design "agentic" AI systems that can perform complex, multi-step tasks across data stores. - Configure search features like autocomplete, facets, and recommendations for our e-commerce and social media. Required Technical Skills: - Proficiency in Google Cloud Platform (GCP) and the Vertex AI suite (Search & Conversation, Vertex AI Studio, and Model Garden). - Mastery of customizing search functionality via APIs and SDKs. - Strong understanding of LLMs (Gemini), prompt engineering, tokenization, and vector-based information retrieval. - Experience wit...
I'm seeking an experienced Google Cloud Platform analyst, specifically for BigQuery and Cloud Data Fusion. I need help learning how to map and transform data in GCP. Key Requirements: - In-depth analysis of BigQuery - Insights on performance and data mapping. Ideal Skills & Experience: - Extensive experience with BigQuery - Proven track record in cloud data analysis - Ability to provide actionable insights Please include relevant past work in your application.
I need a fully-automated analytics stack that funnels data from GoHighLevel, Stripe, Google Analytics 4, and Whop into BigQuery and then visualises the key KPIs in Looker Studio. The warehouse should be designed primarily for reporting—clean, well-modelled tables that refresh on a reliable schedule—so I can quickly slice everything from acquisition costs to sales-team close rates without wrestling with raw exports. Scope of work • Build or configure the ETL pipelines that extract each source’s data, load it into BigQuery, and transform it into a unified schema. If an off-the-shelf connector (Fivetran, Airbyte, native GA4 export, etc.) is the best option, set it up; if custom Cloud Functions or SQL is more practical, document the logic clearly. &b...
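A minimal sketch of the "unified schema" transform described above. The target column names (`event_ts`, `customer_id`, `revenue_usd`) and the per-source mappings are assumptions for illustration; the real mapping depends on what GoHighLevel, Stripe, GA4, and Whop actually export:

```python
from datetime import datetime, timezone

def to_unified_event(source: str, raw: dict) -> dict:
    """Map one raw connector record into a single unified event row."""
    mappers = {
        "stripe": lambda r: {
            "event_ts": datetime.fromtimestamp(r["created"], tz=timezone.utc).isoformat(),
            "customer_id": r["customer"],
            "revenue_usd": r["amount"] / 100,  # Stripe amounts are in the smallest unit
        },
        "ga4": lambda r: {
            "event_ts": r["event_timestamp"],
            "customer_id": r.get("user_pseudo_id"),
            "revenue_usd": 0.0,
        },
    }
    row = mappers[source](raw)
    row["source"] = source  # keep lineage so KPIs can be sliced per platform
    return row

row = to_unified_event("stripe", {"created": 0, "customer": "cus_1", "amount": 1999})
```

Whether this logic lives in a Fivetran/Airbyte transformation, a Cloud Function, or scheduled SQL, keeping each mapping small and named makes the documented-logic requirement easy to satisfy.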
In this milestone, I will develop the initial data pipeline for AVM and set up the BigQuery schema according to the mandatory architecture. This includes: Creating BigQuery tables with partitioning by assessedYear and clustering by zipCode, propertyType, and yearBuilt. Writing a cloud-native Python script to process raw parcel data, historical sales data, and external API inputs. Ensuring that the data pipeline is clean, modular, and ready for integration with the computation engine in the next milestone. Deliverables will include the script, BigQuery table structure, and a brief document explaining the setup and how to run it.
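Since `assessedYear` is an integer rather than a date, the mandated partitioning maps to BigQuery's integer-range partitioning. A sketch that emits the DDL, with the column list and year bounds as assumptions to adjust to the real parcel data:

```python
def avm_table_ddl(table: str, start_year: int = 1990, end_year: int = 2035) -> str:
    """Return BigQuery DDL for the AVM table: partitioned by assessedYear
    (integer range via RANGE_BUCKET) and clustered by zipCode, propertyType,
    yearBuilt, per the mandatory architecture. Columns are illustrative.
    """
    return f"""
    CREATE TABLE IF NOT EXISTS `{table}` (
      parcelId      STRING,
      zipCode       STRING,
      propertyType  STRING,
      yearBuilt     INT64,
      assessedYear  INT64,
      assessedValue NUMERIC
    )
    PARTITION BY RANGE_BUCKET(assessedYear, GENERATE_ARRAY({start_year}, {end_year}, 1))
    CLUSTER BY zipCode, propertyType, yearBuilt
    """

ddl = avm_table_ddl("my_project.avm.parcels")  # hypothetical table name
```

The Python script would then run this once via the BigQuery client before the first load.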
...Geolocation + proximity engine. Coupon & gamification rules engine. Event management. Business directory management. Incident/SOS management. Notification engine (email, SMS, WhatsApp, web push). Microservices for independent scaling (SOS, maps, notifications). 3.3 Data & AI Layer Front-end: Vue. Operational DB: Supabase (users, sessions, scans, events, coupons, SOS). Data warehouse (BigQuery/Redshift/Snowflake). ETL/ELT for logs ingestion, cleaning, anonymization. AI models: recommendations, predictive heatmaps, under-activated zone detection, anomaly detection (security/fraud). 3.4 Integration Layer Municipality systems: licenses, tourism, culture, security, civil protection. Emergency & medical: C5/911, Red Cross, hospitals (API/web console). Third ...
I'm seeking an expert to assess and recommend a new data warehouse and machine learning platform to replace our current SAS Base system. Id...running data and licenses) - Migration time frame - Training/skills required - Ability to deploy/develop dedicated model etc Key Requirements: - Handle data storage, ETL, visualization, reporting, and ML modeling - Blend seamlessly with AI tools - Scalability, performance, and ease of integration are critical - Open to various vendors (AWS, Google Cloud, Azure, Databricks, Snowflake, BigQuery, SAS Viya) Ideal Skills: - Experience with modern data platforms - Strong knowledge of ETL processes and ML - Familiarity with AI integration - Vendor comparison expertise Your insights will help us choose the right platform...
...it every day, so every millisecond counts. The core stack is Ruby on Rails and Go running on both AWS and GCP. Data flows through event-driven pipelines into DynamoDB and BigQuery and finally lands in email templates that you will own end-to-end. Here is what I need built or improved right now: • New and refactored micro-services in Go or Rails that can fan out millions of alerts without breaking latency targets. • Health-checks, metrics and auto-scaling hooks wired into CloudWatch, Stackdriver or your preferred observability tool so we always know where we stand. • Robust connectors for DynamoDB streams and BigQuery jobs to keep the data pipeline real-time. • A small library of maintainable, parametrized email templates that marketing can twea...
...running. The previous developer completed the setup and delivered working scrapers, schedules, and a BigQuery connection. I need a developer to review the build, stabilize it, and finish the remaining pieces. What is already done: Apify organization created Two working Actors running on schedule Facebook Posts Scraper Instagram Profile Scraper Existing saved tasks Storage dataset creation Service account permissions set BigQuery dataset ready Tables receiving data daily What I need completed: Confirm Actors run without failure Add retry logic and error handling Clean and structure the scraped data Write transformations for sentiment and keyword flags Load cleaned data into final BigQuery tables Build a Looker Studio dashboard using these tables Doc...
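The "add retry logic and error handling" item above is the kind of thing worth keeping generic and testable. A small sketch of exponential backoff that could wrap any Actor-run or BigQuery-load call; the delay parameters are arbitrary defaults:

```python
import time

def with_retries(fn, attempts=4, base_delay=1.0, sleep=time.sleep):
    """Call fn, retrying with exponential backoff (1s, 2s, 4s, ...).

    sleep is injectable so tests don't actually wait; the final failure
    re-raises so upstream schedulers and alerting still see the error.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * 2 ** attempt)

# Simulated flaky call: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient scraper failure")
    return "ok"

result = with_retries(flaky, sleep=lambda s: None)
```

In the real build you would likely catch only transient error types (timeouts, 429s) and log each retry so the "clear logging" requirement is met too.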
...implementing automation workflows, data analytics, and conversion funnels. We currently have a Webflow-based website that includes a custom-built Learning Management System (LMS) and use Memberstack for authentication and gated content. We manage back-end automations primarily through Integromat (Make) and store data in Airtable. All user data is tracked and analyzed via Google Analytics and stored in BigQuery for deeper insights. We also rely on a Node.js backend, hosted on AWS, for complex operations and custom logic that other no-code tools can't handle. For search, we sync Webflow content with Elasticsearch for fast, scalable search capabilities. Additionally, we leverage a custom AI assistant powered by GPT for course content interactions, and we utilize SendGrid for...
A CA civil rights law firm is building an internal tool that auto-captures billable legal work from Google Workspace, Slack, Zoom, and Google Voice, drafts proposed billing entries into an accept/reject/edit user interface, and posts approved items to our timekeeping software, Clio. ...time. • Review UI: Approve / Edit / Split / Reject with evidence links and audit trail. • Clio integration: create Activities on approval; prevent duplicates. Relevant experience would include: • Google Workspace APIs (Calendar watch, Drive Activity, Gmail watch, Meet/Reports), Zoom REST, Slack Events, Clio API v4. • GCP (Cloud Run/Functions, Pub/Sub, Secret Manager/KMS, Firestore/BigQuery), OAuth 2.0, RBAC, logging. • Strong security hygiene; can document scopes, ...
...(including crawling, indexing, internal linking, rendering, JavaScript crawlability and indexability), ✓ Following Google algorithms and updates closely; the ability to analyze their impact and translate them into actionable strategies, ✓ Actively integrating AI tools (LLMs for content, analysis, and automation) into workflows, ✓ Advanced knowledge of GSC, GA4, Ahrefs, Screaming Frog and similar tools; basic SQL/BigQuery understanding, ✓ An analytical and results-oriented mindset; the ability to break down and solve complex problems, ✓ Strong communication skills and the ability to work with content, SEO and software development teams. What We Expect from You in This Role ✓ Develop and lead the application of long-term actionable SEO strategies for all company projects, ✓ Analyze SEO performance ...
I need a hands-on mentor who can take me from absolute beginner to confident BigQuery user in just seven days. My priority is to get comfortable writing efficient SQL queries inside BigQuery, understanding how the underlying storage works, and seeing how those queries translate into real business insights on live data. Here’s the rhythm I have in mind: daily one-on-one video calls (60-90 min each) where you walk me through practical scenarios drawn from your own production experience. After every call I’ll work through the same dataset on my side, then we review the results at the start of the next session. By the end of the week I should be able to design a schema, load data, build and test complex queries, and explain the cost implications of each step. What I...
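The "cost implications of each step" goal above has a concrete anchor in BigQuery: on-demand queries are billed by bytes scanned, and a dry run (setting `dry_run=True` on the query job config in the Python client) returns the byte count without charging. A small sketch of the arithmetic, with the per-TiB rate left as a parameter since pricing varies by region and changes over time:

```python
def on_demand_cost_usd(bytes_processed: int, price_per_tib: float = 6.25) -> float:
    """Estimate on-demand query cost from a dry-run byte count.

    price_per_tib is an assumption; check current BigQuery on-demand
    pricing for your region before relying on the number.
    """
    tib = bytes_processed / 2**40  # bytes -> tebibytes
    return round(tib * price_per_tib, 6)

cost = on_demand_cost_usd(512 * 2**30)  # a 512 GiB scan
```

Comparing this number before and after adding a partition filter is a quick, repeatable way to see why schema design decisions matter.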
...prepares data and pushes it into Google BigQuery, but the code has grown messy. I want it refactored so that any developer can open the file and understand what’s happening at a glance, then watch it load cleanly into my BigQuery table. Here’s what I need from you: • Put readability first. Rename variables so they instantly convey intent, weave concise yet helpful inline and block comments throughout, and reorganize the code into neat, logical sections or functions. • Stay faithful to PEP 8 standards and keep an eye out for anything that could trip up future performance, but speed tuning isn’t the main goal—clarity is. • Verify the end-to-end pipeline: authenticate with my service-account JSON, run the script, and confirm rows ...
I have an existing Python data-processing script that reads structured files (mostly CSV pulled from SQL dumps) and pushes the results into Google BigQuery. It works, but it’s messy. I’m looking for someone to go through the codebase, iron out the bugs that occasionally stop a run, streamline the logic so it performs faster, and then extend it with a couple of small features I’ve been postponing. Right now the script: • ingests source files from local disk or Cloud Storage • performs several Pandas transformations and a few custom calculations • uploads the final table into a dedicated BigQuery dataset What I need from you: • Debug existing errors so every run completes without manual tweaks • Refactor for clearer structure a...
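One refactoring pattern that addresses both the "runs stop occasionally" and "clearer structure" asks: pull each transformation out of the monolithic script into a small pure function that can be unit-tested without touching Cloud Storage or BigQuery. A sketch with invented column names (`sku`, `qty`, `unit_price`), shown on the CSV-style input the post describes:

```python
import csv
import io

def transform(rows):
    """One pure, testable step: cast types and derive a total column.

    Failures surface here, in a unit test, instead of mid-run against
    production data; the upload step then only handles clean dicts.
    """
    for row in rows:
        qty = int(row["qty"])
        unit_price = float(row["unit_price"])
        yield {
            "sku": row["sku"].strip(),
            "qty": qty,
            "unit_price": unit_price,
            "total": qty * unit_price,
        }

sample = io.StringIO("sku,qty,unit_price\n A1 ,2,3.50\n")
out = list(transform(csv.DictReader(sample)))
```

The same shape works whether the heavy lifting stays in Pandas or not; what matters is that each step has one job and a name.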
...On Facebook pages we need posts, comments, likes, shares, user profiles, comment timestamps, anonymised user IDs and full engagement counts. Instagram should follow the same logic for public posts. Everything has to run automatically each day inside Apify, with retry logic and clear logging. Data flow & analysis Raw JSON or CSV should land in an Apify dataset, flow into a durable database (BigQuery, Postgres or similar) and trigger analysis. The analysis layer must surface sentiment (positive / negative / neutral), engagement trends, keyword frequency, issue clustering and narrative mapping, influencer and account impact tracking, plus early-signal virality detection. Feel free to draw on open-source NLP libraries—Hugging Face Transformers, spaCy, Vader, or your pref...
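As a placeholder for the sentiment step, a tiny lexicon-based classifier keeps the pipeline contract (text in, label out) stable while a real model is chosen. This is not a production technique; the brief's own suggestions (VADER, spaCy, Hugging Face models) would slot in behind the same function signature:

```python
# Toy lexicons; a real deployment would use VADER or a transformer model.
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"bad", "broken", "terrible"}

def sentiment_flag(text: str) -> str:
    """Label text positive/negative/neutral by counting lexicon hits."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

label = sentiment_flag("Love the new service, works great")
```

Keeping the label set fixed up front also means the downstream BigQuery schema and trend dashboards don't change when the model does.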
I am looking for a professional with command of the Google Cloud Console and the Google Maps APIs (Places, Geocoding, or BigQuery GIS) to carry out a location study of dental radiology labs and dental clinics in Mexico City and the State of Mexico. The goal is to detect concentration patterns, zones with high competition, and opportunity areas for opening new radiology labs. The work consists of creating an interactive map (preferably in Google My Maps or a similar platform) showing the exact location of every radiology lab and dental clinic detected, with layers differentiated by business type. I also need a structured...
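The competition-density part of the study reduces to "how many existing clinics sit within N km of a candidate point". A sketch with the haversine formula and hypothetical coordinates near Mexico City's Zócalo; at scale the same check can run server-side in BigQuery GIS with `ST_DWITHIN` over the Places results:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km mean Earth radius

def competitors_within(point, places, radius_km=2.0):
    """Count existing clinics within radius_km of a candidate site."""
    return sum(
        1 for p in places
        if haversine_km(point[0], point[1], p[0], p[1]) <= radius_km
    )

# Hypothetical clinic coordinates for illustration.
clinics = [(19.4326, -99.1332), (19.4350, -99.1400), (19.50, -99.20)]
n = competitors_within((19.4326, -99.1332), clinics, radius_km=2.0)
```

Low counts flag the "opportunity areas" the brief asks for; high counts flag saturated zones.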
...workflows. Support in performance tuning, debugging, and data migration activities. Offer guidance on data architecture, best practices, and real-time project execution. Technical Skills Required (any of the following): Programming: Python, SQL, PySpark ETL Tools: Apache Airflow, Talend, Informatica, or similar Cloud Platforms: AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or GCP (BigQuery) Databases: PostgreSQL, Snowflake, MySQL, MongoDB Big Data Technologies: Hadoop, Spark, Databricks (preferred) Version Control / CI-CD: Git, Jenkins Ideal Candidate: Has 10+ years of experience in data engineering and related technologies. Strong in troubleshooting, architecture design, and real-time project handling. Flexible for US shift hours if required. Excellent c...
I have a Python script that currently processes real estate data and I need it streamlined before pushing the results into Google BigQuery. Here’s what I want done: • Remove any unused functions so the file stays lean. • Optimize the functions we actually use—speed and readability both matter. • Refactor variable and function names to a clear, consistent style. Once the code is tidy, finish by loading the processed data into my existing BigQuery dataset and pushing the cleaned code to my GitHub repository. I’ll provide repository access, sample data, and the target table schema. A quick walkthrough of what you changed plus instructions to rerun the pipeline on my side will wrap things up.
We're a healthcare business looking to build automated dashboards in Looker Studio. Our data currently lives in Google Sheets, but the spreadsheets have become slow and hard to manage. We want to: - Pull data automatically from multiple sources via API keys (Rethink, Rippling, Google Ads) - Store and process data in BigQuery - Build dashboards for key metrics like clinician and site performance, billable vs admin time, retention, and ad spend - Allow limited data overrides when needed - Experience with Looker Studio, BigQuery, and healthcare analytics is preferred. - Must be available in CST timezone. - Responsive communication (quick replies...
Initial requirement I want to turn our raw operational data in...easy-to-read Power BI dashboard that zeroes in on workflow performance. The data is imported from an Excel file. Future Requirement All tables already live in Google BigQuery, so experience with the Power BI ↔ BigQuery connector—or at least solid SQL skills for BigQuery—will speed things up. Here’s what I need: • Connect to the existing BigQuery tables (or the extracts I’ll provide) • Publish the finished report to the Power BI Service and configure user access. Deliverables A fully documented .pbix file with all DAX measures. Acceptance criteria (Not immediate requirement) – All numbers reconcile with the underlying BigQuery queries. Se...
Data Scientist (SQL / Business Intelligence) Company Overview: We are an Outsourcing / Contract Specialist Team that supports clients with enterprise-grade analytics and business intelligence systems. We are seeking a Data Scientist focused on SQL, dashboarding, and strategic insights. Job Summary: The cand...metrics. Maintain data accuracy, security, and governance standards. Requirements: 3+ years of experience in data analytics, BI, or data science. Advanced SQL skills and familiarity with ETL workflows. Experience with data visualization tools. Strong business acumen and communication skills. Preferred Skills: Experience with Python for automation or light scripting. Knowledge of Snowflake, BigQuery, or Redshift. Compensation: $40–$75 per hour or $85,000–...
I am hiring a senior iOS developer (preferably a team) skilled in SwiftUI and Firebase/Cloud to build a feature-complete confidential application. The app integrates AI-driven modules, analytics, and secure data wo...reporting • Optimize app performance, latency, and accessibility • Integrate AI modules using secure and efficient cloud endpoints • Prepare CI/CD pipelines and handover documentation ⸻ Required Skills: • Swift 5+, SwiftUI, Combine, AVFoundation, Accelerate, Vision, Metal/MPS • Firebase (Auth, Firestore, Functions, Storage, Remote Config, Analytics) • Google Cloud (Cloud Run, Pub/Sub, Scheduler, BigQuery, Secret Manager) • TypeScript or React for admin panel • Experience with AI or RAG model integration • DevOps (Fastlane, ...
...and proof-of-deletion through logs and metrics. * Implement safeguards—dry runs, approvals, dependency checks—to prevent unintentional data loss. * Collaborate with compliance and security teams on data governance and regulatory requirements (GDPR, CCPA, etc.). What You’ll Bring * Proven experience architecting data platforms on AWS, GCP, or Azure using object stores, warehouses (Snowflake, BigQuery, Redshift), or lakehouse frameworks (Databricks, Delta Lake, Apache Iceberg, Hudi). * Strong background in data lifecycle management, row-level deletes, and partition strategies. * Hands-on experience with workflow orchestration and policy-as-code tools (Apache Kafka, Airflow, OPA, Rego, etc.). * Knowledge of data governance, privacy, and compliance requirements re...
I’m ready to move an existing data set into Google BigQuery and need a specialist who can handle the full migration quickly and reliably. Your task is to: • Assess my current source data (format and volume are straightforward—think CSV and JSON files stored in cloud storage). • Design and execute the import into BigQuery, setting up the optimal dataset structure, table partitions, and clustering where it makes sense. • Validate that all records have transferred accurately and perform spot checks on row counts and schema integrity. • Provide a concise hand-off note describing the steps you took and any ongoing maintenance tips. Timing is critical—I’d like this wrapped up as soon as possible. When you apply, focus on your B...
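The validation step above ("spot checks on row counts and schema integrity") is easy to make mechanical. A sketch that counts data rows in a source CSV and diffs those counts against per-table totals; in practice the BigQuery side would come from a `SELECT COUNT(*)` per table after the load, but the comparison itself is plain Python:

```python
import csv
import io

def source_row_count(csv_text: str) -> int:
    """Count data rows in a source CSV (header excluded)."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader, None)  # skip the header row
    return sum(1 for _ in reader)

def validate_counts(source_counts: dict, bq_counts: dict) -> list:
    """Return tables whose BigQuery row count disagrees with the source."""
    return sorted(t for t, n in source_counts.items() if bq_counts.get(t) != n)

# Hypothetical mini-migration: one CSV counted locally, one count given.
src = {"sales": source_row_count("id,amount\n1,2\n3,4\n"), "users": 10}
mismatches = validate_counts(src, {"sales": 2, "users": 9})
```

An empty mismatch list becomes the acceptance check in the hand-off note; anything else names exactly which table to re-examine.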
...proactively. Maintain clear documentation of cloud architecture and processes. Requirements: 8+ years in cloud/infra, with 4+ years hands-on in GCP. Experience with LiveKit, Twilio, Gemini, and a homegrown TTS is preferred. Proven experience as a Cloud Architect (with DevOps/automation background). Strong knowledge of GCP services (Compute Engine, GKE, Cloud Storage, Cloud Functions, Pub/Sub, BigQuery, IAM). Proficiency with DevOps/IaC tools: Terraform, Jenkins, Git, Docker, Kubernetes, Ansible. Solid understanding of cloud networking, security, and governance. Experience with scripting/automation (Python, Bash, etc.). Familiarity with monitoring tools (Prometheus, Grafana, Google Cloud Monitoring). Certification (Google Cloud Professional Architect) preferred. Notes: Background...
...that surfaces my priority KPIs (for example: Profit Margins, Sales Volume, Ecommerce Analytics, and Geographics—using Bar charts, Line graphs, Pie charts, and simple Tables.) • Enable real-time updates for WooCommerce data so the numbers stay current without manual refreshes. • Create email or Slack alerts that notify me in certain scenarios. • Allow for optional data pulls: GA4 traffic housed in BigQuery and Meta Ads spend imported via weekly or monthly CSV. Current Setup WooCommerce (multiple sites): orders, products, returns, tax amounts per country (there are different currencies). A custom Laravel return portal: we should use this source for return-requested SKUs to see the most-returned SKUs. COGS: maintained in the TR Woo store in TRY (SKU-level...
...images, run colour conversion, apply classification rules, and publish updated mapping tables. • Scale seamlessly as new brands, shades, and swatch batches are added. • Maintain reliability with monitoring, logging, and alerting so the pipeline runs hands-off. Preferred stack & suggested components • GCP services such as Cloud Storage (raw images), Cloud Functions / Cloud Run (processing), BigQuery (analytical tables), and Composer or Dataflow for orchestration. • Python remains the execution language; modular, well-documented code is essential. • CI/CD via Cloud Build or GitHub Actions. Deliverables 1. Architecture diagram and setup scripts (Terraform or Deployment Manager). 2. Refactored, containerised Python modules ready for Cl...
...description of the collection methodology (API endpoints, libraries, rate-limit strategies, or archival sources) and a data dictionary should accompany the delivery. JSON, CSV, or Parquet are all acceptable provided the schema is consistent and well documented. I am happy to grant temporary API keys, but if you already have access—Academic Research API, GNIP archives, Meltwater, custom TikTok scrapers, BigQuery social data stores, etc.—please let me know. Provenance is critical, so every record must be traceable back to its original pull. Acceptance criteria • Coverage: at least 24 consecutive months for both X and TikTok • Fields: everything listed above present and non-null for >95 % of records • Format: one zipped archive per platform plus...
I need a production-ready customer analytics dashboard built in Looker Studio that pulls live data from both BigQuery and Google Sheets. The work starts with translating my business goals and KPI definitions into a clean data model, then designing an intuitive layout that highlights customer acquisition, engagement, and retention metrics at a glance. You will connect the two data sources, set up reliable refresh schedules, and apply visualization best practices so the dashboard loads quickly and tells a clear story. Interactive filters, drill-downs, and period-over-period comparisons should be configured out of the box. If a RegEx tweak or a touch of CSS/JavaScript helps prototype a smarter component, feel free to use it—those skills are welcome but not essential. The delive...
I run several web servers on Google Cloud Platform and rely solely ...Specific recommendations—rightsizing, autoscaling, instance scheduling, possible switch to Spot VMs or predefined machine types—each with estimated savings. • A concise action plan I can follow (or have you execute later) to implement the changes safely. You’ll get read-only access to the project, analyze current utilization with tools like Cloud Monitoring, Recommender, and Billing export to BigQuery if needed, then walk me through the findings on a short call. No changes should be applied directly during the audit phase; I first want to understand every suggestion. If you’ve slashed Compute Engine costs for other clients and can point to real percentages saved, I’m eager ...
... • Provide a slim demo (code snippets or a small repo) that ingests sample events and surfaces live metrics developers care about—active sessions, feature clicks, error spikes, and latency. • Outline success criteria: engagement uplift signals to watch, alert thresholds, and a simple way to A/B test improvements. I’m open to the stack you prefer—whether that’s Firebase, AWS Kinesis, Google BigQuery, or a lean Node.js/Python pipeline—so long as setup is straightforward for a small team. A user-friendly interface and responsive design would be nice to have, but the must-have is the real-time insight itself. Deliverables 1. Architectural diagram (PDF or image) 2. Annotated source (Git repo or ZIP) with quick-start instructions 3. One...
...freelancer to build a lightweight but robust real-time data engineering and dashboard system. The system should: Ingest streaming data (simulated or from external sources). Connect to both on-premises storage and cloud platforms (e.g., Azure Data Lake or Blob Storage). Prioritize real-time data processing to handle continuous streams efficiently. Integrate with data warehouses such as Snowflake, BigQuery, or Redshift for scalable data management. Provide dynamic, interactive visualizations using tools like Streamlit or Power BI. Be responsive, reliable, and easy to deploy. Deliverables: Clean and documented code (Python preferred) Database schema and pipeline setup Real-time dashboard with auto-refresh Deployment instructions Budget: $30 – $250 USD (Micro Proje...
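The real-time core of a system like this reduces to incremental state over a stream. A minimal sketch, with a plain list standing in for the live source and a sliding-window mean standing in for whatever aggregate the dashboard would poll:

```python
# Sketch of the real-time processing core: a simulated event stream
# folded into a sliding-window average, the kind of incremental state a
# Streamlit or Power BI dashboard would poll. Window size and the
# sample values are illustrative assumptions.
from collections import deque

def sliding_average(stream, window=3):
    """Yield the mean of the last `window` readings after each event."""
    buf = deque(maxlen=window)
    for value in stream:
        buf.append(value)
        yield round(sum(buf) / len(buf), 2)

simulated_stream = [10, 20, 30, 40, 50]   # stand-in for a live source
print(list(sliding_average(simulated_stream)))  # → [10.0, 15.0, 20.0, 30.0, 40.0]
```

Swapping the list for a Kafka consumer, an Azure Event Hub reader, or a warehouse change feed leaves the aggregation logic untouched, which is what makes the design easy to deploy incrementally.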
...actionable insight. The core objective is straightforward: take the dataset as-is, clean it, explore it, and surface the key patterns that will guide my next decisions. You’ll work with whatever stack you’re fastest in—Python (pandas, NumPy, scikit-learn), R, SQL, or even a BI platform like Power BI or Tableau—so long as the final output is easy for me to digest and reuse. Cloud tools such as BigQuery or Snowflake are available if local processing won’t cut it. Deliverables • A cleaned, well-documented dataset ready for future queries • An analytical report highlighting the most significant sales trends and correlations • Interactive visuals or dashboards that let me slice the data on my own Accuracy, transparency on methods, ...
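The "clean it, explore it" step can be sketched with nothing but the standard library. The column names and sample rows below are illustrative assumptions; the same shape of code scales up to pandas or a warehouse query.

```python
# Sketch of the cleaning pass: drop rows missing the metric, drop exact
# duplicates, coerce types, then summarise. Columns and sample data are
# illustrative assumptions, not the client's real dataset.
import csv
import io
from statistics import mean

raw = """date,region,sales
2024-01-01,North,100
2024-01-01,North,100
2024-01-02,South,
2024-01-03,North,250
"""

def clean_and_summarise(raw_csv):
    rows, seen = [], set()
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row["sales"]:              # drop rows missing the metric
            continue
        key = tuple(row.values())
        if key in seen:                   # drop exact duplicates
            continue
        seen.add(key)
        row["sales"] = float(row["sales"])
        rows.append(row)
    by_region = {}
    for r in rows:
        by_region.setdefault(r["region"], []).append(r["sales"])
    return rows, {k: mean(v) for k, v in by_region.items()}

rows, summary = clean_and_summarise(raw)
print(len(rows), summary)  # → 2 {'North': 175.0}
```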
...answer routine queries, route complex calls to live agents, and keep transcripts neatly logged in our CRM. Think Twilio or Amazon Connect for telephony, Dialogflow or a comparable NLU engine for intent detection, and seamless fallback logic to maintain a human-grade experience. • Data analysis – Once calls, chats, and other customer interactions are flowing in, I want automated pipelines (Python, BigQuery or similar) to aggregate, label, and visualise trends. I should be able to open a dashboard and instantly see sentiment, common pain points, and campaign performance without manual number-crunching. • Marketing campaigns – Using the insights above, the next step is automated audience segmentation and multichannel outreach. If you’re comfortable tyin...
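The labelling stage of such a pipeline can be prototyped with a toy keyword classifier before any NLU engine is wired in. The keyword lists and transcripts below are illustrative assumptions; production sentiment would come from Dialogflow or a comparable model, with the same aggregation on top.

```python
# Toy sketch of the automated labelling step: keyword-based sentiment
# tagging over interaction transcripts, then a trend count the dashboard
# would chart. Keyword lists and transcripts are illustrative.
from collections import Counter

NEGATIVE = {"broken", "slow", "refund", "cancel"}
POSITIVE = {"great", "love", "thanks"}

def label(text):
    words = set(text.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

transcripts = [
    "the app is broken again",
    "love the new dashboard, thanks",
    "please cancel my subscription",
    "how do I export a report",
]
print(Counter(label(t) for t in transcripts))
```

Grouping the same labels by campaign or date column in BigQuery is what turns this into the "open a dashboard and instantly see sentiment" view the posting describes.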
...than real-time speed. Here’s how I picture the workflow: at the start of each month an automated job reaches into Helium 10 (API, browser automation, or another proven method—whatever is most robust) and exports the latest sales snapshot. The script then cleans the raw output and drops a neatly structured CSV or Google Sheet into a shared drive. If you can also push the results straight into BigQuery or Snowflake, even better, but a flat file is the minimum I need. Deliverables • A documented script or small app that logs in, extracts revenue, units sold, and average selling price, and saves them in a tabular format • Step-by-step setup instructions so I can run the job on my own machine or a cloud function • A quick test run showing last month's
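The transform step in that workflow is small enough to sketch directly. The field names and currency formatting below are assumptions about what a Helium 10 export looks like, not its documented schema:

```python
# Sketch of the monthly transform: normalise raw export rows (field
# names and currency formatting are illustrative assumptions) into the
# flat tabular file the workflow calls for.
import csv
import io

raw_rows = [
    {"ASIN": "B000TEST1", "Revenue": "$1,234.50", "Units": "37"},
    {"ASIN": "B000TEST2", "Revenue": "$980.00", "Units": "20"},
]

def normalise(rows):
    out = []
    for r in rows:
        revenue = float(r["Revenue"].replace("$", "").replace(",", ""))
        units = int(r["Units"])
        out.append({
            "asin": r["ASIN"],
            "revenue": revenue,
            "units_sold": units,
            "avg_selling_price": round(revenue / units, 2),
        })
    return out

# Write the cleaned rows as CSV (here to a string; a shared drive,
# Google Sheet, or BigQuery load job would take the same rows).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(normalise(raw_rows)[0]))
writer.writeheader()
writer.writerows(normalise(raw_rows))
print(buf.getvalue())
```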
...constant pulse on my active campaigns. The single place of truth should pull live data from Google Ads, Google Analytics, and our network of Google Business Profiles, blend it cleanly, and surface the three numbers my team cares about most—click-through rate, conversion rate, and cost per acquisition. Here’s how I picture it working: • Seamless, scheduled connections to the three sources above (BigQuery or direct connectors—whichever keeps refreshes fast and reliable). • A clear top-level snapshot for yesterday, last 7 days, and month-to-date, broken out by platform. • Drill-downs that let me filter by campaign, ad group, location, or date range without breaking the visuals. • Thoughtful use of scorecards, trend lines, and heat-maps so...
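The three headline numbers and their period-over-period deltas are simple ratios once the sources are blended; a sketch with illustrative figures (not real campaign data):

```python
# Sketch of the three KPIs the dashboard surfaces, plus a
# period-over-period delta. All figures are illustrative.
def kpis(impressions, clicks, conversions, cost):
    return {
        "ctr": round(clicks / impressions, 4),
        "conversion_rate": round(conversions / clicks, 4),
        "cpa": round(cost / conversions, 2),
    }

this_week = kpis(impressions=50_000, clicks=1_500, conversions=90, cost=1_350.0)
last_week = kpis(impressions=48_000, clicks=1_200, conversions=60, cost=1_260.0)

delta = {k: round(this_week[k] - last_week[k], 4) for k in this_week}
print(this_week)  # → {'ctr': 0.03, 'conversion_rate': 0.06, 'cpa': 15.0}
print(delta)      # → {'ctr': 0.005, 'conversion_rate': 0.01, 'cpa': -6.0}
```

In Looker Studio these would typically live as calculated fields over a blended or BigQuery-backed source, so the same definitions hold across the yesterday / 7-day / month-to-date views.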
I have a large, heterogeneous database and need to turn it into a solid advertising asset. The immediate priority is Google Ads; there I want to work with ...continuous optimisation recommendations. Acceptance criteria – At least three distinct segments validated with a ROAS or CPL above the historical baseline. – Tagging up and running and sending events without data loss (>95 % match rate). – A final report covering learnings, next steps, and a per-audience performance table. I will provide access to Google Ads, GA4, BigQuery, and spreadsheets; if you use other compatible tools, that is no problem. As long as we maintain data quality ...
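The first acceptance criterion reduces to a per-segment ROAS check against the historical baseline. A sketch with illustrative segment names and figures (the real ones would come from Google Ads and GA4 via BigQuery):

```python
# Sketch of the acceptance check: flag audience segments whose ROAS
# beats the historical baseline. Segment names, spend, and revenue
# figures are illustrative assumptions.
BASELINE_ROAS = 3.0

segments = {
    "high_value_repeat": {"revenue": 5200.0, "spend": 1000.0},
    "cart_abandoners":   {"revenue": 2800.0, "spend": 1000.0},
    "lookalike_top10":   {"revenue": 4100.0, "spend": 1000.0},
}

def validate(segments, baseline):
    report = {}
    for name, s in segments.items():
        roas = round(s["revenue"] / s["spend"], 2)
        report[name] = {"roas": roas, "beats_baseline": roas > baseline}
    return report

report = validate(segments, BASELINE_ROAS)
print(sum(r["beats_baseline"] for r in report.values()), "segments beat baseline")
```

The same table, extended with CPL for lead-focused segments, is what the final per-audience performance report would contain.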
...Requirements: - Proven experience with Google Tag Manager, GA4, and Enhanced E-commerce implementations. - Ability to debug and test DataLayer pushes. - Understanding of consent mode and cookie compliance. - Clear documentation and communication in English or German. Bonus: - Experience with WooCommerce or headless setups (WordPress + Nuxt/Vue frontend). - Familiarity with server-side tagging or GA4 BigQuery exports. Important: This is not just a tag placement job — we need a complete, reliable analytics foundation that can be extended in the future. More events may be added later by the developer as we refine the user journey and marketing attribution....