Job Overview

  • Job ID: 3583
  • Job Title: GCP Data Architect
  • Location: San Jose, CA
  • Duration: 6 Months + Extension
  • Hourly Rate: Depending on Experience (DOE)
  • Work Authorization: US Citizen, Green Card, OPT-EAD, CPT, H-1B, H4-EAD, L2-EAD, GC-EAD
  • Client: To Be Discussed Later
  • Employment Type: W-2, 1099, C2C

Role: GCP Data Architect
Bill Rate: $78/hour C2C
Location: San Jose, CA
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Teams
Direct Client Requirement

Job Description:
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Core Responsibilities:
Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
Pipeline Development: Architect, build, and maintain scalable, robust data pipelines for diverse applications, including business intelligence and advanced analytics.
Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

Required Skills & Experience:
Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
Additional GCP Services: Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
Programming & Querying:
Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
Data Pipeline Orchestration: Prior experience with workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

NOTE: Thank you for visiting our jobs page. Please submit your application using the Apply Now link. Our recruitment team is currently reviewing all applications thoroughly. We will be in touch with candidates who are shortlisted for the next stage of the interview process.

Valiant Technologies LLC
166 Geary St
San Francisco, CA 94108
Phone: (415) 935-9966
srinivasa.kandi@valianttec.com


Apply Now

Equal Opportunity Employer

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate based on race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.