GCP Data Engineer

LTIMindtree

Job Description

Role: GCP Data Engineer

Primary Skills

  • GCP data architecture and design of streaming & batch pipelines.
  • Implementing Big Data solutions leveraging GCP cloud data technologies, with extensive experience in ingestion, processing, and transformation.
  • Proficient in Python, PySpark, relational and NoSQL databases, and GCP data technologies such as BigQuery, Dataproc, Dataflow, Data Fusion, Cloud Composer, Pub/Sub, Dataprep, Dataplex
  • Cloud databases: Spanner, Cloud SQL, Memorystore, BigQuery.
  • Looker Studio & Operations Suite (Cloud Monitoring and Logging).
  • Excellent relationship management skills and leadership skills.

Good to have:

  • Experience with open-source or commercial Modern Data Stack tools such as Airbyte, Fivetran, dbt, Monte Carlo, CDAP, etc. is an added advantage
  • Flask/FastAPI development
  • Google certified Data Engineer

Role Description:

  • Hands-on experience architecting, designing, and implementing Big Data solutions leveraging GCP cloud data technologies; extensive experience in the areas of ingestion, processing, and transformation
  • Proficient in Python, PySpark, relational and NoSQL databases, and GCP data technologies such as BigQuery, Dataproc, Dataflow, Data Fusion, Cloud Composer, Pub/Sub, Dataprep, Dataplex
  • Experience with open-source or commercial Modern Data Stack tools such as Airbyte, Fivetran, dbt, Monte Carlo, CDAP, etc. is an added advantage.
  • Strong analytical skills for issue identification and resolution.
  • Experience in distributed data processing, performance tuning
  • Experience in maintenance & enhancement projects.
  • Complete ownership of tasks and deliverables
  • Good communication skills.
  • Flexibility to support customers across geographies and time zones.
  • Ensures consistency of process and usage, and champions best practices in data management. Oversees data accuracy processes, goals, and assessments.
  • Ensures resolution of data conflicts between systems and within systems’ data universe.
  • Works with internal stakeholders to develop strategies for leveraging data to gain a deeper insight into IIE’s business, impact on international education, and IIE’s story.
  • Ensures rigorous adherence to IIE policies regarding PII protection and data security. Oversees data expiry practices and establishes and implements processes for data archival and expiry.
  • Manages and develops the staffing of the Data & Reporting unit; champions team collaboration; monitors and develops career paths within the unit. Makes recommendations concerning employment, termination, performance evaluations, salary actions, and other personnel actions.
  • Responsible for user experience with data and reporting tools.
  • Member of the Knowledge Management Cabinet.
  • Excellent relationship management skills
  • Familiarity with working and collaborating with teams across various geographies, cultures, and time zones

How will you grow?

  • Role-based Training programs
  • Continuing Education Programs (CEP) to enhance your knowledge, skills, and attitude as a professional
  • We encourage you to acquire various beneficial international certifications, with costs reimbursed
  • Our role-based workshops help us groom future leaders for LTI

What's in it for you?

  • Excellent benefits plan: medical, dental, vision, life, FSA, & PTO
  • Roll over vacation days
  • Commuter benefits
  • Excellent growth and advancement opportunities
  • Certification reimbursement
  • Rewards and recognition programs
  • Innovative and collaborative company culture

We are an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic protected by law.


Nearest Major Market: Cincinnati

Company Info.

LTIMindtree

LTIMindtree Limited is an Indian multinational information technology services and consulting company based in Mumbai. A subsidiary of Larsen & Toubro, the company was incorporated in 1996 and employs more than 84,000 people. It is one of India's largest technology companies.

  • Industry
    Information Technology, Consulting
  • No. of Employees
    84,500
  • Location
    Mumbai, Maharashtra, India


LTIMindtree is currently hiring for GCP Data Engineer roles in Cincinnati, OH, USA, with an average base salary of $90,000 - $190,000 / Year.
