5.0 - 8.0 years
1 - 1 Lacs
Chennai
Work from Office
Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities and the planet.
Job Title: Specialty Development Senior
Location: Chennai
Work Type: Hybrid
Position Description: Solid experience designing, building, and maintaining cloud-based data platforms and infrastructure. Deep proficiency in GCP Cloud Services, including significant experience with Big Query, Cloud Storage, Data Proc, APIGEE, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and Access Management (IAM). Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer. Strong experience with job scheduling and workflow orchestration using Airflow. Proficiency with version control systems, specifically Git. Strong programming skills in Python. Expertise in SQL and experience with relational databases like SQL Server, MySQL, PostgreSQL. Experience with or knowledge of data visualization tools like Power BI. Familiarity with code quality and security scanning tools such as FOSSA and SonarQube. Foundational knowledge of Artificial Intelligence and Machine Learning concepts and workflows. Strong problem-solving skills and the ability to troubleshoot complex distributed systems. Strong communication and collaboration skills. Knowledge of other cloud providers (AWS, Azure, GCP).
Skills Required: GCP, Big Query, AI/ML
Experience Required: 4+ years
Education Required: Bachelor's Degree
TekWissen Group is an equal opportunity employer supporting workforce diversity.
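For illustration only (not part of the posting), a minimal Python sketch of one routine task in the stack described above: reading a credential from GCP Secret Manager. The project and secret names are placeholders.

```python
# Minimal sketch, assuming the google-cloud-secret-manager client library;
# project and secret names below are illustrative placeholders.
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()

# Fully qualified resource name of the secret version to read.
name = "projects/my-gcp-project/secrets/warehouse-db-password/versions/latest"

response = client.access_secret_version(request={"name": name})
password = response.payload.data.decode("utf-8")
print("Fetched secret of length", len(password))  # never log the value itself
```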
Posted 4 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Pune
Work from Office
About the Position: This role involves leveraging advanced data technologies to drive business outcomes in the media and advertising sector, focusing on scalability and performance optimization. Successful candidates will be adept at collaborating across teams to implement comprehensive data strategies and ensure robust, secure data management.
Technical and Professional Requirements: Candidates should have good experience in the following tech stack: Big Query / Big Data / Hadoop / Snowflake; programming/scripting languages like Python / Java / Go. Hands-on experience in defining and implementing various Machine Learning models for different business needs. Hands-on experience in handling various performance and data scalability problems. Knowledge of the advertising domain and varied experience working on different data sets such as Campaign Performance, Ad Performance, Attribution, Audience Data, etc.
Job Responsibilities: Designing and implementing an overall data strategy as per business requirements. The strategy includes data model designs, database development standards, and implementation and management of data warehouses and data analytics systems. Identification of data sources, internal and external, and defining a plan for data management as per the business data strategy. Collaborating with cross-functional teams for the smooth functioning of the enterprise data system. Managing end-to-end data architecture, from selecting the platform, designing the technical architecture, and developing the application to finally testing and implementing the proposed solution. Planning and execution of big data solutions using Big Query, Big Data, Hadoop, Snowflake. Integrating technical functionality, ensuring data accessibility, accuracy, and security.
Educational Requirements: Any Graduate with 60% and above
Posted 1 month ago
9.0 - 14.0 years
22 - 37 Lacs
Pune
Hybrid
We're Hiring: Senior GCP Data Engineer (L4) for a client || Immediate joiners only
Location: Pune | Walk-in Drive: 5th July 2025
Are you a seasoned Data Engineer with 9-12 years of experience and a passion for building scalable data solutions on Google Cloud Platform? Join us for an exciting walk-in opportunity!
Key Skills Required: GCP Data Engineering, BigQuery, SQL, Python (Cloud Composer, Cloud Functions, Python Injection), Dataproc + PySpark, Dataflow + Pub/Sub, Apache Beam, Spark, Hadoop
What You'll Do: Architect and implement end-to-end data pipelines on GCP. Work with BigQuery, BigTable, Cloud Storage, Spanner, and more. Automate data ingestion, transformation, and augmentation. Ensure data quality and compliance across systems. Collaborate in a fast-paced, dynamic environment.
Bonus Points: Google Professional Data Engineer or Solution Architect certification. Experience with Snaplogic, Cloud Dataprep. Strong SQL and data integration expertise.
If interested, please share your CV at Raveena.kalra@in.ey.com
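As a rough illustration of the Dataflow + Pub/Sub work this role describes, a hedged Apache Beam sketch; the subscription, table and schema names are invented for the example and are not from the posting.

```python
# Hedged sketch of a streaming Beam pipeline: Pub/Sub -> parse JSON -> BigQuery.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # project/region/runner flags would be added to run on Dataflow

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```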
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Mumbai, Bengaluru, Delhi / NCR
Work from Office
Must-have skills required: GCP, support, Python
Forbes Advisor is looking for:
Role Summary: We are seeking a proactive and detail-oriented Data Support Engineer to monitor production processes, manage incident tickets, and ensure seamless operations in our data platforms. The ideal candidate will have experience in Google Cloud Platform (GCP), Airflow, Python, and SQL, with a strong focus on enabling developer productivity and maintaining system reliability.
Key Responsibilities:
Production Monitoring: Monitor and ensure the smooth execution of production data pipelines and workflows. Identify and promptly address anomalies or failures in the production environment. Perform first-level investigation of issues, leveraging logs and monitoring tools.
Incident Management: Create and manage tickets for identified production issues, ensuring accurate documentation of details and impact analysis. Assign tickets to the appropriate development teams and follow up to ensure timely resolution. Communicate incidents within the Data Team.
Platform Support: Participate in daily standups and team meetings and contribute to platform improvement initiatives. Contribute to enhancing the platform to streamline development workflows and improve system usability.
Required Skills: Bachelor's degree with a minimum of 1 year of experience supporting production pipelines. Proficiency in SQL for debugging tasks. Familiarity with incident management tools like JIRA. Strong communication skills to interact with cross-functional teams and stakeholders.
Good to have: Hands-on experience with Google Cloud Platform (GCP) services like BigQuery. Strong understanding of Apache Airflow and managing DAGs. Basic understanding of DevOps practices and automating CI/CD pipelines. Python proficiency.
Note: This role requires candidates to work UK hours. Saturday and Sunday are working days; rotational offs will be provided.
Qualifications: Bachelor's degree (full-time).
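Purely as an illustration of the kind of production DAG this role would monitor, a hypothetical Airflow sketch with a failure callback that could raise an incident ticket; the JIRA call is left as a stub and all names are made up.

```python
# Hypothetical monitoring-friendly DAG: one load task plus a failure callback.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def notify_incident(context):
    # In practice this would create a JIRA ticket and alert the Data Team.
    print(f"Task {context['task_instance'].task_id} failed; raising incident.")

def run_daily_load():
    print("Running the daily BigQuery load...")  # placeholder for real pipeline logic

with DAG(
    dag_id="daily_load_with_alerting",
    start_date=datetime(2025, 1, 1),
    schedule_interval="0 2 * * *",
    catchup=False,
    default_args={
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_incident,
    },
) as dag:
    PythonOperator(task_id="daily_load", python_callable=run_daily_load)
```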
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation
4+ years of experience in Data Engineering and data management with a strong focus on Spark for building production-ready data pipelines. Experienced in analyzing large data sets from multiple data sources and building automated testing and validations. Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, Sqoop. Strong Python experience. Hands-on SQL and HQL to write optimized queries. Strong hands-on experience with GCP: Big Query, Data Proc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam. Ability to work in a fast-paced collaborative environment and work with various stakeholders to define strategic optimization initiatives. Deep understanding of distributed computing, memory tuning, and Spark optimization. Familiar with CI/CD workflows and Git. Experience in designing modular, automated, and secure ETL frameworks.
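For illustration, a minimal PySpark sketch of the automated-validation work listed above; the GCS paths and column names are assumptions, not from the posting.

```python
# Minimal sketch: read raw data, run basic automated checks, write curated output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_validation").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/raw/orders/")

# Automated checks: null business keys and duplicate order ids.
null_keys = orders.filter(F.col("order_id").isNull()).count()
duplicates = orders.groupBy("order_id").count().filter(F.col("count") > 1).count()
if null_keys or duplicates:
    raise ValueError(f"Validation failed: {null_keys} null keys, {duplicates} duplicate ids")

(orders.withColumn("load_date", F.current_date())
       .write.mode("overwrite")
       .parquet("gs://example-bucket/curated/orders/"))
```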
Posted 1 month ago
4.0 - 9.0 years
11 - 19 Lacs
Chennai
Work from Office
Role & responsibilities: Python, Dataproc, Airflow, PySpark, Cloud Storage, DBT, DataForm, NAS, Pub/Sub, Terraform, API, Big Query, Data Fusion, GCP, Tekton
Preferred candidate profile: Data Engineer in Python - GCP
Location: Chennai only
4+ years of experience
Posted 1 month ago
6.0 - 9.0 years
19 - 25 Lacs
Hyderabad
Hybrid
Role & responsibilities: Driving project delivery proactively, balancing planning, scope, schedule, budget, communications and risks. Managing and planning resources, responsibilities and schedules. Establishing effective project controls, procedures and quality assurance processes. Managing relationships with internal and external stakeholders. Reporting progress, issues, dependencies and risks to project or programme leadership and committees (as appropriate) and making recommendations to influence decision making, in order to maintain progress towards delivery and benefits realisation. Providing management updates to maintain a focus on how the project aligns to wider programme objectives, where appropriate, and to the change portfolio across HSBC. Driving the adoption of HSBC project standards and working in alignment with HSBC project methodology at all times. Leading the team to meet performance targets aligned to objectives.
Preferred candidate profile: Project management office experience in status, issues, risk and dependencies reporting. Proven track record of successful project delivery with quantifiable business benefits. Mature and independent, able to work with teams with minimal supervision. Excellent written and verbal communicator. Proactive builder of strong and diverse business relationships. Experience of core HSBC Finance tools (e.g. TM1, Saracen, etc.). Proficient in Microsoft Office applications (Word, Excel, Visio, PowerPoint, Teams).
Posted 1 month ago
8.0 - 10.0 years
25 - 27 Lacs
Chennai
Work from Office
Role & responsibilities: Python, Data Warehousing, Big Query, SQL, Airflow, GCP
Posted 1 month ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a Technical Architect specializing in Healthcare Data Analytics with expertise in Google Cloud Platform (GCP). The role involves designing and implementing data solutions tailored to healthcare analytics requirements. The ideal candidate will have experience in GCP tools like BigQuery, Dataflow, Dataprep, and Healthcare APIs, and should stay up to date with GCP updates. Knowledge of healthcare data standards, compliance requirements (e.g., HIPAA), and healthcare interoperability is essential. The role requires experience in microservices, containerization (Docker, Kubernetes), and programming languages like Python and Spark. The candidate will lead the implementation of data analytics solutions and collaborate with cross-functional teams, data scientists, and engineers to deliver secure, scalable systems.
Posted 1 month ago
2.0 - 4.0 years
4 - 9 Lacs
Chennai
Work from Office
Role & responsibilities: React, JavaScript, Application Support, Big Query, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services
Posted 1 month ago
4.0 - 9.0 years
14 - 20 Lacs
Hyderabad
Work from Office
Interested candidates, share your updated CV to dikshith.nalapatla@motivitylabs.com
Job Title: GCP Data Engineer
Overview: We are looking for a skilled GCP Data Engineer with 4 to 5 years of real hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment.
Key Responsibilities: Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services. Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements. Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows. Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services. Develop and implement data and semantic interoperability specifications. Work closely with business teams to define and scope requirements. Analyze existing systems to identify appropriate data sources and drive continuous improvement. Implement and continuously enhance automation processes for data ingestion and data transformation. Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines. Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.
Skills and Qualifications: Overall 4-5 years of hands-on experience as a Data Engineer, with at least 3 years of direct GCP Data Engineering experience. Strong SQL and Python development skills are mandatory. Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies. Demonstrated knowledge of and experience with Google Cloud BigQuery is a must. Experience with Dataproc and Dataflow is highly preferred. Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks. Extensive experience in SQL across various database platforms. Experience in data mapping and data modeling. Familiarity with data analytics tools and best practices. Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX Shell. Practical experience with Google Cloud services including but not limited to: BigQuery, BigTable, Cloud Dataflow, Cloud Dataproc, Cloud Storage, Pub/Sub, Cloud Functions, Cloud Composer, Cloud Spanner, Cloud SQL. Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark). Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc. Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP. GCP Data Engineer Certification is highly preferred.
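By way of illustration, a hedged sketch of one basic ingestion step from the responsibilities above: loading CSV files from Cloud Storage into a BigQuery table. The bucket, dataset and table names are placeholders.

```python
# Hedged sketch using the google-cloud-bigquery client: GCS CSV -> BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                 # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/landing/customers/*.csv",
    "my-project.staging.customers",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete
print(f"Loaded {client.get_table('my-project.staging.customers').num_rows} rows")
```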
Interested candidates, share your updated CV to dikshith.nalapatla@motivitylabs.com along with the following details:
Total Experience:
Relevant Experience:
Current Role / Skillset:
Current CTC: Fixed: / Variables (if any): / Bonus (if any):
Payroll Company (Name):
Client Company (Name):
Expected CTC:
Official Notice Period:
Serving Notice (Yes / No):
CTC of offer in hand:
Last Working Day (in current organization):
Location of the Offer in hand:
Willing to work from office:
************* 5 DAYS WORK FROM OFFICE MANDATORY ****************
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Primary skills: GCP, Python (coding is a must), SQL coding skills, Big Query, Dataflow, Airflow, Kafka, and Airflow DAGs.
Bachelor's degree or equivalent experience in Computer Science or a related field.
Required: Immediate joiners or 15 days' notice.
Job Description: 3+ years of experience as a software engineer or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps. Experience in at least one programming language (Python, with 2+ years of strong coding, is a must) or Java. Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies. Experience designing and interacting with APIs (REST/GraphQL). Experience working with cloud platforms such as GCP and Big Query. Experience in DevOps processes/tooling (CI/CD, GitHub Actions), using version control systems (Git strongly preferred), and working in a remote software development environment. Strong analytical, problem-solving and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
Must have: Experience in pipeline orchestration (e.g. Airflow). Must have: At least 1 year of experience in Dataflow.
Preferred: Experience using infrastructure-as-code frameworks (Terraform). Preferred: Experience using big data tools such as Spark/PySpark. Preferred: Experience using or deploying MLOps systems/tooling (e.g. MLflow). Preferred: Experience in an additional programming language (JavaScript, Java, etc.). Preferred: Experience using data science/machine learning technologies.
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Faridabad
Work from Office
Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.
Key Responsibilities: Design and implement scalable data models using Snowflake and Erwin Data Modeler. Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. Ensure performance tuning, security, and optimization of the Snowflake data warehouse. Document metadata, data lineage, and business logic behind data structures and flows. Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.
Must-Have Skills: Snowflake architecture, schema design, and data warehouse experience. DBT (Data Build Tool) for data transformation and pipeline development. Strong expertise in SQL (query optimization, complex joins, window functions, etc.). Hands-on experience with Erwin Data Modeler (logical and physical modeling). Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.
Good to Have: Experience with CI/CD tools and DevOps for data environments. Familiarity with data governance, security, and privacy practices. Exposure to Agile methodologies and working in distributed teams. Knowledge of Python for data engineering tasks and orchestration scripts.
Soft Skills: Excellent problem-solving and analytical skills. Strong communication and stakeholder management. Self-driven with the ability to work independently in a remote setup.
Skills: GCP, Erwin, DBT, SQL, data modeling, DBeaver, BigQuery, query optimization, Dataflow, Cloud Storage, Snowflake, Erwin Data Modeler, data pipelines, data transformation, data modeler
Posted 1 month ago
4.0 - 9.0 years
0 - 2 Lacs
Chennai
Hybrid
Job Description: We are seeking a skilled and proactive GCP Data Engineer with strong experience in Python and SQL to build and manage scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will work closely with data analysts, architects, and business teams to enable data-driven decision-making.
Key Responsibilities: Design and develop robust data pipelines and ETL/ELT processes using GCP services. Write efficient Python scripts for data processing, transformation, and automation. Develop complex SQL queries for data extraction, aggregation, and analysis. Work with tools like BigQuery, Cloud Storage, Cloud Functions, and Pub/Sub. Ensure high data quality, integrity, and governance across datasets. Optimize data workflows for performance and scalability. Collaborate with cross-functional teams to define and deliver data solutions. Monitor, troubleshoot, and resolve issues in data workflows and pipelines.
Required Skills: Hands-on experience with Google Cloud Platform (GCP). Strong programming skills in Python for data engineering tasks. Advanced proficiency in SQL for working with large datasets. Experience with BigQuery, Cloud Storage, and Cloud Functions. Familiarity with streaming and batch processing (e.g., Pub/Sub, Dataflow, or Dataproc).
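As an illustrative sketch (not from the posting) of the Cloud Functions + Pub/Sub work mentioned above, a 1st-gen Pub/Sub-triggered Cloud Function that applies a simple data-quality gate; the field names are assumptions.

```python
# Hedged sketch of a background Cloud Function triggered by a Pub/Sub message.
import base64
import json

def process_event(event, context):
    """Entry point: `event["data"]` carries the base64-encoded Pub/Sub payload."""
    record = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    if "customer_id" not in record:
        # Raising makes the failure visible in Cloud Logging / Error Reporting.
        raise ValueError(f"Rejected record without customer_id: {record}")
    print(f"Accepted record for customer {record['customer_id']}")
```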
Posted 1 month ago
3.0 - 6.0 years
5 - 10 Lacs
Mumbai
Work from Office
Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines in Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate must have a good understanding of ML algorithms and experience in model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously improving. You will interact with multiple technical teams, including architects and business stakeholders, to develop state-of-the-art machine learning systems that create value for the business.
Responsibilities: Managing the deployment and maintenance of machine learning models in production environments and ensuring seamless integration with existing systems. Monitoring model performance using metrics such as accuracy, precision, recall, and F1 score, and addressing issues like performance degradation, drift, or bias. Troubleshooting and resolving problems, maintaining documentation, and managing model versions for audit and rollback. Analyzing monitoring data to preemptively identify potential issues and providing regular performance reports to stakeholders. Optimizing queries and pipelines. Modernizing applications whenever required.
Qualifications: Expertise in programming languages like Python and SQL. Solid understanding of MLOps best practices and concepts for deploying enterprise-level ML systems. Understanding of Machine Learning concepts, models, and algorithms, including traditional regression, clustering models, and neural networks (including deep learning, transformers, etc.). Understanding of model evaluation metrics, model monitoring tools, and practices. Experience with GCP tools like BigQuery ML, MLOps, Vertex AI Pipelines (Kubeflow Pipelines on GCP), Model Versioning & Registry, Cloud Monitoring, Kubernetes, etc. Solid oral and written communication skills and ability to prepare detailed technical documentation of new and existing applications. Strong ownership and collaborative qualities. Takes initiative to identify and drive opportunities for improvement and process streamlining. Bachelor's Degree in a quantitative field of mathematics, computer science, physics, economics, engineering, statistics (operations research, quantitative social science, etc.), international equivalent, or equivalent job experience.
Bonus Qualifications: Experience in Azure MLOps. Familiarity with cloud billing. Experience in setting up or supporting NLP, Gen AI, and LLM applications with MLOps features. Experience working in an Agile environment and understanding of Lean Agile principles.
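A small illustrative sketch of the monitoring metrics named above (accuracy, precision, recall, F1), computed with scikit-learn on a labelled evaluation batch; the labels below are made-up sample data, not from the posting.

```python
# Hedged sketch: evaluation metrics for a deployed classifier on one feedback batch.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground truth collected from a feedback loop
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # the deployed model's predictions

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    "precision": precision_score(y_true, y_pred),
    "recall": recall_score(y_true, y_pred),
    "f1": f1_score(y_true, y_pred),
}
print(metrics)  # in production these would feed Cloud Monitoring or a Looker dashboard
```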
Posted 1 month ago
6.0 - 10.0 years
15 - 30 Lacs
Pune, Maharashtra, India
On-site
We are looking for an experienced GCP and BigQuery professional to join our team in India. The ideal candidate will have a solid background in data engineering and analytics, with expertise in designing scalable data solutions on the Google Cloud Platform. Responsibilities Design, develop and maintain scalable data pipelines using Google Cloud Platform (GCP) and BigQuery. Analyze and interpret complex datasets to provide actionable insights to stakeholders. Collaborate with data engineers and analysts to optimize data storage and retrieval processes. Implement data quality checks and ensure the accuracy of data in BigQuery. Create and manage dashboards and reports to visualize data findings effectively. Stay up-to-date with the latest developments in GCP and BigQuery to leverage new features for business needs. Skills and Qualifications 6-10 years of experience in data engineering or analytics with a focus on Google Cloud Platform (GCP) and BigQuery. Strong proficiency in SQL and experience with BigQuery optimizations. Experience with ETL tools and data pipeline orchestration (e.g., Apache Airflow, Cloud Dataflow). Familiarity with programming languages such as Python or Java for data manipulation and analysis. Knowledge of data modeling, data warehousing concepts, and best practices. Understanding of data privacy and security standards in cloud environments.
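For illustration, a hedged sketch of one of the BigQuery data-quality checks mentioned above: measuring the null rate of a key column with a parameterised query and failing above a threshold. The table, column and threshold are invented for the example.

```python
# Hedged sketch of a parameterised BigQuery data-quality check.
from google.cloud import bigquery

client = bigquery.Client()

job = client.query(
    """
    SELECT SAFE_DIVIDE(COUNTIF(customer_id IS NULL), COUNT(*)) AS null_rate
    FROM `my-project.sales.orders`
    WHERE load_date = @run_date
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2025-01-01")]
    ),
)
null_rate = list(job.result())[0].null_rate or 0.0
if null_rate > 0.01:  # illustrative 1% threshold
    raise ValueError(f"customer_id null rate too high: {null_rate:.2%}")
print(f"Quality check passed (null rate {null_rate:.2%})")
```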
Posted 1 month ago
5.0 - 10.0 years
8 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Google Cloud Platform: GCS, DataProc, Big Query, Data Flow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, Big Query, Composer, data processing like Data Flow)
Mandatory Key Skills: Google Cloud Platform, GCS, DataProc, Big Query, Data Flow, Composer, Data Processing, Java
Posted 1 month ago
4.0 - 9.0 years
10 - 20 Lacs
Gurugram
Work from Office
Roles and Responsibilities: Design, develop, test, deploy, and maintain ETL processes using SSIS to extract data from various sources. Develop complex SQL queries to retrieve data from relational databases such as SQL Server. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Troubleshoot issues related to ETL process failures or performance problems. Ensure compliance with security standards by implementing the Denodo Platform for data masking.
Desired Candidate Profile: 4-9 years of experience in ETL development with expertise in Agile methodology. Strong understanding of .NET Core, C#, Microsoft Azure, Big Query, SSRS (SQL Reporting Services), SSIS (SQL Server Integration Services). B.Tech/B.E. degree in any specialization. Hands-on experience with databases (MS SQL Server, Big Query, Denodo). Experience in .NET / Visual Studio (SSRS, SSIS & ETL packages). Good knowledge of requirement elicitation, from workshops/meetings to Agile board epics/features/stories.
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Pune, Chennai, Bengaluru
Work from Office
As a GCP data engineer, the colleague should be able to design scalable data architectures on Google Cloud Platform, using services like Big Query and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to maintain accurate and reliable data. Security is a priority, enforcing measures for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key to understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures. Responsibilities extend to optimizing data pipelines and queries for performance, troubleshooting issues, and proactively monitoring data accuracy. Continuous learning is emphasized to stay updated on GCP features and industry best practices, ensuring a current and effective data engineering approach.
Experience
- Proficiency in programming languages: Python, PySpark
- Expertise in data processing frameworks: Apache Beam (Dataflow)
- Active experience with GCP tools and technologies like Big Query, Dataflow, Cloud Composer, Cloud Spanner, GCS, DBT, etc.
- Data engineering skillset using Python, SQL
- Experience in ETL (Extract, Transform, Load) processes
- Knowledge of DevOps tools like Jenkins, GitHub, Terraform is desirable; should have good knowledge of Kafka (batch/streaming)
- Understanding of data models and experience in performing ETL design and build, and database replication using message-based CDC
- Familiarity with cloud storage solutions
- Strong problem-solving abilities in data engineering challenges
- Understanding of data security and scalability
- Proficiency in relevant tools like Apache Airflow
Desirables
- Knowledge of data modelling and database design
- Good understanding of Cloud Security
- Proven practical experience of using the Google Cloud SDK to deliver APIs and automation
- Crafting continuous integration and continuous delivery/deployment tooling pipelines (Jenkins/Spinnaker)
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Qualification & Experience: Minimum of 8 years of experience as a Data Scientist/Engineer with demonstrated expertise in data engineering and cloud computing technologies.
Technical Responsibilities: Excellent proficiency in Python, with a strong focus on developing advanced skills. Extensive exposure to NLP and image processing concepts. Proficient in version control systems like Git. In-depth understanding of Azure deployments. Expertise in OCR, ML model training, and transfer learning. Experience working with unstructured data formats such as PDFs, DOCX, and images. Strong familiarity with data science best practices and the ML lifecycle. Strong experience with data pipeline development, ETL processes, and data engineering tools such as Apache Airflow, PySpark, or Databricks. Familiarity with cloud computing platforms like Azure, AWS, or GCP, including services like Azure Data Factory, S3, Lambda, and BigQuery.
Tool Exposure: Advanced understanding and hands-on experience with Git, Azure, Python, R programming, and data engineering tools such as Snowflake, Databricks, or PySpark.
Data mining, cleaning and engineering: Leading the identification and merging of relevant data sources, ensuring data quality, and resolving data inconsistencies.
Cloud Solutions Architecture: Designing and deploying scalable data engineering workflows on cloud platforms such as Azure, AWS, or GCP.
Data Analysis: Executing complex analyses against business requirements using appropriate tools and technologies.
Software Development: Leading the development of reusable, version-controlled code under minimal supervision.
Big Data Processing: Developing solutions to handle large-scale data processing using tools like Hadoop, Spark, or Databricks.
Principal Duties & Key Responsibilities: Leading data extraction from multiple sources, including PDFs, images, databases, and APIs. Driving optical character recognition (OCR) processes to digitize data from images. Applying advanced natural language processing (NLP) techniques to understand complex data. Developing and implementing highly accurate statistical models and data engineering pipelines to support critical business decisions and continuously monitoring their performance. Designing and managing scalable cloud-based data architectures using Azure, AWS, or GCP services. Collaborating closely with business domain experts to identify and drive key business value drivers. Documenting model design choices, algorithm selection processes, and dependencies. Effectively collaborating in cross-functional teams within the CoE and across the organization. Proactively seeking opportunities to contribute beyond assigned tasks.
Required Competencies: Exceptional communication and interpersonal skills. Proficiency in Microsoft Office 365 applications. Ability to work independently, demonstrate initiative, and provide strategic guidance. Strong networking, communication, and people skills. Outstanding organizational skills with the ability to work independently and as part of a team. Excellent technical writing skills. Effective problem-solving abilities. Flexibility and adaptability to work flexible hours as required.
Key competencies / Values:
Client Focus: Tailoring skills and understanding client needs to deliver exceptional results.
Excellence: Striving for excellence defined by clients, delivering high-quality work.
Trust: Building and retaining trust with clients, colleagues, and partners.
Teamwork: Collaborating effectively to achieve collective success.
Responsibility: Taking ownership of performance and safety, ensuring accountability.
People: Creating an inclusive environment that fosters individual growth and development.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Knowledge of design principles and fundamentals of architecture. Understanding of performance engineering. Knowledge of quality processes and estimation techniques. Basic understanding of the project domain. Ability to translate functional/nonfunctional requirements into system requirements. Ability to design and code complex programs. Ability to write test cases and scenarios based on the specifications. Good understanding of SDLC and agile methodologies. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate.
Technical and Professional Requirements: Technology->Cloud Platform->GCP Database->Google BigQuery
Preferred Skills: Technology->Cloud Platform->GCP Data Analytics->Looker; Technology->Cloud Platform->GCP Database->Google BigQuery
Posted 1 month ago
3.0 - 6.0 years
8 - 14 Lacs
Bengaluru
Work from Office
- Minimum of 3 years of hands-on experience.
- Python/ML, Hadoop, Spark: minimum of 2 years of experience.
- At least 3 years of prior experience as a Data Analyst.
- Detail-oriented with structured thinking and an analytical mindset.
- Proven analytic skills, including data analysis, data validation, and technical writing.
- Strong proficiency in SQL and Excel.
- Experience with Big Query is mandatory.
- Knowledge of Python and machine learning algorithms is a plus.
- Excellent communication skills with the ability to be precise and clear.
- Learning ability: ability to quickly learn and adapt to new analytic tools and technologies.
Key Responsibilities:
Data Analysis: Perform comprehensive data analysis using SQL, Excel, and Big Query. Validate data integrity and ensure accuracy across datasets. Develop detailed reports and dashboards that provide actionable insights. Create and deliver presentations to stakeholders with clear and concise findings. Document queries, reports, and analytical processes clearly and accurately. Leverage Python/ML for advanced data analysis and model development. Utilize Hadoop and Spark for handling and processing large datasets. Work closely with cross-functional teams to understand data requirements and provide analytical support. Communicate findings effectively and offer recommendations based on data analysis.
Education: Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
Experience: Minimum of 3 years of experience as a Data Analyst with a strong focus on SQL, Excel, and Big Query.
Technical Skills: Proficiency in SQL, Excel, and Big Query; experience with Python, ML, Hadoop, and Spark is preferred.
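A small sketch, assuming the google-cloud-bigquery client with pandas support (the db-dtypes package), of pulling an aggregate into a DataFrame for the kind of analysis described above; the dataset and column names are hypothetical.

```python
# Hedged sketch: BigQuery aggregate pulled into pandas for ad-hoc analysis.
from google.cloud import bigquery

client = bigquery.Client()

df = client.query(
    """
    SELECT campaign, SUM(clicks) AS clicks, SUM(cost) AS cost
    FROM `my-project.marketing.ad_performance`
    GROUP BY campaign
    """
).to_dataframe()

df["cpc"] = df["cost"] / df["clicks"]          # cost per click
print(df.sort_values("cpc").head(10))          # cheapest campaigns first
```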
Posted 1 month ago
8.0 - 10.0 years
20 - 30 Lacs
Chennai
Hybrid
Role & responsibilities:
GCP Services - Big Query, Data Flow, Dataproc, DataPlex, DataFusion, Terraform, Tekton, Cloud SQL, Redis Memory, Airflow, Cloud Storage
2+ years in Data Transfer Utilities
2+ years in Git / any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API Development
2+ years in Agile Framework
4+ years of strong experience in Python and PySpark development
4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting
Preferred candidate profile: Python, Dataflow, Dataproc, GCP Cloud Run, DataForm, Agile Software Development, Big Query, Terraform, Data Fusion, Cloud SQL, GCP, Kafka, Java.
We would like to inform you that only immediate joiners will be considered for this position due to project urgency.
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Work from Office
7+ years of experience with hands-on work in Java and GCP; Shell script and Python knowledge is a plus. In-depth knowledge of Java and Spring Boot. Experience in GCP Data Flow, Big Table, Big Query, etc.
Posted 1 month ago