10 - 15 years
0 - 0 Lacs
Chennai
Work from Office
About the Role
As a Senior Data Engineer, you'll be a core part of our engineering team, bringing valuable experience and knowledge to improve the technical quality of our data-focused products. This is a key role in helping us become more mature, deliver innovative new products, and unlock further business growth. You will join a newly formed team that collaborates with data team members based in Ireland, the USA, and India.

Following the successful delivery of some fantastic products in 2024, we have embarked upon a data-driven strategy in 2025. We have a huge amount of data and are keen to accelerate unlocking its value to delight our customers and colleagues. You will be tasked with delivering new data pipelines, producing actionable insights in automated ways, and enabling innovative new product features.

Reporting to our Team Lead, you will collaborate with the engineering and business teams. You'll work across all our brands, helping to shape their future direction. Working as part of a team, you will help shape the technical design of our platforms and solve complicated problems in elegant ways that are robust, scalable, and secure. We don't get everything right the first time, but you will help us reflect, adjust, and be better next time around. We are looking for people who are inquisitive, confident exploring unfamiliar problems, and passionate about learning. We don't have all the answers and don't expect you to know everything either. Our team culture is open, inclusive, and collaborative; we tackle goals together. Seeking the best solution to a problem, we actively welcome ideas and opinions from everyone in the team.

Our Technologies
We are continuously evolving our products and exploring new opportunities, and we focus on selecting the right technologies to solve the problem at hand. We know the technologies we'll be using in three years' time will probably be quite different from what we're using today. You'll be a key contributor to evolving our tech stack over time. Our data pipelines are currently based upon Google BigQuery, Fivetran, and dbt Cloud, involving advanced SQL alongside Python in a variety of areas. We don't need you to be an expert with these technologies, but it will help if you're strong with something similar.

Your Skills and Experience
This is an important role for us as we scale up the team, and we are looking for someone with existing experience at this level. You will have worked with data-driven platforms that involve some kind of transaction, such as eCommerce, trading platforms, or advertising lead generation. Your broad experience and knowledge of data engineering methods mean you're able to build high-quality products regardless of the language used, with solutions that avoid common pitfalls impacting the platform's technical performance. You can apply automated approaches for tracking and measuring quality throughout the whole lifecycle, through to the production environments. You are comfortable working with complex and varied problems. As a strong communicator, you work well with product owners and business stakeholders. You're able to influence and persuade others by listening to their views, explaining your own thoughts, and working to achieve agreement. We have many automotive industry experts within our team already and they are eager to teach you everything you need to know for this role. Any existing industry knowledge is a bonus but is not necessary.
This is a full-time role based in our India office on a semi-flexible basis. Our engineering team is globally distributed but we’d like you to be accessible to the office for ad-hoc meetings and workshops.
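For illustration, here is a minimal sketch of the BigQuery-plus-Python work this posting's stack implies, using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, not taken from the posting.

```python
# Hypothetical example: querying a BigQuery table with the google-cloud-bigquery client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project id

sql = """
    SELECT brand, COUNT(*) AS listings
    FROM `my-project.analytics.vehicle_listings`   -- hypothetical dataset/table
    WHERE listed_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY brand
    ORDER BY listings DESC
"""

for row in client.query(sql).result():  # runs the query job and waits for completion
    print(f"{row.brand}: {row.listings}")
```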
Posted 1 month ago
4 - 8 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
Monk is leading the way in AI-driven innovation with its advanced damage detection technology for vehicles, enabling seamless integration into web and native applications via APIs. Our machine learning expertise is trusted by global leaders like our parent company ACV Auctions (USA), CAT Logistics (Europe), Getaround (France), Autobiz (Europe), and Hgreg (Canada). We are looking for a Machine Learning Scientist in Computer Vision who thrives on taking ownership, delivering impactful results, and driving innovation. If you're someone who can align with business needs, ensure timely delivery, and elevate the team's capabilities, we'd love to have you on board. This role is your opportunity to lead game-changing projects, create tangible value, and shape the future of AI technology in the automotive space.

What you will do:
Own the end-to-end development of machine learning models and datasets to solve critical business challenges and meet product needs. Drive innovation by continuously tracking and integrating state-of-the-art advancements in computer vision and deep learning into Monk's AI solutions. Identify and solve key business problems with data-driven, actionable insights. Deliver high-impact results through meticulous planning, rigorous testing, and smooth deployment of machine learning solutions. Mentor and support junior team members, cultivating a culture of excellence and collaboration. Collaborate with product, business, and technical teams to align on priorities, estimate efforts, and adapt to feedback. Ensure long-term reliability and scalability through robust documentation and testing practices.

Tech Stack Utilized:
Languages: Python, SQL. Libraries/Frameworks: PyTorch ecosystem, a touch of OpenCV and Scikit-learn. Tools: Jupyter, DBT, Docker. Infrastructure: GCP, Kubernetes, BigQuery. Version Control: GitHub.

Background and Skills:
Required Skills: 5+ years of experience in computer vision, data science or related fields, with a proven track record of delivering impactful results. Strong foundation in machine learning, computer science, statistical modeling, and data processing pipelines. Proficiency in Python and SQL for data manipulation and model development. Solid experience deploying machine learning models into production environments. A proactive approach to aligning with business needs and driving team-wide innovation. Strong communication skills to explain technical concepts to non-technical stakeholders. Fluent in English, enabling effective collaboration across global teams.

Desired Background: Master's or Ph.D. in Data Science, Computer Science, or a related field. Experience in B2B SaaS, automotive technology, or AI-driven startups. Knowledge of working with distributed teams across time zones. A proactive mindset, with the ability to work autonomously in fast-paced environments.
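As a hedged illustration of the PyTorch-based computer vision work described above, the sketch below fine-tunes a pretrained torchvision backbone for a binary damaged/intact image classifier. The dataset path and class layout are assumptions; Monk's actual models and data are not public.

```python
# Illustrative sketch only: fine-tuning a pretrained backbone for a binary
# "damaged / intact" image classifier. Dataset path and labels are hypothetical.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder expects one subdirectory per class, e.g. damaged/ and intact/
train_ds = datasets.ImageFolder("data/train", transform=transform)
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: damaged / intact

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one pass over the data, for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```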
Posted 1 month ago
5 - 8 years
7 - 17 Lacs
Bengaluru
Work from Office
Experience: 5+ yrs. Skills: Data Visualization, Data Modelling, GCP, BigQuery, SQL, Looker dashboards.
Posted 1 month ago
6 - 11 years
0 - 1 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role: GCP Data Engineer
Experience: 6-12 yrs
Location: Chennai, Hyderabad, Bangalore, Pune, Gurgaon
Required Skillset:
=> Should have experience in BigQuery, Dataflow, Cloud SQL, Cloud Composer
=> Should have experience in Python, Vertex and Dataflow
Interested candidates can send resume to jegadheeswari.m@spstaffing.in or reach me @ 9566720836
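For context on the Cloud Composer skill named above, here is a minimal sketch of an Airflow 2.x DAG that schedules a daily BigQuery job. The DAG id, project, and table names are placeholders.

```python
# Hypothetical Cloud Composer (Airflow) DAG running a daily BigQuery query job.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_rollup",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS revenue
                    FROM `my-project.sales.orders`   -- placeholder table
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```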
Posted 1 month ago
11 - 21 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!
We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 3 - 20 Yrs
Location: Pan India
Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
5-20 years of overall experience, mainly in the data engineering space. 2+ years of hands-on experience in GCP cloud data implementation. Experience of working in client-facing roles in a technical capacity as an Architect. Must have implementation experience of a GCP-based cloud data project/program as a solution architect. Proficiency in using the Google Cloud Architecture Framework in a data context. Expert knowledge and experience of the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer etc. Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex etc. Expert-level knowledge of Spark. Extensive hands-on experience working with data using SQL and Python. Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise). Excellent communication skills with the ability to clearly present ideas, concepts, and solutions.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or you can reach me @ 8939853050
With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
15 - 21 years
15 - 19 Lacs
Bengaluru
Work from Office
Overview
We are seeking a highly skilled Senior Architect with extensive experience in leading Hadoop to BigQuery migration projects. As a key member of our team, you will be responsible for designing and implementing scalable solutions that leverage Google Cloud Platform (GCP) technologies, specifically focusing on BigQuery. Your expertise in Big Data technologies, Python scripting, and database migration will be crucial in ensuring a successful transition from Hadoop to BigQuery.

Responsibilities
Key Responsibilities: Lead the architecture and design phases of the Hadoop to BigQuery migration project. Develop strategies and solutions for efficient data ingestion, processing, and storage on GCP. Collaborate with cross-functional teams to ensure alignment with business requirements and technical specifications. Implement best practices for data governance, security, and performance optimization. Provide technical guidance and mentorship to junior team members.

Mandatory Skills: GCP Data Engineer Certification. Extensive experience with Big Data technologies and frameworks (e.g., Hadoop, BigQuery). Proficiency in Python for scripting and automation. Hands-on experience with Apache Airflow for workflow orchestration. Proven track record in database migration projects, particularly from Hadoop to BigQuery.

Optional Skills: Experience with Scala, PySpark, and Spark SQL. Proficiency in Java programming. Familiarity with Informatica or similar ETL tools.

Requirements
A Bachelor's degree in any field, with an MSc/BE/Master's desired, and 15+ years of experience in software development and architecture roles. Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Experience working in multi-channel delivery projects.
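As a hedged sketch of one common step in a Hadoop-to-BigQuery migration, the snippet below loads Parquet files (for example, exported from HDFS to Cloud Storage) into a BigQuery table with the Python client. The bucket, dataset, and table names are assumptions.

```python
# Illustrative migration step: loading Parquet files from Cloud Storage into BigQuery.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full reload
)

load_job = client.load_table_from_uri(
    "gs://my-migration-bucket/warehouse/events/*.parquet",  # hypothetical export path
    "my-project.analytics.events",                          # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-project.analytics.events")
print(f"Loaded {table.num_rows} rows")
```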
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Responsibilities: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Requirements: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 1 month ago
8 - 13 years
13 - 23 Lacs
Jaipur
Hybrid
Your skills and experience
We are looking for talents with a Degree (or equivalent) in Engineering, Mathematics, Statistics, or Sciences from an accredited college or university (or equivalent) to develop analytical solutions for our stakeholders to support strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation or a Data Science related domain is a plus. You have a natural curiosity for numbers and strong quantitative and logical thinking skills. You ensure results are of high data quality and accuracy. You have working experience on Google Cloud, have worked with cross-functional teams to enable data source and process migration to GCP, and have working experience with SQL. You are adaptable to emerging technologies, such as leveraging Machine Learning and AI to drive innovation. Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts and purchasing, preferably within a global and complex environment. You have the aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis and interpret the results by identifying trends and patterns. You enjoy the problem-solving process, think out of the box, and break down a problem into its constituent parts with a view to developing end-to-end solutions. You display enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills. You demonstrate working knowledge of different analytical tools, such as Tableau, databases, Alteryx, Pentaho, Looker and BigQuery, in order to work with large datasets and derive insights for decision making. You enjoy working in a team, and your English language skills are convincing, making it easy for you to work in an international environment and with global, virtual teams.
Posted 1 month ago
2 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solve the issues as per defined SLAs.
Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing the same in feature development.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Cloud data engineers with GCP PDE certification and working experience with GCP. Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions. Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation. Expertise in the Python coding language. Develops data engineering solutions on the Google Cloud ecosystem, and supports and maintains data engineering solutions on the Google Cloud ecosystem.

Preferred technical and professional experience:
Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools. Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices. Troubleshoot and debug issues, and deploy applications to the cloud platform.
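To make the Pub/Sub requirement concrete, here is a minimal publishing sketch using the google-cloud-pubsub client, assuming a pre-created topic; the project and topic names are placeholders.

```python
# Minimal sketch: publishing a JSON event to a pre-created Cloud Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "events")  # placeholder names

event = {"user_id": 42, "action": "page_view"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")  # blocks until the server acks
```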
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
About The Role
Process Manager - GCP Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager Roles and responsibilities:
Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows. Analyse business problems and propose data-driven solutions that meet stakeholder objectives. Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure). Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS and Azure). Design and implement robust data models to efficiently store, organize, and access data for diverse use cases. Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources. Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional). Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency.

Technical and Functional Skills:
Bachelor's Degree with 5+ years of experience, with relevant 3+ years of hands-on experience in GCP with BigQuery. Good knowledge of at least one database scripting platform (Oracle preferable). Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing and owning push-to-prod activities. 5+ years of work experience, having worked as an individual contributor for 5+ years. Direct interaction and deep diving with VPs of deployment. Should work with cross-functional teams/stakeholders. Participate in backlog grooming and prioritizing tasks. Worked on Scrum methodology. GCP certification desired.

About eClerx
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
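As an illustration of the "data quality checks" responsibility in this posting, a minimal sketch of a null-rate assertion run against BigQuery might look like the following; the table and column names are hypothetical.

```python
# Illustrative data-quality check: assert a key column has no NULLs before
# downstream use. Table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT COUNTIF(customer_id IS NULL) AS null_ids, COUNT(*) AS total
    FROM `my-project.sales.orders`
"""
row = next(iter(client.query(sql).result()))  # single-row result

if row.null_ids > 0:
    raise ValueError(f"{row.null_ids} of {row.total} rows have NULL customer_id")
print(f"Quality check passed on {row.total} rows")
```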
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Mumbai
Work from Office
About The Role
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager Role and responsibilities:
Set up user journey dashboards across customer touchpoints (web and mobile app) in Adobe Analytics, GA4 and Amplitude and identify pain points; familiarity with auditing tags using Omnibug, GA debugger and other relevant tools. Understanding and familiarity with cross-device analyses using combined reporting suites and virtual report suites, and familiarity with people metrics and data warehouses. Building complex segments in analytics tools by going through online user journeys and self-serving tag audits. Analysing customer journeys and recommending personalization tests on digital properties using Adobe Analytics, GA4, Amplitude or any equivalent tool; walking through analysis outcomes and coming up with ideas to optimize the digital user experience. Website and mobile app optimization consulting for client accounts across industries (customer journey analyses and personalization). Familiarity with website measurement strategy, identifying key KPIs and defining goals, integrating online and offline data, and segmentation strategies. Connect with clients for business requirements, walk through analysis outcomes and come up with ideas for optimization of the digital properties. Build analytical reports and dashboards using visualization tools like Looker Studio or Power BI.

Technical and Functional Skills:
Bachelor's Degree with overall experience of 6-10 years in digital analytics and optimization (Adobe Analytics, GA4, AppsFlyer and Amplitude). Specialism: Adobe Analytics or GA4, app analytics tools like Amplitude and AppsFlyer, and visualization tools. Expert: Looker Studio or Power BI. Certification in Adobe Analytics Business Practitioner preferred. Ability to drive business inference from quantitative and qualitative datasets. Ability to collaborate with stakeholders across the globe. Strong communication, creative and innovation skills to help develop offerings to meet market needs.
Posted 1 month ago
7 - 12 years
0 - 1 Lacs
Bengaluru
Work from Office
Job Description - GCP Data Engineer

Tech Stack:
GCP data integration and resource management. BigQuery. Dataform. Python (good to have). Terraform (nice to have). DevOps (a plus). Experience working with analytics products is a plus. Experience working with security and compliance teams and/or on DevOps.

Qualifications:
Formal qualifications in computer science, software engineering, or any engineering equivalent. Minimum 9+ years (Expert SWE) or 7 years (Senior SWE) of professional experience as a software engineer, with a similar level of experience in the specific tech stack for the area. Minimum 5 years (Expert SWE) or 3 years (Senior SWE) of experience working in agile/iterative software development teams with a DevOps working setup and an emphasis on self-organisation and delivery to agreed commitments. Demonstrable experience with cloud computing environments. Excellent written and verbal English communication skills.
Posted 1 month ago
8 - 13 years
10 - 15 Lacs
Jaipur, Rajasthan
Work from Office
Job Summary
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO:
Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources. Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications. Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency. Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations. Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities. Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management. Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability. Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR:
Strong proficiency in English (written and verbal communication) is required. Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones. 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures. Strong proficiency in SQL for data modeling, transformation, and performance optimization. Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio). Expertise in Python for data processing, automation, and pipeline development. Experience with cloud data platforms, particularly Google Cloud Platform (GCP). Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub. Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam. Familiarity with workflow orchestration tools like Dagster, Apache Airflow or Google Cloud Workflows. Understanding of data privacy, security, and compliance best practices. Strong problem-solving skills, with the ability to debug and optimize complex data workflows. Excellent communication and collaboration skills.

NICE TO HAVES:
Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis). Familiarity with machine learning workflows and MLOps best practices. Knowledge of Terraform for Infrastructure as Code (IaC) in data environments. Familiarity with data integrations involving Contentful, Algolia, Segment, and .
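For a flavor of the Dagster orchestration mentioned above, here is a minimal, hedged sketch of two software-defined assets where one depends on the other; the asset names and data are illustrative only.

```python
# Minimal Dagster sketch: two software-defined assets with a dependency between them.
import pandas as pd
from dagster import asset, materialize


@asset
def raw_orders() -> pd.DataFrame:
    # In a real pipeline this would pull from a source system or BigQuery.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.25]})


@asset
def daily_revenue(raw_orders: pd.DataFrame) -> float:
    # Downstream asset computed from the upstream one (wired by parameter name).
    return float(raw_orders["amount"].sum())


if __name__ == "__main__":
    result = materialize([raw_orders, daily_revenue])
    assert result.success
```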
Posted 1 month ago
8 - 12 years
20 - 25 Lacs
Gandhinagar
Remote
Requirement:
8+ years of professional experience as a data engineer and 2+ years of professional experience as a senior data engineer. Must have strong working experience in Python and its various data analysis packages (Pandas / NumPy). Must have a strong understanding of prevalent cloud ecosystems and experience in one of the cloud platforms: AWS / Azure / GCP. Must have strong working experience in one of the leading MPP databases: Snowflake / Amazon Redshift / Azure Synapse / Google BigQuery. Must have strong working experience in one of the leading data orchestration tools in the cloud: Azure Data Factory / AWS Glue / Apache Airflow. Must have experience working with Agile methodologies, Test Driven Development, and implementing CI/CD pipelines using one of the leading services: GitLab / Azure DevOps / Jenkins / AWS CodePipeline / Google Cloud Build. Must have Data Governance / Data Management / Data Quality project implementation experience. Must have experience in big data processing using Spark. Must have strong experience with SQL databases (SQL Server, Oracle, Postgres etc.). Must have stakeholder management experience and very good communication skills. Must have working experience on end-to-end project delivery, including requirement gathering, design, development, testing, deployment, and warranty support. Must have working experience with various testing levels, such as unit testing, integration testing and system testing. Working experience with large, heterogeneous datasets in building and optimizing data pipelines and pipeline architectures.

Nice to have Skills:
Working experience in Databricks notebooks and managing Databricks clusters. Experience in a data modelling tool such as Erwin or ER Studio. Experience in one of the data architectures, such as Data Mesh or Data Fabric. Has handled real-time or near-real-time data. Experience in one of the leading reporting and analysis tools, such as Power BI, Qlik, Tableau or Amazon QuickSight. Working experience with API integration. General insurance / banking / finance domain understanding.
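As a small illustration of the Spark requirement, the following PySpark sketch reads a CSV dataset, aggregates it, and writes Parquet; the bucket paths and column names are assumptions.

```python
# Hedged PySpark sketch: read CSV, aggregate, write Parquet. Paths are placeholders
# (reading s3a:// paths additionally requires the hadoop-aws package on the cluster).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-aggregation").getOrCreate()

claims = spark.read.csv("s3a://my-bucket/claims/*.csv", header=True, inferSchema=True)

summary = (
    claims.groupBy("policy_type")
    .agg(F.count("*").alias("claims"), F.sum("claim_amount").alias("total_paid"))
)

summary.write.mode("overwrite").parquet("s3a://my-bucket/output/claims_summary")
spark.stop()
```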
Posted 1 month ago
6 - 10 years
15 - 20 Lacs
Gurugram
Remote
Title: Looker Developer
Team: Data Engineering
Work Mode: Remote
Shift Time: 3:00 PM - 12:00 AM IST
Contract: 12 months

Key Responsibilities
Collaborate closely with engineers, architects, business analysts, product owners, and other team members to understand the requirements and develop test strategies.
LookML Proficiency: LookML is Looker's proprietary language for defining data models. Looker developers need to be able to write, debug, and maintain LookML code to create and manage data models, explores, and dashboards.
Data Modeling Expertise: Understanding how to structure and organize data within Looker is essential. This involves mapping database schemas to LookML, creating views, and defining measures and dimensions.
SQL Knowledge: Looker generates SQL queries under the hood. Developers need to be able to write SQL to understand the data, debug queries, and potentially extend LookML with custom SQL.
Looker Environment: Familiarity with the Looker interface, including the IDE, LookML Validator, and SQL Runner, is necessary for efficient development.

Education and/or Experience
Bachelor's degree in MIS, Computer Science, Information Technology or equivalent required. 6+ years of IT industry experience in the data management field.
Posted 1 month ago
4 - 8 years
15 - 30 Lacs
Bengaluru
Remote
Job Title: Senior GCP Data DevOps Engineer
Job Type: Remote
Exp: 4+ years

Position Overview:
As a Senior DevOps Engineer specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and managing our cloud infrastructure to ensure optimal performance, scalability, and reliability. You will collaborate closely with cross-functional teams to streamline development processes, automate deployment pipelines, and enhance overall system efficiency.

Responsibilities:
Design, implement, and manage scalable and highly available cloud infrastructure on Google Cloud Platform (GCP) to support our applications and services. Develop and maintain CI/CD pipelines to automate the deployment, testing, and monitoring of applications and microservices. Collaborate with software engineering teams to optimize application performance, troubleshoot issues, and ensure smooth deployment processes. Implement and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, or Google Deployment Manager. Monitor system health, performance, and security metrics, and implement proactive measures to ensure reliability and availability. Implement best practices for security, compliance, and data protection in cloud environments. Continuously evaluate emerging technologies and industry trends to drive innovation and improve infrastructure efficiency. Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
Bachelor's degree in Computer Science, Engineering, or related field. 4-8 years of experience in a DevOps role, with a focus on Google Cloud Platform (GCP). In-depth knowledge of GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, Pub/Sub, and BigQuery. Proficiency in scripting languages such as Python, Bash, or PowerShell. Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes. Strong understanding of CI/CD concepts and experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI. Solid understanding of infrastructure as code (IaC) principles and experience with tools such as Terraform, Ansible, or Google Deployment Manager. Experience with monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or ELK Stack. Knowledge of security best practices and experience implementing security controls in cloud environments. Excellent problem-solving skills and ability to troubleshoot complex issues in distributed systems. Strong communication skills and ability to collaborate effectively with cross-functional teams.

Preferred Qualifications:
Google Cloud certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect). Experience with other cloud platforms such as AWS or Azure. Familiarity with agile methodologies and DevOps practices. Experience with software development using languages such as Java, Node.js, or Go. Knowledge of networking concepts and experience with configuring network services in cloud environments.

Skills: GCP, Cloud SQL, BigQuery, Kubernetes, IaC tools, CI/CD pipelines, Terraform, Python, Airflow, Snowflake, Power BI, Dataflow, Pub/Sub, Cloud Storage, Cloud Computing
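One hedged example of how the Python-scripting and Terraform requirements can meet: a small wrapper that drives a Terraform workspace non-interactively. The working directory is an assumption; the flags are standard Terraform CLI options.

```python
# Hypothetical automation glue: a Python script that runs a Terraform
# init/plan/apply cycle non-interactively and fails loudly on errors.
import subprocess


def terraform(*args: str, cwd: str = "infra/gcp") -> None:
    """Run a terraform subcommand in the given directory; raise on non-zero exit."""
    subprocess.run(["terraform", *args], cwd=cwd, check=True)


if __name__ == "__main__":
    terraform("init", "-input=false")
    terraform("plan", "-out=tfplan", "-input=false")
    terraform("apply", "-input=false", "tfplan")
```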
Posted 1 month ago
3 - 6 years
9 - 13 Lacs
Bengaluru
Work from Office
Location: Tower 02, Manyata Embassy Business Park, Racenahali & Nagawara Villages, Outer Ring Rd, Bangalore 540065 | Time type: Full time | Posted 5 Days Ago | Job requisition ID: R0000388711

About us:
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII:
At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Team Overview:
Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects and product managers striving to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing and are eager to contribute to data engineering.

Position Overview:
Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams. Analyze technical issues and questions, identifying data needs and delivery mechanisms. Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies. Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components. Develop test-driven solutions, provide technical guidance, and contribute heavily to a team of high-caliber Data Engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion.
Manage and execute against agile plans and set deadlines based on client, business, and technical requirements. Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations. Ensure all code adheres to development and security standards.

About you:
4-year degree or equivalent experience. 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark etc.). Hands-on experience in object-oriented or functional programming such as Scala / Java / Python. Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server). Knowledge of data integration design using API and streaming technologies (Kafka) as well as ETL and other data integration patterns. Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery will be an added advantage. Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR or Databricks). Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus. Familiarity with data warehousing concepts and technologies. Maintains technical knowledge within areas of expertise. Constant learner and team player who enjoys solving tech challenges with a global team. Hands-on experience in building complex data pipelines and flow optimizations. Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront. Experience with test-driven development and software test automation. Follows best coding practices and engineering guidelines as prescribed. Strong written and verbal communication skills, with the ability to present complex technical information in a clear and concise manner to a variety of audiences.
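To illustrate the Kafka streaming-ingestion pattern named in the requirements, here is a minimal producer sketch using the kafka-python client; the broker address, topic, and event fields are placeholders.

```python
# Minimal sketch: producing a JSON event to a Kafka topic with kafka-python.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each guest interaction becomes an event on the stream (hypothetical schema).
producer.send("guest-events", {"guest_id": 7, "event": "add_to_cart", "sku": "TGT-123"})
producer.flush()  # block until buffered messages are delivered
```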
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery
Location: Pan India
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.

Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX).
2: Proven track record of delivering data integration and data warehousing solutions.
3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (No FLEX).
4: Experience with data integration and migration projects; Oracle SQL.

Technical Experience: Google BigQuery
1: Expert in Python (No FLEX). Strong hands-on knowledge of SQL (No FLEX). Python programming using Pandas and NumPy, deep understanding of various data structures (dictionary, array, list, tree etc.), and experience with pytest and code coverage skills.
2: Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX).
3: Proficiency with tools to automate AZDO CI/CD pipelines, such as Control-M, GitHub, JIRA and Confluence.

Professional Attributes:
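As a small illustration of the Pandas and pytest expectations listed above, here is a sketch of a transform function with a unit test that pytest would collect; the function and column names are illustrative.

```python
# Illustrative pandas transform plus a pytest-style unit test.
import pandas as pd


def add_revenue(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy with a revenue column derived from price * quantity."""
    out = df.copy()
    out["revenue"] = out["price"] * out["quantity"]
    return out


def test_add_revenue():
    df = pd.DataFrame({"price": [2.0, 3.0], "quantity": [5, 4]})
    result = add_revenue(df)
    assert result["revenue"].tolist() == [10.0, 12.0]
```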
Posted 1 month ago
8 - 13 years
12 - 20 Lacs
Bengaluru
Hybrid
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create a proof of architecture to test architecture viability, security, and performance.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the implementation of cloud solutions. Optimize cloud infrastructure for performance and cost-efficiency. Troubleshoot and resolve technical issues.

Professional & Technical Skills: Must Have Skills: Proficiency in Google Cloud Platform Architecture. Strong understanding of cloud architecture principles. Experience with DevOps practices. Experience with Google Cloud SQL. Hands-on experience in cloud deployment and management. Knowledge of security best practices in cloud environments.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture. This position is based at our Bengaluru office. A bachelor's degree is required.
Posted 1 month ago
12 - 17 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contribute to key decisions.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead the development and implementation of new features. Conduct code reviews and ensure coding standards are met. Troubleshoot and resolve complex technical issues.

Professional & Technical Skills: Must Have Skills: Proficiency in Google BigQuery. Strong understanding of data modeling and database design. Experience with cloud platforms like Google Cloud Platform. Knowledge of SQL and query optimization techniques. Hands-on experience in developing scalable applications. Good To Have Skills: Experience with data warehousing solutions.

Additional Information: The candidate should have a minimum of 12 years of experience in Google BigQuery. This position is based at our Bengaluru office. 15 years of full-time education is required.
Posted 1 month ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!