4.0 years
0 Lacs
India
On-site
Description

The Position
We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You'll work within the Engineering team to build and improve our platforms, delivering flexible and creative solutions to our utility partners and end users, and helping us achieve our ambitious goals for our business and the planet.

We are seeking a skilled and passionate Data Engineer with expertise in Python to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and processing, and identify the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will coordinate with the rest of the team working on different layers of the infrastructure, so a commitment to collaborative problem solving, sophisticated design, and quality products is important. You will own development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills
You should:
- Be excited to work with talented, committed people in a fast-paced environment.
- Use a data-driven approach and actively work on the product & technology roadmap at both the strategic and day-to-day tactical levels.
- Have proven experience as a Data Engineer with a focus on Python.
- Design, build, and maintain high-performance solutions with reusable, reliable code.
- Use a rigorous approach to product improvement and customer satisfaction.
- Love developing great software as a seasoned product engineer.
- Be ready, able, and willing to jump onto a call with a partner or customer to help solve problems.
- Be able to deliver against several initiatives simultaneously.
- Have a strong eye for detail and code quality.
- Have an agile mindset.
- Have strong problem-solving skills and attention to detail.

Required Skills (Data Engineer)
- You are an experienced developer, ideally with 4 or more years of professional experience.
- Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs.
- Strong proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows.
- Strong experience with SQL for querying and transforming large datasets and optimizing query performance in relational databases.
- Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing.
- Hands-on experience with data pipeline orchestration tools like Apache Airflow or Prefect for workflow automation (see the DAG sketch after this posting).
- Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema).
- Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment.
- Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions.
- Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows.
- Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards.
- Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability.
- Ability to understand business requirements and translate them into technical requirements.
- Expertise in solution design.
- Demonstrable experience writing unit and functional tests.
- Ability to deliver against several initiatives simultaneously as a multiplier.

Required Skills (Python)
- You are an experienced developer with a minimum of 4 years of professional experience.
- Python experience, preferably with both 2.7 and 3.x.
- Strong Python knowledge: familiar with OOP, data structures, and algorithms.
- Work experience and strong proficiency in Python and its associated frameworks (such as Flask and FastAPI).
- Experience designing and implementing scalable microservice architectures.
- Familiarity with RESTful APIs and integration of third-party APIs.
- 2+ years building and managing APIs to industry-accepted RESTful standards.
- Demonstrable experience writing unit and functional tests.
- Application of industry security best practices to application and system development.
- Experience with database systems such as PostgreSQL, MySQL, or MongoDB.

Preferred Skills
The following experiences are not required, but you'll stand out from other applicants if you have any of them, in our order of importance:
- Experience with cloud infrastructure like AWS/GCP or another cloud service provider.
- Serverless architecture, preferably AWS Lambda.
- Solid CI/CD experience.
- You are a Git guru and revel in collaborative workflows.
- You work on the command line confidently and are familiar with all the goodies the Linux toolkit can provide.
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
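A role like this leans heavily on orchestration, so a short illustration may help. Below is a minimal sketch of a daily ETL pipeline using Apache Airflow's TaskFlow API; the DAG name, data shape, and load target are hypothetical placeholders, not Uplight systems.

```python
# Minimal daily ETL DAG sketch using Airflow's TaskFlow API (Airflow 2.4+).
# All names and the sample data are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def energy_usage_etl():
    @task
    def extract() -> list[dict]:
        # In a real pipeline this would pull from an API or object store.
        return [{"meter_id": "m-1", "kwh": 12.4}, {"meter_id": "m-2", "kwh": -1.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop obviously bad readings before loading.
        return [r for r in rows if r["kwh"] >= 0]

    @task
    def load(rows: list[dict]) -> None:
        # Replace with a warehouse client (BigQuery, Postgres, ...).
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

energy_usage_etl()
```

TaskFlow keeps the dependency graph implicit in the function calls, which is why `load(transform(extract()))` is all the wiring the DAG needs.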
Posted 2 weeks ago
0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant P&C (Property & Casualty – Personal and Commercial Insurance)
The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with one or more functional processes – PC, BC, CC (Guidewire/Duckcreek preferred).

LOBs (Lines of Business – Personal and Commercial):
Must have:
- Property
- Auto
- General Liability
Good to have (Casualty Lines):
- Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.
- Inland Marine, Cargo
- Workers Compensation
- Umbrella, Excess Liability

Roles and Responsibilities:
- Worked on multiple business transformation, upgrade, and modernization programs.
- Requirements gathering and elicitation – writing BRDs and FSDs.
- Conducting JAD sessions and workshops to capture requirements, working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design and development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills:
- Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (Guidewire/Duckcreek preferred).
- Strong skills in stakeholder management and communication.
- Should have end-to-end process knowledge in the P&C insurance domain.
- Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours).
- Good organizational and time management skills required.
- Should have good written and verbal communication skills in English.
- Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage.
- Additional experience in Life or other insurance domains is an added advantage.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Site Reliability Engineering (SRE) at Equifax is a discipline that combines software and systems engineering for building and running large-scale, distributed, fault-tolerant systems. SRE ensures that internal and external services meet or exceed reliability and performance expectations while adhering to Equifax engineering principles. SRE is also an engineering approach to building and running production systems – we engineer solutions to operational problems. Our SREs are responsible for overall system operation, and we use a breadth of tools and approaches to solve a broad set of problems. We follow practices such as limiting time spent on operational work, blameless postmortems, and proactive identification and prevention of potential outages. Our SRE culture of diversity, intellectual curiosity, problem solving and openness is key to its success.

Equifax brings together people with a wide variety of backgrounds, experiences and perspectives. We encourage them to collaborate, think big, and take risks in a blame-free environment. We promote self-direction to work on meaningful projects, while we also strive to build an environment that provides the support and mentorship needed to learn, grow and take pride in our work.

What You’ll Do
- Troubleshoot and support the dev teams with their continuous integration and continuous deployment (CI/CD) processes.
- Assist in resolving complex issues arising from product upgrades, installations and configurations.
- Design and improve automation tools that integrate with Docker, Kubernetes, Helm, Terraform, GitHub Actions, and GCP.
- Develop and execute best practices, system hardening and security controls; contribute to solution architectures and strategy.
- Automate system scalability and continually work to improve system resiliency, performance and efficiency.
- Configure monitoring and APM tools such as Datadog, AppDynamics, Grafana and Prometheus.
- Partner with respective departments to develop practical automation solutions and participate in cross-functional team meetings to collaborate and ensure successful execution.
- Diagnose and deploy complex systems that may involve coordination with external teams.
- Maintain internal documentation that fully reflects all activity related to an application and environment, for use by applicable teams.
- Respond to and work incident tickets in ServiceNow regarding items such as service outages, infrastructure issues, zero-day vulnerability patching, etc.
- Design and implement delivery pipelines, including test automation, security, and performance.
- Assist in resolving complex issues arising from product upgrades, installations and configurations.
- Comply with all corporate and departmental privacy and data security policies and practices.
- Influence and design infrastructure, architecture, standards and methods for large-scale systems.
- Support services prior to production via infrastructure design, software platform development, load testing, capacity planning and launch reviews.
- Maintain services during deployment and in production by measuring and monitoring key performance and service level indicators, including availability, latency, and overall system health (see the SLO sketch after this posting).

What Experience You Need
- Bachelor's degree in Computer Science or a related technical field involving coding (e.g., physics or mathematics), or equivalent job experience.
- 5+ years of experience developing and/or administering software in the public cloud.
- 5+ years of experience in languages such as Python, Bash, Java, Go, JavaScript and/or Node.js, or similar skills.
- 5+ years of system administration experience, including automation and orchestration of Linux/Windows using Chef, Puppet, and/or containers (Docker, Kubernetes, etc.), or similar skills.
- Experience with build systems such as GitHub Actions and Jenkins.
- Experience with configuration management tools such as Chef, Ansible, and PowerShell DSC.
- Experience with infrastructure-as-code technologies (Terraform, GCP Deployment Manager).
- Experience with Kubernetes (GKE preferred).

What Could Set You Apart
- Technical knowledge of monitoring tools, Splunk, security controls, and networking (firewalls, ingress/egress routing).
- Experience with GCP, such as autoscaling, Google Cloud Functions, Google Cloud Dataflow, Google Cloud Pub/Sub, and IAM.
- Experience with web servers such as Apache or Nginx.
- Expertise designing, analyzing and troubleshooting large-scale distributed systems.
- A systems problem-solving approach, coupled with strong communication skills and a sense of ownership and drive.
- A passion for automation and a desire to eliminate toil whenever possible.
- You’ve built software or maintained systems in a highly secure, regulated or compliant industry.
- You thrive in, and have experience with and passion for working within, a DevOps culture and as part of a team.
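The SLI/SLO monitoring duties above reduce to simple arithmetic that SREs routinely automate. A hedged sketch, assuming a 99.9% availability target and illustrative request counts (not Equifax service data):

```python
# Compute availability from request counts and check it against an SLO target.
# All numbers are illustrative.
def availability(total_requests: int, failed_requests: int) -> float:
    return 1.0 - failed_requests / total_requests

def error_budget_remaining(slo_target: float, measured: float) -> float:
    """Fraction of the error budget still unspent (negative means burned through)."""
    allowed_failure = 1.0 - slo_target
    actual_failure = 1.0 - measured
    return 1.0 - actual_failure / allowed_failure

measured = availability(total_requests=1_000_000, failed_requests=420)
print(f"availability: {measured:.5f}")                                 # 0.99958
print(f"budget left:  {error_budget_remaining(0.999, measured):.1%}")  # 58.0%
```

In practice the request counts would come from a monitoring backend (Prometheus, Datadog) and the budget check would gate alerts or deploy freezes.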
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities. Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow.

Why we're hiring:
WPP is at the forefront of the marketing and advertising industry's largest transformation. Our Global CIO is leading a significant evolution of our Enterprise Technology capabilities, bringing together over 2,500 technology professionals into an integrated global team. This team will play a crucial role in enabling the ongoing transformation of our agencies and functions.

GroupM is the world’s leading media investment company, responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the results-driven programmatic audience company Xaxis and data and technology company Choreograph. GroupM’s portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media, where advertising works better for people. By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business.

The GroupM IT team in WPP IT is the technology solutions partner for the GroupM group of agencies, accountable for coordinating and assuring end-to-end change delivery, and for managing the GroupM IT technology life cycle and innovation pipeline. This role will work as part of the Business Platform Team for EMEA. You will be part of a new Data team in Chennai that will support our existing and future BI setup for EMEA markets. You will be responsible for delivering the solutions formulated by product owners and key stakeholders for different EMEA markets. In collaboration with the Data development team, you will update the architecture and data models to reflect new data needs and changes in source systems.

What you'll be doing:
- Design, develop, and maintain robust and scalable data pipelines using Google Cloud Platform services (such as Dataflow, Pub/Sub, and Cloud Composer) to extract, transform, and load data from various sources (see the Beam sketch at the end of this posting).
- Communicate technical concepts and benefits of GCP solutions to non-technical stakeholders, aiding them in understanding the changes necessary for project goals.
- Monitor and optimize data processing and storage resources on GCP.
- Troubleshoot and resolve data pipeline issues and performance bottlenecks.
- Document data engineering processes, best practices, and technical specifications.
- Adhere to agile development practices, including evolutionary design, refactoring, continuous integration/delivery, and test-driven development.
- Collaborate with the Business Partner Team and stakeholders during project scoping and feasibility phases as an SME, identifying technical risks and proposing mitigations.
- Provide production support for data load jobs.
- Automate data workflows and create queries for periodic report generation.
- Collaborate with development teams to implement data manipulation queries, ensuring alignment with business requirements.
- Maintain and upgrade existing applications as necessary.
- Participate in key design meetings and provide technical support.
- Perform additional tasks related to data migrations, cloud resource audits, and cross-functional support activities.

What you'll need:
Education: A combination of education and experience that enables the incumbent to meet the fundamental duties and required competencies. A bachelor’s degree in computer science, engineering, mathematics or another technical field is highly preferred.
Personality and working practice: Team player, with good communication skills, analytical thinking, attention to detail and the ability to think about the big picture.

Required experience and knowledge:
- 5+ years’ experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer and Cloud Functions.
- Understanding of GCP security best practices, including IAM roles and service accounts.
- 1–2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.).
- Experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, Cloud Deployment Manager).

Desirable experience and knowledge:
- Knowledge of, and some experience with, dbt.
- Experience designing and implementing data pipelines using tools like Apache Beam, Apache Airflow, or similar.

Languages: Very good English skills; any other language is a plus.

Who you are:
You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working.
You're optimistic: We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected.
You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we deliver the extraordinary every day.

What we'll give you:
- Passionate, inspired people – We aim to create a culture in which people can do extraordinary work.
- Scale and opportunity – We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry.
- Challenging and stimulating work – Unique work and the opportunity to join a group of creative problem solvers.

Are you up for the challenge? We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we’ve adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process.

WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers.
Please read our Privacy Notice (https://www.wpp.com/people/wpp-privacy-policy-for-recruitment) for more information on how we process the information you provide.
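The Dataflow work described in this posting is typically written with Apache Beam. Below is a minimal sketch that runs locally with the DirectRunner and would target Dataflow with `--runner=DataflowRunner`; the sample records and field layout are made up.

```python
# Tiny Beam pipeline: parse "market,spend" records and sum spend per market.
# Runs locally; on Dataflow the Create step would be a real source
# (e.g., beam.io.ReadFromPubSub or ReadFromText).
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create(["gb,100", "de,250", "gb,50"])
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "ToKV" >> beam.Map(lambda kv: (kv[0], int(kv[1])))
        | "SumPerMarket" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)  # replace with beam.io.WriteToBigQuery(...)
    )
```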
Posted 2 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant P&C (Property & Casualty – Personal and Commercial Insurance)
The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with one or more functional processes – PC, BC, CC (Guidewire/Duckcreek preferred).

LOBs (Lines of Business – Personal and Commercial):
Must have:
- Property
- Auto
- General Liability
Good to have (Casualty Lines):
- Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.
- Inland Marine, Cargo
- Workers Compensation
- Umbrella, Excess Liability

Roles and Responsibilities:
- Worked on multiple business transformation, upgrade, and modernization programs.
- Requirements gathering and elicitation – writing BRDs and FSDs.
- Conducting JAD sessions and workshops to capture requirements, working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design and development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills:
- Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (Guidewire/Duckcreek preferred).
- Strong skills in stakeholder management and communication.
- Should have end-to-end process knowledge in the P&C insurance domain.
- Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours).
- Good organizational and time management skills required.
- Should have good written and verbal communication skills in English.
- Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage.
- Additional experience in Life or other insurance domains is an added advantage.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Tata Consultancy Services is hiring Google Data Engineers!

Role: Google Data Engineer
Experience: 3–5 years
Location: Bangalore, Chennai, Hyderabad

Experience
- 3 to 5 years of experience in data engineering, data warehousing, or a related field.
- Experience with dashboarding tools like PLX dashboards and Looker Studio.
- Experience building data pipelines, reports, best practices and frameworks.
- Experience with design and development of scalable and actionable solutions (dashboards, automated collateral, web applications).
- Experience with code refactoring for optimal performance.
- Experience writing and maintaining ETLs which operate on a variety of structured and unstructured sources.
- Familiarity with non-relational data storage systems (NoSQL and distributed database management systems).

Skills
- Strong proficiency in SQL, NoSQL, ETL tools, BigQuery and at least one programming language (e.g., Python, Java).
- Strong understanding of data structures, algorithms, and software design principles.
- Experience with data modeling techniques and methodologies.
- Proficiency in troubleshooting and debugging complex data-related issues.
- Ability to work independently and as part of a team.

Responsibilities
- Data pipeline development: Design, implement, and maintain robust and scalable data pipelines to extract, transform, and load data from various sources into our data warehouse or data lake.
- Data modeling and warehousing: Collaborate with data scientists and analysts to design and implement data models that optimize query performance and support complex analytical workloads.
- Cloud infrastructure: Leverage Google Cloud and other internal storage platforms to build and manage scalable and cost-effective data storage and processing solutions.
- Data quality assurance: Implement data quality checks and monitoring processes to ensure the accuracy, completeness, and consistency of data.
- Performance optimization: Continuously monitor and optimize data pipelines and queries for performance and efficiency.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions that meet their requirements.

Desirable Experience
- Cloud Storage or equivalent cloud platforms.
- Knowledge of BigQuery ingress and egress patterns (see the query sketch below).
- Experience writing Airflow DAGs.
- Knowledge of Pub/Sub, Dataflow, or any declarative data pipeline tools using batch and streaming ingestion.
- Other GCP services: Vertex AI.

Interested candidates can apply!
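Since the role centres on BigQuery, a small example of running a parameterized query from Python with the official client library may be useful; the project, dataset, and table names are made up.

```python
# Run a parameterized BigQuery query with the google-cloud-bigquery client.
# Table and project IDs are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date >= @start
    GROUP BY event_date
    ORDER BY event_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01")]
)
for row in client.query(query, job_config=job_config).result():
    print(row.event_date, row.events)
```

Query parameters keep user-supplied values out of the SQL text itself, which matters for both correctness and security in reporting pipelines.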
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
- Service-oriented architecture and microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context.
- Full-stack development: Knowledge of front-end and back-end technologies (e.g., Angular, React, Node.js), enabling collaboration on data access and visualization layers.
- Database management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data governance and security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
- Problem-solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.

Key Job Responsibilities
- Design and build data pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
- Service-oriented architecture (SOA) and microservices: Design and implement SOA- and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
- Full-stack integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data ingestion and integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
- GCP data solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs (see the Cloud Function sketch below).
- Data governance and security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
- Performance optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and best practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
- Automation and reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Qualifications
B.Tech or M.Tech.
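One concrete shape the GCP responsibilities above can take is a small ingestion microservice: a Pub/Sub-triggered Cloud Function (2nd gen) that decodes a message and appends it to BigQuery. A hedged sketch with placeholder project and table names:

```python
# Pub/Sub-triggered Cloud Function (2nd gen) that writes a JSON event
# into BigQuery. Project and table IDs are illustrative placeholders.
import base64
import json

import functions_framework
from google.cloud import bigquery

bq = bigquery.Client()

@functions_framework.cloud_event
def ingest(cloud_event):
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)
    errors = bq.insert_rows_json("my-project.platform.raw_events", [row])
    if errors:
        # Raising makes the delivery retry, per Pub/Sub's at-least-once semantics.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```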
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Overview
Viraaj HR Solutions is a leading provider of human resources services, dedicated to empowering businesses with top-notch talent acquisition and management solutions. We pride ourselves on our commitment to excellence and our innovative approach to meeting client needs. Our mission is to enhance organizational efficiency and productivity through strategic workforce planning. We value integrity, collaboration, and continuous improvement in everything we do.

Job Title: GCP Data Engineer
Work Mode: On-site
Location: India

Role Responsibilities
- Design and implement scalable data pipelines in Google Cloud Platform (GCP).
- Develop data models and architecture for data warehousing solutions.
- Create ETL (Extract, Transform, Load) processes to streamline data management.
- Optimize data flows and processes for efficiency and performance.
- Collaborate with data scientists and analysts to understand data requirements and design optimal solutions.
- Manage and monitor data ingestion from various sources.
- Conduct data quality checks and troubleshoot issues related to data integrity (see the sketch below).
- Utilize BigQuery for data analysis and reporting tasks.
- Implement data security measures to ensure compliance with regulations.
- Document processes, data models, and reports for reference and training purposes.
- Stay current with emerging technologies and best practices in data engineering.
- Engage in code reviews, troubleshooting, and debugging of data solutions.
- Work closely with the development team to integrate data processes into applications.
- Train and support team members in GCP tools and data engineering principles.
- Prepare and present reports on data insights and performance metrics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering roles.
- Strong experience with Google Cloud Platform, particularly BigQuery and Dataflow.
- Expertise in data modeling and database design.
- Proficient in Python and SQL programming.
- Hands-on experience with ETL tools and methodologies.
- Understanding of data warehousing concepts and architectures.
- Familiarity with cloud architecture and services.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
- Experience with Agile methodologies is a plus.
- Ability to work independently and manage multiple tasks simultaneously.
- Relevant certifications in GCP or data engineering are advantageous.
- Experience with big data technologies such as Hadoop or Spark is a plus.
- Commitment to continuous learning and professional development.

Skills: data security, GCP, data modeling, cloud architecture, ETL, data engineering, BigQuery, Python, Google Cloud Platform, Dataflow, Spark, database design, SQL, data warehousing, Hadoop, Agile methodologies
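The data-quality responsibility above is easy to make concrete. An illustrative pandas check that profiles a batch before loading; the rules and column names are examples only, not Viraaj or client systems:

```python
# Profile a batch of rows for common defects before loading it downstream.
# Columns and thresholds are illustrative.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    return {
        "row_count": len(df),
        "null_ids": int(df["customer_id"].isna().sum()),
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }

df = pd.DataFrame({"customer_id": [1, 2, 2, None], "amount": [10.0, -5.0, 3.5, 8.0]})
report = quality_report(df)
print(report)  # {'row_count': 4, 'null_ids': 1, 'duplicate_ids': 1, 'negative_amounts': 1}
if report["null_ids"]:
    print("reject batch: null customer_id present")
```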
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview
Viraaj HR Solutions is a leading provider of human resources services, dedicated to empowering businesses with top-notch talent acquisition and management solutions. We pride ourselves on our commitment to excellence and our innovative approach to meeting client needs. Our mission is to enhance organizational efficiency and productivity through strategic workforce planning. We value integrity, collaboration, and continuous improvement in everything we do.

Job Title: GCP Data Engineer
Work Mode: On-site
Location: India

Role Responsibilities
- Design and implement scalable data pipelines in Google Cloud Platform (GCP).
- Develop data models and architecture for data warehousing solutions.
- Create ETL (Extract, Transform, Load) processes to streamline data management.
- Optimize data flows and processes for efficiency and performance.
- Collaborate with data scientists and analysts to understand data requirements and design optimal solutions.
- Manage and monitor data ingestion from various sources.
- Conduct data quality checks and troubleshoot issues related to data integrity.
- Utilize BigQuery for data analysis and reporting tasks.
- Implement data security measures to ensure compliance with regulations.
- Document processes, data models, and reports for reference and training purposes.
- Stay current with emerging technologies and best practices in data engineering.
- Engage in code reviews, troubleshooting, and debugging of data solutions.
- Work closely with the development team to integrate data processes into applications.
- Train and support team members in GCP tools and data engineering principles.
- Prepare and present reports on data insights and performance metrics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering roles.
- Strong experience with Google Cloud Platform, particularly BigQuery and Dataflow.
- Expertise in data modeling and database design.
- Proficient in Python and SQL programming.
- Hands-on experience with ETL tools and methodologies.
- Understanding of data warehousing concepts and architectures.
- Familiarity with cloud architecture and services.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
- Experience with Agile methodologies is a plus.
- Ability to work independently and manage multiple tasks simultaneously.
- Relevant certifications in GCP or data engineering are advantageous.
- Experience with big data technologies such as Hadoop or Spark is a plus.
- Commitment to continuous learning and professional development.

Skills: data security, GCP, data modeling, cloud architecture, ETL, data engineering, BigQuery, Python, Google Cloud Platform, Dataflow, Spark, database design, SQL, data warehousing, Hadoop, Agile methodologies
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Exciting opportunity for Data Engineers with 5+ years of experience at Telus Digital! If you are looking for a change and have the skills below, please DM me; I would be happy to refer you.

Skillset required:
- Python
- SQL
- BigQuery
- Composer/Airflow
- Dataflow
- CI/CD
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Chandigarh
Work from Office
Key Responsibilities
- Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.
- Support data ingestion, transformation, and storage processes for structured and unstructured datasets (a Pub/Sub publishing sketch follows this list).
- Participate in performance tuning and optimization of existing data workflows.
- Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery.
- Document code, processes, and architecture for reproducibility and future reference.
- Debug issues in data pipelines and contribute to their resolution.
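To make the ingestion bullet concrete, here is a minimal sketch of publishing an event to Pub/Sub for downstream pipelines to consume; the project and topic IDs are placeholders.

```python
# Publish a JSON event to a Pub/Sub topic for downstream ingestion.
# Project and topic IDs are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "raw-events")

event = {"source": "pos", "amount": 12.5}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())  # blocks until acknowledged
```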
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview
Viraaj HR Solutions is a leading provider of human resources services, dedicated to empowering businesses with top-notch talent acquisition and management solutions. We pride ourselves on our commitment to excellence and our innovative approach to meeting client needs. Our mission is to enhance organizational efficiency and productivity through strategic workforce planning. We value integrity, collaboration, and continuous improvement in everything we do.

Job Title: GCP Data Engineer
Work Mode: On-site
Location: India

Role Responsibilities
- Design and implement scalable data pipelines in Google Cloud Platform (GCP).
- Develop data models and architecture for data warehousing solutions.
- Create ETL (Extract, Transform, Load) processes to streamline data management.
- Optimize data flows and processes for efficiency and performance.
- Collaborate with data scientists and analysts to understand data requirements and design optimal solutions.
- Manage and monitor data ingestion from various sources.
- Conduct data quality checks and troubleshoot issues related to data integrity.
- Utilize BigQuery for data analysis and reporting tasks.
- Implement data security measures to ensure compliance with regulations.
- Document processes, data models, and reports for reference and training purposes.
- Stay current with emerging technologies and best practices in data engineering.
- Engage in code reviews, troubleshooting, and debugging of data solutions.
- Work closely with the development team to integrate data processes into applications.
- Train and support team members in GCP tools and data engineering principles.
- Prepare and present reports on data insights and performance metrics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering roles.
- Strong experience with Google Cloud Platform, particularly BigQuery and Dataflow.
- Expertise in data modeling and database design.
- Proficient in Python and SQL programming.
- Hands-on experience with ETL tools and methodologies.
- Understanding of data warehousing concepts and architectures.
- Familiarity with cloud architecture and services.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
- Experience with Agile methodologies is a plus.
- Ability to work independently and manage multiple tasks simultaneously.
- Relevant certifications in GCP or data engineering are advantageous.
- Experience with big data technologies such as Hadoop or Spark is a plus.
- Commitment to continuous learning and professional development.

Skills: data security, GCP, data modeling, cloud architecture, ETL, data engineering, BigQuery, Python, Google Cloud Platform, Dataflow, Spark, database design, SQL, data warehousing, Hadoop, Agile methodologies
Posted 2 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 10

S&P Global Mobility
The Role: Senior Data Engineer

Department Overview
Automotive Insights at S&P Mobility leverages technology and data science to provide unique insights, forecasts and advisory services spanning every major market and the entire automotive value chain – from product planning to marketing, sales and the aftermarket. We provide the most comprehensive data spanning the entire automotive lifecycle – past, present and future. With over 100 years of history, unmatched credentials, and a larger customer base than any other provider, we are the industry benchmark for clients around the world, helping them make informed decisions to capitalize on opportunity and avoid risk. Our solutions are used by nearly every major OEM, 90% of the top 100 tier-one suppliers, media agencies, governments, insurance companies, and financial stakeholders to provide actionable insights that enable better decisions and better results.

Position Summary
S&P Global is seeking an experienced and driven Senior Data Engineer who is passionate about delivering high-value, high-impact solutions to the world’s most demanding, high-profile clients. The ideal candidate must have at least 5 years of experience developing and deploying data pipelines on Google Cloud Platform (GCP) and should be passionate about building high-quality, reusable pipelines using cutting-edge technologies. This role involves designing, building, and maintaining scalable data pipelines, optimizing workflows, and ensuring data integrity across multiple systems. The candidate will collaborate with data scientists, analysts, and software engineers to develop robust and efficient data solutions.

Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines.
- Optimize and automate data ingestion, transformation, and storage processes (a GCS-to-BigQuery load sketch follows this posting).
- Work with structured and unstructured data sources, ensuring data quality and consistency.
- Develop and maintain data models, warehouses, and databases.
- Collaborate with cross-functional teams to support data-driven decision-making.
- Ensure data security, privacy, and compliance with industry standards.
- Troubleshoot and resolve data-related issues in a timely manner.
- Monitor and improve system performance, reliability, and scalability.
- Stay up to date with emerging data technologies and recommend improvements to our data architecture and engineering practices.

What You Will Need
- Strong programming skills in Python.
- 5+ years of experience in data engineering, ETL development, or a related role.
- Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases.
- Proficiency building data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Cloud Batch, BigQuery, BigTable, Cloud Functions, Cloud Workflows, and Cloud Composer.
- Strong understanding of data modeling, data warehousing, and data governance principles.
- Capable of mentoring junior data engineers and assisting them with technical challenges.
- Familiarity with orchestration tools like Apache Airflow.
- Familiarity with containerization and orchestration (Docker, Kubernetes).
- Experience with version control systems (Git) and CI/CD pipelines.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication skills.
- Hands-on experience with Snowflake is a plus.
- Experience with big data technologies (Hadoop, Spark, Kafka, etc.) is a plus.
- Experience with AWS is a plus.
- Ability to convert business queries into technical documentation.

Education And Experience
- Bachelor’s degree in Computer Science, Information Systems, Information Technology, or a similar major, or a Certified Development Program.
- 5+ years of experience building data pipelines using Python and GCP (Google Cloud Platform).

About Company Statement
S&P Global delivers essential intelligence that powers decision making. We provide the world’s leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you’ll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape.

S&P Global Mobility turns automotive data into invaluable insights that help our clients understand today’s market, reach more customers, and shape the future of automotive mobility.

About S&P Global Mobility
At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.

What’s In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We’re more than 35,000 strong worldwide – so we’re able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you – and your career – need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards – small perks can make a big difference.

For more information on benefits by country, visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 314204
Posted On: 2025-05-07
Location: Gurgaon, Haryana, India
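One common pipeline step behind the GCP requirements above is a batch load from Cloud Storage into BigQuery. A hedged sketch using the official client library; the bucket URI and table IDs are hypothetical, not S&P Global systems:

```python
# Batch-load newline-delimited JSON from GCS into BigQuery.
# URIs and table IDs are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/vehicles/2024-01-01/*.json",
    "my-project.mobility.vehicle_events",
    job_config=job_config,
)
load_job.result()  # blocks until the load job finishes
table = client.get_table("my-project.mobility.vehicle_events")
print(f"table now has {table.num_rows} rows")
```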
Posted 2 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required:
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience with ADF (Azure Data Factory) and Dataflow.
- Experience with big data tools like Delta Lake and Azure Databricks (see the upsert sketch below).
- Experience with Synapse.
- Skills in designing an Azure data solution.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
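The Delta Lake/Databricks requirement above often comes down to idempotent upserts. A hedged PySpark sketch, assuming a Databricks runtime where a `spark` session and the delta library are available; the storage path and key columns are placeholders:

```python
# Upsert (MERGE) a batch of changes into a Delta table on ADLS.
# Assumes a Databricks runtime: `spark` exists and delta is installed.
# Path and column names are illustrative placeholders.
from delta.tables import DeltaTable

target_path = "abfss://lake@myaccount.dfs.core.windows.net/silver/customers"

updates = spark.createDataFrame(
    [(1, "alice@example.com"), (2, "bob@example.com")],
    ["customer_id", "email"],
)

(
    DeltaTable.forPath(spark, target_path)
    .alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # update rows whose key already exists
    .whenNotMatchedInsertAll()  # insert brand-new keys
    .execute()
)
```

Because MERGE is transactional in Delta Lake, rerunning the same batch leaves the table unchanged, which is what makes the load safe to retry.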
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
Key Responsibilities Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. Support data ingestion, transformation, and storage processes for structured and unstructured datasets. Participate in performance tuning and optimization of existing data workflows. Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery. Document code, processes, and architecture for reproducibility and future reference. Debug issues in data pipelines and contribute to their resolution.
Posted 3 weeks ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Dear Candidate,

Greetings from TCS!

TCS is hiring for Application Solution Architect (GCP); please find the JD below.

About the Company
TCS is a leading global IT services, consulting, and business solutions organization that delivers real results to global businesses, ensuring a level of certainty no other firm can match.

About the Role
The Application Solution Architect (GCP) will be responsible for leading cloud transformation programs and ensuring successful application and data migration to Google Cloud Platform.

Responsibilities
- Google Cloud Architect with experience in large-scale cloud transformation programs.
- Hands-on experience in application and data migration to the cloud using Google migration tools.
- Designing and developing Google Cloud-based systems using cloud-native tools.
- Expertise in GCP (GKE, Cloud Dataflow, Cloud Run and Cloud Functions, Cloud Build, BigQuery, IAM, VPC, Cloud DNS).
- Expertise in storage and databases: utilizing GCP storage services such as Google Cloud Storage for object storage, Cloud SQL for relational databases, and BigQuery for data analytics.
- GCP Certified Solution Architect – Professional.

Qualifications
- Experience range: 10 to 15 years.
- Location: PAN India.

Required Skills
- Application Solution Architect
- Microservices
- .NET Architect
- Database design
- Database modelling
- AWS

Preferred Skills
- GCP certification

Pay range and compensation package
Details regarding pay range or salary will be discussed during the interview process.

Equal Opportunity Statement
TCS is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Data Engineer
Location: Chennai (Work From Office – Monday to Friday)
Relevant Experience: 5 years in data engineering
Shift Timing: 1:00 PM – 10:00 PM
CTC: Up to ₹15 LPA
Notice Period: Immediate to 15 days

Key Responsibilities:
- Design and develop secure, scalable, and high-performance data pipelines and data models.
- Lead the end-to-end ETL/ELT lifecycle, including implementation of Slowly Changing Dimension (SCD) Type 2 (a sketch of the pattern follows this posting).
- Collaborate with data scientists, analysts, and engineering teams to define and deliver on data requirements.
- Maintain and optimize cloud-based data infrastructure (AWS, GCP, or Azure).
- Design and implement logical and physical data models using tools like Erwin or MySQL Workbench.
- Promote and implement data governance practices, including data quality checks, lineage tracking, and catalog management.
- Ensure compliance with organizational policies and data privacy regulations.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.

Required Qualifications & Skills:
- Bachelor’s or master’s degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience in a Data Engineering or similar role.
- Strong proficiency in SQL and Python (or equivalent languages like Scala/Java).
- Experience with data orchestration tools such as Apache Airflow, dbt, etc.
- Hands-on expertise in cloud platforms: AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
- Familiarity with big data technologies such as Apache Spark, Kafka, and Hive.
- Solid understanding of data warehousing, data modeling, and performance tuning.
- Strong analytical, problem-solving, and team collaboration skills.
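The SCD Type 2 requirement named above has a standard shape: close out the current version of a changed record and append a new current row. An illustrative pandas sketch with generic column names (in a warehouse this would usually be a SQL MERGE):

```python
# SCD Type 2 in miniature: expire the changed row, append the new version.
# Column names and dates are generic examples.
from datetime import date

import pandas as pd

dim = pd.DataFrame([
    {"customer_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
])
incoming = pd.DataFrame([{"customer_id": 1, "city": "Chennai"}])
today = date(2025, 1, 1)

# Find incoming rows whose attributes differ from the current dimension row.
merged = incoming.merge(dim[dim["is_current"]], on="customer_id", suffixes=("", "_old"))
changed = merged[merged["city"] != merged["city_old"]]

# Expire the old version...
dim.loc[dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"],
        ["valid_to", "is_current"]] = [today, False]
# ...and append the new current version.
new_rows = changed[["customer_id", "city"]].assign(
    valid_from=today, valid_to=None, is_current=True)
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)
```

The same close-and-append logic maps directly onto a warehouse `MERGE` statement, which is how it is typically shipped in BigQuery, Redshift, or Synapse.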
Posted 3 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
This role is responsible for Snacks actuals reporting and forecasting for the UK Foods business.

Responsibilities
Reporting actual performance results:
- Reconciling actuals between SAP and TM1.
- Preparing and presenting Volume, GR & D&A variance analysis to senior stakeholders.
- Presenting the performance narrative with meaningful commentary.

Forecasting of Snacks performance:
- Coordinate with business partners to finalize the forecast stream and overlays.
- Prepare and present the NR Cause of Change.
- Present the performance narrative with meaningful commentary.
- Operate tasks and reserves in TM1.

Planning process:
- Coordinate and work with supply chain and pricing teams to firm up AOP iterations.
- Enable TM1 submissions to the sector.
- Identify and support continuous improvements, including simplifications and process or control remediations.

Management of dashboarding tools:
- Power BI toolkit maintenance.
- User access management.
- Creation of views with business insights.
- Linkage, streaming and management of dataflow into the Tableau dashboard (Cockpit).

Project/analytics support:
- Provide topline commercial analytics support.
- Contribute to ad-hoc analysis to draw impactful business insights.

Qualifications
- CA with 6 years of experience, or MBA (Finance) with 7–8 years of experience.
- Digitally adept – SAP/TM1.
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Bangalore / Chennai

- Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical and physical data modelling.
- Strong understanding of indexing, partitioning and data sharding, with practical experience of having done the same.
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema or Erwin.
- Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery.
- Functional knowledge of the mutual fund industry is a plus.

Role & Responsibilities
- Work with business users and other stakeholders to understand business processes.
- Ability to design and implement dimensional and fact tables (a star-schema sketch follows this posting).
- Identify and implement data transformation/cleansing requirements.
- Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems into the Enterprise Data Warehouse.
- Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
- Design, develop and maintain ETL workflows and mappings using the appropriate data load technique.
- Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
- Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
- Analyze and resolve problems and provide technical assistance as necessary.
- Partner with the BI team to evaluate, design and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
- Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
- Leverage transactional information and data from ERP, CRM and HRIS applications to model, extract and transform into reporting and analytics.
- Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.
- Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers and quality issues; continuously validate reports and dashboards and suggest improvements.
- Train business end-users, IT analysts, and developers.

Required Skills
- Bachelor’s degree in Computer Science or a similar field, or equivalent work experience.
- 5+ years of experience on data warehousing, data engineering or data integration projects.
- Expertise in data warehousing concepts, strategies, and tools.
- Strong SQL background.
- Strong knowledge of relational databases like SQL Server, PostgreSQL and MySQL.
- Strong experience with GCP: Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions and GCS.
- Good to have: knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
- Knowledge of AWS and Azure cloud is a plus.
- Experience with Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
- Experience with integration using APIs, XML, JSON, etc.

Skills: data modeling, OLAP, OLTP, BigQuery and Google Cloud Platform (GCP)
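For the dimensional-modelling items above, a minimal star schema is one fact table keyed to its dimensions. A sketch holding generic DDL in Python strings; the mutual-fund-flavoured table and column names are illustrative only:

```python
# Generic ANSI-ish DDL for a tiny star schema: two dimensions, one fact.
# Names are illustrative, not a client data model.
DDL = {
    "dim_scheme": """
        CREATE TABLE dim_scheme (
            scheme_key   INT PRIMARY KEY,
            scheme_name  VARCHAR(100),
            category     VARCHAR(50)
        )""",
    "dim_date": """
        CREATE TABLE dim_date (
            date_key     INT PRIMARY KEY,
            full_date    DATE,
            fiscal_year  INT
        )""",
    "fact_nav": """
        CREATE TABLE fact_nav (
            scheme_key   INT REFERENCES dim_scheme(scheme_key),
            date_key     INT REFERENCES dim_date(date_key),
            nav          NUMERIC(12, 4),
            aum          NUMERIC(18, 2)
        )""",
}
for name, stmt in DDL.items():
    print(f"-- {name}\n{stmt.strip()}")
```

Keeping measures (nav, aum) in the fact table and descriptive attributes in the dimensions is what makes the schema cheap to join and easy for BI tools to consume.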
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald's: One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
We're seeking a hands-on Platform Engineer to support our enterprise data integration and enablement platform. As a Platform Engineer II, you'll be responsible for designing, maintaining, and optimizing secure and scalable data movement services, such as batch processing, file transfers, and data orchestration. This role is essential to ensuring reliable data flow across systems to power analytics, reporting, and platform services in a cloud-native environment.

Who we're looking for:

Primary Responsibilities:

Hands-On Data Integration Engineering
- Build and maintain data transfer pipelines, file ingestion processes, and batch workflows for internal and external data sources (a short ingestion sketch follows this posting).
- Configure and manage platform components that enable secure, auditable, and resilient data movement.
- Automate routine data processing tasks to improve reliability and reduce manual intervention.

Platform Operations & Monitoring
- Monitor platform services for performance, availability, and failures; respond quickly to disruptions.
- Tune system parameters and job schedules to improve throughput and processing efficiency.
- Implement logging, metrics, and alerting to ensure end-to-end observability of data workflows.

Security, Compliance & Support
- Apply secure protocols and encryption standards to data transfer processes (e.g., SFTP, HTTPS, GCS/AWS).
- Support compliance with internal controls and external regulations (e.g., GDPR, SOC 2, PCI).
- Collaborate with security and infrastructure teams to manage access controls, service patches, and incident response.

Troubleshooting & Documentation
- Investigate and resolve issues related to data processing failures, delays, or quality anomalies.
- Document system workflows, configurations, and troubleshooting runbooks for team use.
- Provide support for platform users and participate in on-call rotations as needed.

Skills:
- 3+ years of hands-on experience in data integration, platform engineering, or infrastructure operations.
- Proficiency in designing and supporting batch and file-based data transfers.
- Python scripting and SQL for diagnostics, data movement, and automation.
- Terraform scripting and deployment of cloud infrastructure services.
- Experience with GCP (preferred) or AWS data analytics services, such as GCP Cloud Storage, BigQuery, Cloud Composer, Pub/Sub, and Dataflow, or AWS S3, Glue, Redshift, Athena, Lambda, EventBridge, and Step Functions.
- Cloud-native storage and compute optimization for data movement and processing.
- Infrastructure-as-code and CI/CD practices (e.g., Terraform, Ansible, Cloud Build, GitHub Actions).
- Strong analytical and debugging skills for troubleshooting issues in distributed, high-volume environments.
- Bachelor's degree in Computer Science, Information Systems, or a related technical field.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.

Additional Information:
McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs.
McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald's Capability Center India Private Limited ("McDonald's in India") is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald's in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald's in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
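As an illustration of the batch file-ingestion work this posting describes, here is a minimal, hedged sketch of a GCS-to-BigQuery CSV load using the documented google-cloud-bigquery load API. The bucket, dataset, and table names are hypothetical.

```python
# A minimal batch-ingestion sketch: load one CSV file from GCS into
# BigQuery with basic logging. All resource names are invented.
import logging
from google.cloud import bigquery

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def load_csv_to_bq(gcs_uri: str, table_id: str) -> None:
    """Load a CSV file from GCS into a BigQuery table, appending rows."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # skip the header row
        autodetect=True,              # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    job.result()  # block until the load job finishes; raises on failure
    log.info("Loaded %s rows into %s", job.output_rows, table_id)

if __name__ == "__main__":
    # Hypothetical names, for illustration only.
    load_csv_to_bq("gs://example-landing/daily/orders.csv", "example_ds.orders")
```

In a production pipeline this step would typically sit behind a scheduler or event trigger, with the failure raised by `job.result()` feeding the alerting described under Platform Operations & Monitoring.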
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug, track, and resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Manage your own project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- A self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (see the sketch below).
- UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
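The differentiator list above names Dataflow/Apache Beam. A minimal Beam pipeline in Python looks like the sketch below (the bucket paths are invented); it runs locally on the DirectRunner and, with the runner flag switched, unchanged on Google Cloud Dataflow.

```python
# A minimal Apache Beam pipeline (word-count shape) as an illustration of
# the Dataflow-ready style of processing the posting refers to.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # DirectRunner by default; pass --runner=DataflowRunner plus project,
    # region, and staging options to execute the same code on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, n: f"{word},{n}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts")
        )

if __name__ == "__main__":
    run()
```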
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
- Design and implement scalable, efficient, and secure data pipelines on GCP, utilizing tools such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage (a minimal DAG sketch follows this posting).
- Collaborate with cross-functional teams (data scientists, analysts, and software engineers) to understand business requirements and deliver actionable data solutions.
- Develop and maintain ETL/ELT processes to ingest, transform, and load data from various sources into GCP-based data warehouses.
- Build and manage data lakes and data marts on GCP to support analytics and business intelligence initiatives.
- Implement automated data quality checks, monitoring, and alerting systems to ensure data integrity.
- Optimize and tune performance for large-scale data processing jobs in BigQuery, Dataflow, and other GCP tools.
- Create and maintain data pipelines to collect, clean, and transform data for analytics and machine learning purposes.
- Ensure data governance and compliance with organizational policies, including data security, privacy, and access controls.
- Stay up to date with new GCP services and features and make recommendations for improvements and new implementations.

Mandatory Skill Sets: GCP, BigQuery, Dataproc
Preferred Skill Sets: GCP, BigQuery, Dataproc, Airflow
Years of Experience Required: 3-7
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 18 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
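To ground the pipeline responsibilities listed in this posting, here is a minimal Airflow DAG sketch of a GCS-to-BigQuery load followed by a SQL transform. The bucket, dataset, and stored-procedure names are hypothetical; the operators are from the Google provider package that ships with Cloud Composer.

```python
# A minimal ELT DAG sketch for Airflow 2.4+ (use schedule_interval on
# older versions). All resource names are invented for illustration.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land the day's raw CSVs from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="raw.sales",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staged rows into the reporting mart inside BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": "CALL analytics.sp_build_sales_mart()",  # hypothetical proc
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```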
Posted 3 weeks ago
9.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Job Title: Google Cloud (GCP) Data Engineer
Location: Hybrid (Bengaluru)
Job Type: Full-Time
Experience Level: Minimum 9 years
Joining: Immediate / 1 week
Client: HSBC
Mandatory Skills: GCS + Google BigQuery + Airflow/Composer + Python

Company Description:
Tech T7 Innovations is a company that provides IT solutions to clients worldwide. The team consists of highly skilled and experienced professionals who are passionate about IT. Tech T7 Innovations offers a wide range of IT services, including software development, web design, cloud computing, cybersecurity, data engineering, data science and machine learning. The company is committed to staying up to date with the latest technologies and best practices to deliver the best solutions to its clients.

Job Summary:
We are looking for a highly experienced GCP Data Engineer with 9+ years in data engineering and a proven track record of designing, building, and optimizing scalable data pipelines and architectures on Google Cloud Platform (GCP). The ideal candidate is hands-on with Google Cloud Storage (GCS), BigQuery (BQ), Apache Airflow, and Python, and is adept at managing complex data workflows and transformations at scale.

Key Responsibilities:
- Design and implement highly scalable, reliable, and secure data pipelines on GCP using GCS, BigQuery, and Airflow.
- Develop robust ETL/ELT processes using Python and integrate with data orchestration tools (e.g., Airflow).
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
- Optimize BigQuery performance through efficient schema design, partitioning, clustering, and query optimization (see the sketch below).
- Manage and maintain data lake and data warehouse environments, ensuring data integrity and availability.
- Automate and monitor data pipelines to ensure consistent and reliable data delivery.
- Contribute to architecture decisions and ensure adherence to data governance and security standards.
- Mentor junior engineers and promote best practices in data engineering and cloud usage.

Must-Have Skills:
- Google Cloud Platform (GCP): in-depth experience with core services like GCS, BigQuery, IAM, and Cloud Functions.
- BigQuery (BQ): expertise in data modeling, performance tuning, and large-scale analytics.
- Google Cloud Storage (GCS): strong understanding of data storage, access patterns, and integration.
- Apache Airflow: experience authoring and managing DAGs for complex workflows.
- Python: proficient in scripting and automation, including working with APIs, data processing libraries (e.g., pandas, PySpark), and custom operators in Airflow.

Preferred Qualifications:
- Experience with CI/CD pipelines for data workflows.
- Exposure to Dataflow, Pub/Sub, or other GCP data services.
- Familiarity with Terraform or Infrastructure as Code (IaC) on GCP.
- Strong problem-solving and communication skills.
- GCP certifications (e.g., Professional Data Engineer) are a plus.

Job Types: Full-time, Permanent
Pay: ₹2,000,000.00 - ₹3,000,000.00 per year
Benefits: Health insurance, Provident Fund, Work from home
Schedule: Day shift
Supplemental Pay: Performance bonus
Experience: GCP: 9 years (Required); GCS: 9 years (Required); Apache Airflow: 9 years (Required); Python: 9 years (Preferred)
Location: Bengaluru, Karnataka (Required)
Work Location: In person
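As a concrete illustration of the partitioning and clustering tuning this posting asks for, here is a minimal sketch using the google-cloud-bigquery client. The table and column names are invented; the point is that partition pruning plus clustering keeps the bytes scanned by date-bounded, customer-filtered queries small.

```python
# A minimal partitioning/clustering sketch; `example_ds.events` and its
# columns are hypothetical names used only for illustration.
from google.cloud import bigquery

client = bigquery.Client()

# Partition by day and cluster by the common filter column so that
# date-bounded, customer-filtered queries scan only the relevant blocks.
client.query("""
CREATE TABLE IF NOT EXISTS example_ds.events (
  event_ts TIMESTAMP NOT NULL,
  customer_id STRING,
  payload JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
""").result()

# This query benefits from both partition pruning and clustering.
job = client.query("""
SELECT customer_id, COUNT(*) AS events
FROM example_ds.events
WHERE DATE(event_ts) BETWEEN '2024-01-01' AND '2024-01-07'
  AND customer_id = 'C-123'
GROUP BY customer_id
""")
rows = job.result()  # wait for completion so job statistics are populated
print("bytes processed:", job.total_bytes_processed)
```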
Posted 3 weeks ago
14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
DevOps Manager
Location: Ahmedabad/Hyderabad
Experience Required: 14+ years total experience, with 4-5 years in managerial roles.

Technical Knowledge and Skills:

Mandatory:
- Cloud: GCP (complete stack, from IAM to GKE)
- CI/CD: end-to-end pipeline ownership (GitHub Actions, Jenkins, Argo CD)
- IaC: Terraform, Helm
- Containers: Docker, Kubernetes
- DevSecOps: Vault, Trivy, OWASP

Nice to Have:
- FinOps exposure for cost optimization
- Familiarity with big data tools (BigQuery, Dataflow)
- Familiarity with Kong, Anthos, Istio

Scope:
- Lead the DevOps team across multiple pods and products
- Define the roadmap for automation, security, and CI/CD
- Ensure operational stability of deployment pipelines

Roles and Responsibilities:
- Architect and guide the implementation of enterprise-grade CI/CD pipelines that support multi-environment deployments, microservices architecture, and zero-downtime delivery practices.
- Oversee Infrastructure-as-Code initiatives to establish consistent and compliant cloud provisioning using Terraform, Helm, and policy-as-code integrations.
- Champion DevSecOps practices by embedding security controls throughout the pipeline, ensuring image scanning, secrets encryption, policy checks, and runtime security enforcement.
- Lead and manage a geographically distributed DevOps team, setting performance expectations, development plans, and engagement strategies.
- Drive cross-functional collaboration with engineering, QA, product, and SRE teams to establish integrated DevOps governance practices.
- Develop a framework for release readiness, rollback automation, change control, and environment reconciliation processes.
- Monitor deployment health, release velocity, lead time to recovery, and infrastructure cost optimization through actionable DevOps metrics dashboards (see the sketch below).
- Serve as the primary point of contact for C-level stakeholders during major infrastructure changes, incident escalations, or audits.
- Own the budgeting and cost management strategy for DevOps tooling, cloud consumption, and external consulting partnerships.
- Identify, evaluate, and onboard emerging DevOps technologies, ensuring team readiness through structured onboarding, POCs, and knowledge sessions.
- Foster a culture of continuous learning, innovation, and ownership, driving internal tech talks, hackathons, and community engagement.
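The metrics-dashboard responsibility above refers to standard delivery metrics. As a small, self-contained illustration (the sample data is invented, not from the posting), the sketch below computes deployment frequency and mean lead time for changes from (commit time, deploy time) pairs.

```python
# A minimal sketch of two common DevOps delivery metrics; the deploy
# records are hypothetical sample data.
from datetime import datetime
from statistics import mean

# Each record: (commit time, deploy time) for one change.
deploys = [
    (datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 15, 0)),
    (datetime(2024, 5, 2, 11, 0), datetime(2024, 5, 3, 10, 0)),
    (datetime(2024, 5, 4, 8, 30), datetime(2024, 5, 4, 12, 0)),
]

# Lead time for changes: commit-to-deploy duration per change.
lead_times = [deployed - committed for committed, deployed in deploys]

# Deployment frequency: deploys per day over the observed window.
window = max(d for _, d in deploys) - min(d for _, d in deploys)

print("deploys per day:", round(len(deploys) / max(window.days, 1), 2))
print("mean lead time (hours):",
      round(mean(lt.total_seconds() for lt in lead_times) / 3600, 1))
```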
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About The Job
Developers at Vendasta work in teams, collaborating with Product Managers and Designers on the creation of new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other. The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer's problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.

Your Impact
- Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work on them to completion.
- Follow best practices to write clean, maintainable, scalable, and tested software.
- Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
- Provide input on a technical roadmap for the product area; ensure that NFRs and technical debt get their due focus.
- Work collaboratively with Product Managers to design solutions (including a technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
- Analyze and improve current system integrations and migration strategies.
- Interact and collaborate with our high-quality technical team across India and Canada.

What You Bring to the Table
- 8+ years of experience in a related field, with at least 3+ years as a full-stack developer in an architect or senior development role.
- Experience with, or a strong understanding of, highly scalable, data-intensive, distributed Internet applications.
- Software development experience, including building distributed, microservice-style and cloud-based application architectures.
- Proficiency in a modern software language, and willingness to quickly learn our technology stack. Preference will be given to candidates with strong Go experience who can demonstrate the ability to build and adapt web applications using Angular.
- Experience designing, building, and implementing cloud-native architectures (GCP preferred).
- Experience working with the Scrum framework.

Technologies We Use
- Cloud-native computing on Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies (a minimal Pub/Sub consumer sketch follows this posting).
- Go, TypeScript, Python, JavaScript, HTML, Angular, gRPC, Kubernetes
- Elasticsearch, MySQL, PostgreSQL

About Vendasta
So what do we do? We create an entire platform full of digital products and solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing, and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business. We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!
Perks
- Stock options (as per policy)
- Benefits: health insurance, paid time off, public transport reimbursement, flex days
- Training and career development: professional development plans, leadership workshops, mentorship programs, and more
- Free snacks, hot beverages, and catered lunches on Fridays
- Culture comprised of our core values: Drive, Innovation, Respect, and Agility
- Provident Fund
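Several items in the stack above centre on Cloud Pub/Sub. The following is a minimal consumer sketch in Python (the team's services lean toward Go, but the pattern is the same), with a hypothetical project and subscription name.

```python
# A minimal Cloud Pub/Sub streaming-pull consumer sketch; the project and
# subscription names are invented for illustration.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

def main():
    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path("example-project", "orders-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        print("received:", message.data.decode("utf-8"))
        message.ack()  # acknowledge so Pub/Sub does not redeliver

    future = subscriber.subscribe(subscription, callback=callback)
    try:
        future.result(timeout=30)  # listen for 30 seconds, then stop
    except TimeoutError:
        future.cancel()

if __name__ == "__main__":
    main()
```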
Posted 3 weeks ago
The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.
Major tech hubs such as Bengaluru, Chennai, Hyderabad, and Pune are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.
The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.
In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.
In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!