3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
Engineering degree or equivalent experience
3+ years of ETL experience using SQL Server Integration Services (SSIS) within Visual Studio
3+ years of advanced SQL Server experience; certification in Microsoft SQL Server a plus
Experience using the GitHub platform for version control
Experience using the GitHub Copilot AI-powered code assistant
Experience developing within an agile (i.e., Scrum or Kanban) framework
Healthcare experience

API Experience
Proficiency in API design and architecture
Understanding of RESTful principles and GraphQL
Expertise in programming languages (e.g., JavaScript, Python, Java)
Knowledge of API security best practices
Proficiency with API documentation tools (e.g., Swagger/OpenAPI)

Google Cloud Platform Experience
Utilizing Google Cloud Dataflow and Google Cloud Dataproc with SQL or Python in Jupyter Notebooks to load data into BigQuery and Google Cloud Storage
Implementing data processing jobs and managing data within BigQuery
Creating dashboards and visualizations for business users using Google Data Studio and Looker
Utilizing Google AI Platform to build and deploy machine learning models
Expertise in cloud data migration, security and administration
Migrating SQL Server databases from on-premises to Google Cloud SQL, Google Cloud Spanner, and/or SQL Server on Google Compute Engine
Managing database migration projects from on-premises to Google Cloud
BI report development
Solid soft skills (e.g., communication, interpersonal, collaborative, resilient)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
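The Dataflow-to-BigQuery loading pattern named in the qualifications above can be illustrated with a short Apache Beam pipeline in Python. This is a hedged sketch, not Optum's actual code: the project, bucket, table, and schema names are all hypothetical.

```python
# Minimal Beam sketch: read JSON lines from GCS and load them into BigQuery.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",                   # or "DirectRunner" for local testing
    project="my-gcp-project",                  # hypothetical project id
    region="us-central1",
    temp_location="gs://my-temp-bucket/tmp",   # hypothetical bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadJsonLines" >> beam.io.ReadFromText("gs://my-source-bucket/claims/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.claims",  # hypothetical table
            schema="member_id:STRING,amount:FLOAT,service_date:DATE",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```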
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview
Leading AI-driven Global Supply Chain Solutions Software Product Company and one of Glassdoor's "Best Places to Work". Seeking an astute individual with a strong technical foundation, the ability to be hands-on with the broader engineering team as part of the development/deployment cycle, deep knowledge of industry best practices, and Data Science and Machine Learning experience, with the ability to apply them working with both the platform and the product teams.

Scope
Our machine learning platform ingests data in real time, processes information from millions of retail items to serve deep learning models, and produces billions of predictions on a daily basis. The Blue Yonder Data Science and Machine Learning team works closely with sales, product and engineering teams to design and implement the next generation of retail solutions. Data Science team members are tasked with turning both small, sparse and massive data into actionable insights with measurable improvements to the customer bottom line.

Our Current Technical Environment
Software: Python 3.*
Frameworks/Others: TensorFlow, PyTorch, BigQuery/Snowflake, Apache Beam, Kubeflow, Apache Flink/Dataflow, Kubernetes, Kafka, Pub/Sub, TFX, Apache Spark, and Flask.
Application Architecture: Scalable, resilient, reactive, event-driven, secure multi-tenant microservices architecture.
Cloud: Azure

What We Are Looking For
Bachelor's Degree in Computer Science or related fields; graduate degree preferred.
Solid understanding of data science and deep learning foundations.
Proficient in Python programming with a solid understanding of data structures.
Experience working with most of the following frameworks and libraries: Pandas, NumPy, Keras, TensorFlow, Jupyter, Matplotlib, etc.
Expertise in any database query language, SQL preferred.
Familiarity with Big Data technologies such as Snowflake, Apache Beam/Spark/Flink, and Databricks.
Solid experience with any of the major cloud platforms, preferably Azure and/or GCP (Google Cloud Platform).
Reasonable knowledge of modern software development tools and respective best practices, such as Git, Jenkins, Docker, Jira, etc.
Familiarity with deep learning, NLP, reinforcement learning, combinatorial optimization, etc.
Provable experience guiding junior data scientists in an official or unofficial setting.
Desired knowledge of Kafka, Redis, Cassandra, etc.

What You Will Do
As a Senior Data Scientist, you serve as a specialist who supports the team with the following responsibilities.
Independently, or alongside junior scientists, implement machine learning models by:
Procuring data from platform, client, and public data sources.
Implementing data enrichment and cleansing routines.
Implementing features, preparing modelling data sets, feature selection, etc.
Evaluating candidate models, selecting, and reporting on test performance of the final one.
Ensuring proper runtime deployment of models.
Implementing runtime monitoring of model inputs and performance in order to ensure continued model stability.
Work with product, sales and engineering teams to help shape the final solution.
Use data to understand patterns, come up with and test hypotheses; iterate.
Help prepare sales materials, estimate hardware requirements, etc.
Attend client meetings, online and onsite, to discuss new and current functionality.

Our Values
If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success – and the success of our customers.
Does your heart beat like ours? Find out here: Core Values. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
Posted 3 weeks ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
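The "complex stateful processing of events with partitioning" requirement above maps directly onto Apache Beam's stateful DoFn API: Beam partitions state by key, so each key keeps its own state across events. A minimal, hypothetical sketch (names invented) that maintains a per-key running count:

```python
import apache_beam as beam
from apache_beam.coders import VarIntCoder
from apache_beam.transforms.userstate import ReadModifyWriteStateSpec


class CountPerKey(beam.DoFn):
    """Stateful DoFn: state is partitioned per key, so each key carries
    its own running event count across elements."""
    COUNT = ReadModifyWriteStateSpec("count", VarIntCoder())

    def process(self, element, count=beam.DoFn.StateParam(COUNT)):
        key, _ = element                      # stateful DoFns require keyed input
        current = (count.read() or 0) + 1     # read previous count for this key
        count.write(current)                  # persist the updated count
        yield key, current


with beam.Pipeline() as p:  # DirectRunner; Dataflow in production
    (
        p
        | beam.Create([("store-1", "sale"), ("store-2", "sale"), ("store-1", "sale")])
        | beam.ParDo(CountPerKey())
        | beam.Map(print)
    )
```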
Posted 3 weeks ago
5.0 years
0 Lacs
Delhi, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are looking for an experienced Integration Technical Lead with over 10 years of in-depth experience in Oracle Fusion Middleware technologies such as SOA Suite, Oracle Service Bus (OSB), and Oracle Data Integrator (ODI). The candidate will be responsible for leading integration initiatives including custom development, platform customization, and day-to-day operational support. A strong interest in Google Cloud Platform (GCP) is highly desirable, with clear opportunities for training and skill development.

ShyftLabs is a growing data product company founded in early 2020 that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:
1. Integration Leadership & Development:
Lead end-to-end integration design and development across on-premise and cloud systems using Oracle SOA, OSB, and ODI
Drive new integration projects, from requirements gathering through to deployment and support
Develop, customize, and maintain reusable integration components and templates
Translate complex business processes into scalable, secure, and performant integration solutions
2. Platform Customization & Optimization:
Customize Oracle Fusion Middleware components to meet specific business needs and performance objectives
Evaluate existing integrations and enhance them for greater efficiency and lower latency
Implement best practices in integration design, error handling, and performance tuning
3. Operational Excellence & Support:
Own the operational stability of integration platforms, including monitoring, incident resolution, and root cause analysis
Manage daily operations such as deployments, patches, backups, and performance reviews
Collaborate with IT support teams to maintain integration SLAs, uptime, and reliability
4. Cloud Integration & GCP Adoption:
Contribute to the design of hybrid and cloud-native integration architectures using GCP
Learn and eventually implement integration patterns using tools like Apigee, Pub/Sub, Cloud Functions, and Dataflow
Participate in GCP migration initiatives for legacy integration assets

Basic Qualifications:
10+ years of hands-on experience with Oracle SOA Suite, OSB, and ODI in enterprise environments
Expertise in web services (REST/SOAP), XML, XSD, XSLT, XPath, and service orchestration
Strong skills in platform customization, new integration development, integration monitoring, alerting, troubleshooting processes, and long-term system maintenance
Experience with performance optimization, fault tolerance, and secure integrations
Excellent communication and team leadership skills

Preferred Qualifications:
Exposure to Google Cloud Platform (GCP) or strong interest and ability to learn
Familiarity with GCP services for integration (Pub/Sub, Cloud Storage, Cloud Functions)
Understanding of containerized deployments using Docker and Kubernetes
Experience with DevOps tools and CI/CD pipelines for integration delivery

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
Posted 3 weeks ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Total Experience: 5 to 8 years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days
Must have:
· 5+ years of experience in data engineering technology and tools.
· Experience with Java/Scala-based implementations for enterprise-wide platforms preferred.
· Experience with the Apache Beam, Google Dataflow, and Apache Kafka real-time stream processing technology stack.
· Complex stateful processing of events with partitioning for higher throughput.
· Experience fine-tuning throughput and improving the performance of data pipelines.
· Experience with analytical data stores: optimization, querying, and management.
· Experience with alternative data engineering tools (Apache Flink, Apache Spark, etc.).
· Automated CI/CD and operational concerns on the engineering platforms.
· Interpreting problems from a functional context and transforming them into technology solutions.
Posted 3 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
GCP Data Engineer – On-Premises to Cloud SQL Migration
Experience: 5-8 years
Location: Noida
Notice period: Immediate / serving
Work Mode: WFO/Remote/Hybrid (depends on client's ask)
Budget: open

DATA ENGINEER IV (5-8 years of experience)
GCP Data Engineer - On-Premises to Cloud SQL Migration

Job Description:
As a Data Engineer with a focus on migrating on-premises databases to Google Cloud SQL, you will play a critical role in solving complex problems and creating value for our business by ensuring reliable, scalable, and efficient data migration processes. You will be responsible for architecting, designing, and implementing custom pipelines on the GCP stack to facilitate seamless migration.

Required Skills:
• 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
• Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
• Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
• Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
• Proficiency in coding with scripting languages (shell scripting, Python, SQL).
• Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
• Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
• Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications:
• Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
• GCP Certified Data Engineer (preferred).
• Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
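For illustration only, here is a chunked table copy of the kind such a migration pipeline might perform, moving one table from an on-premises SQL Server into Cloud SQL for PostgreSQL. The connection strings, table name, and chunk size are hypothetical, and a real migration would typically lean on Database Migration Service (DMS) plus retries, checkpointing, and validation rather than a bare script:

```python
# Hypothetical chunked copy: on-prem SQL Server -> Cloud SQL (PostgreSQL).
import pandas as pd
import sqlalchemy

SOURCE_URL = "mssql+pyodbc://user:pass@onprem-host/salesdb?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_URL = "postgresql+psycopg2://user:pass@127.0.0.1:5432/salesdb"  # via Cloud SQL Auth Proxy

source = sqlalchemy.create_engine(SOURCE_URL)
target = sqlalchemy.create_engine(TARGET_URL)

# Stream the source table in chunks so large tables never sit fully in memory.
for chunk in pd.read_sql("SELECT * FROM orders", source, chunksize=50_000):
    chunk.to_sql("orders", target, if_exists="append", index=False)
    print(f"copied {len(chunk)} rows")
```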
Posted 3 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Required Experience: 4+ years in IT, with at least 3+ years in a Data Engineer role

Responsibilities:
Proficient in GCP BigQuery
- Dataset creation – schemas, tables, materialized views
- Proficiency in data processing from Pub/Sub to BigQuery to Analytics Hub
- Designing data schemas to align with BigQuery-native structures
- Optimization and testing for production-level loads
- Publishing transaction data into BigQuery

Required Skills: GCP, BigQuery, Dataflow, SQL
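The Pub/Sub-to-BigQuery step named above can be sketched with the Google Cloud client libraries. This is a hedged, minimal illustration — the project, subscription, and table ids are hypothetical, and a production pipeline would more likely run as a Dataflow job with batching and dead-lettering:

```python
# Hypothetical sketch: pull JSON messages from Pub/Sub, stream into BigQuery.
import json
from concurrent import futures

from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client()
TABLE_ID = "my-project.payments.transactions"  # hypothetical table

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "transactions-sub")  # hypothetical


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)                 # one JSON event per message
    errors = bq.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()


streaming_pull = subscriber.subscribe(subscription, callback=callback)
try:
    streaming_pull.result(timeout=60)  # block the main thread while messages flow
except futures.TimeoutError:
    streaming_pull.cancel()
```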
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Description
Join GlobalLogic to be a valid part of the team working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules, e.g. to the automotive, healthcare and logistics industries. Through our engagement, we contribute to our customer in developing the end-user modules' firmware, implementing new features, maintaining compatibility with the newest telecommunication and industry standards, as well as performing analysis and estimations of the customer requirements.

Requirements
BA / BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience.
Experience in Cloud SQL and Cloud Bigtable
Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume).
Experience working with technical customers.
Experience in writing software in one or more languages such as Java, Python
6-10 years of relevant consulting, industry or technology experience
Strong problem solving and troubleshooting skills
Strong communicator

Job responsibilities
Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments.
Experience in technical consulting.
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
Working knowledge of ITIL and/or agile methodologies
Google Data Engineer certified

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility.
With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 3 weeks ago
6.0 years
0 Lacs
India
Remote
Role: GCP Data Engineer
Experience: 6+ years
Type: Contract
Duration: 6 months
Location: Remote
Time zone: IST shift

Job Description:
We are looking for a skilled GCP Data Engineer with strong expertise in SQL and Python coding. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, and will be responsible for designing, building, and optimizing data pipelines and analytics solutions.

Key Skills:
Strong proficiency in SQL and Python
Experience with GCP services, especially BigQuery
Data pipeline development and ETL processes
Good understanding of data warehousing and data modeling

Nice to Have:
Experience with Cloud Functions, Dataflow, or Pub/Sub
Exposure to CI/CD and DevOps practices on GCP
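As a flavor of the SQL-plus-Python BigQuery work this role centers on, here is a minimal query sketch using the official google-cloud-bigquery client. The project, dataset, and column names are hypothetical:

```python
# Hypothetical sketch: run an aggregation in BigQuery and read rows in Python.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():  # blocks until the job finishes
    print(row.order_date, row.revenue)
```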
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Lead / Sr Manager, Platform Engineer
As the Data Platform Lead / Sr Manager, Platform Engineer, you will be responsible for leading the design, scalability, and reliability of the enterprise data analytics platform. You will drive secure, efficient, and resilient data transfer and platform enablement services across a global organization. This role is pivotal in ensuring that analytics-ready data is accessible, governed, and delivered at scale to support decision-making, reporting, and advanced data products—particularly in high-volume, fast-paced environments like retail or QSR.

Who we’re looking for:

Primary Responsibilities:

Platform Strategy & Operations
Architect and manage scalable batch processing and data transfer pipelines to serve enterprise-wide analytics use cases.
Continuously monitor platform health and optimize for performance, cost efficiency, and uptime.
Implement platform observability, diagnostics, and incident response mechanisms to maintain service excellence.

Security, Compliance & Governance
Ensure secure handling of data across ingestion, transfer, and processing stages, adhering to enterprise and regulatory standards.
Establish protocols for secure, compliant, and auditable data movement and transformation.

Enablement & Support
Provide Level 2/3 technical support for platform services, minimizing disruption and accelerating issue resolution.
Drive user enablement by leading documentation efforts, publishing platform standards, and hosting training sessions.
Collaborate with key business stakeholders to improve platform adoption and usability.

Collaboration & Delivery
Partner with product, engineering, and analytics teams to support data initiatives across domains and markets.
Ensure the platform supports reliable analytics workflows through automation, integration, and governed data access.
Oversee platform releases, upgrades, and maintenance activities to ensure minimal downtime and seamless user experience.

Skills:
8+ years of experience in platform engineering, data infrastructure, or analytics technology environments.
Deep expertise in:
Batch processing and data orchestration tools (e.g., Airflow, Dataflow, Composer)
Secure data transfer protocols (e.g., SFTP, API-based, event streaming)
Advanced SQL for diagnostics and analytics enablement
Python for automation, scripting, and platform utilities
Experience with cloud-native platforms (preferably GCP or AWS), including infrastructure-as-code (leveraging Terraform and Ansible) and DevOps tooling.
Proven leadership of cross-functional technical teams delivering high-scale, high-availability platform solutions.
Excellent collaboration, communication, and stakeholder engagement skills.
Bachelor's or Master’s degree in Computer Science, Information Systems, or a related technical field.
GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
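As a purely illustrative example of the batch orchestration plus secure transfer stack listed under Skills, here is a minimal Airflow DAG that pulls a partner file over SFTP and lands it in Cloud Storage. The connection id, paths, and bucket are hypothetical, not McDonald's actual configuration:

```python
# Hypothetical Composer/Airflow DAG: daily SFTP pull landed in a dated GCS prefix.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.sftp.hooks.sftp import SFTPHook
from google.cloud import storage


def sftp_to_gcs(**context):
    # Pull the partner file over SFTP, then upload it under the run date.
    hook = SFTPHook(ssh_conn_id="partner_sftp")  # hypothetical Airflow connection
    local_path = "/tmp/daily_feed.csv"
    hook.retrieve_file("/outbound/daily_feed.csv", local_path)
    storage.Client().bucket("my-landing-bucket").blob(
        f"feeds/{context['ds']}/daily_feed.csv"
    ).upload_from_filename(local_path)


with DAG(
    dag_id="partner_feed_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # daily at 02:00 (Airflow 2.4+ `schedule` argument)
    catchup=False,
) as dag:
    PythonOperator(task_id="sftp_to_gcs", python_callable=sftp_to_gcs)
```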
Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
We’re seeking a hands-on Platform Engineer to support our enterprise data integration and enablement platform. As a Platform Engineer III, you’ll be responsible for designing, maintaining, and optimizing secure and scalable data movement services—such as batch processing, file transfers, and data orchestration. This role is essential to ensuring reliable data flow across systems to power analytics, reporting, and platform services in a cloud-native environment.

Who we’re looking for:

Primary Responsibilities:

Hands-On Data Integration Engineering
Build and maintain data transfer pipelines, file ingestion processes, and batch workflows for internal and external data sources.
Configure and manage platform components that enable secure, auditable, and resilient data movement.
Automate routine data processing tasks to improve reliability and reduce manual intervention.

Platform Operations & Monitoring
Monitor platform services for performance, availability, and failures; respond quickly to disruptions.
Tune system parameters and job schedules to improve throughput and processing efficiency.
Implement logging, metrics, and alerting to ensure end-to-end observability of data workflows.

Security, Compliance & Support
Apply secure protocols and encryption standards to data transfer processes (e.g., SFTP, HTTPS, GCS/AWS).
Support compliance with internal controls and external regulations (e.g., GDPR, SOC2, PCI).
Collaborate with security and infrastructure teams to manage access controls, service patches, and incident response.

Troubleshooting & Documentation
Investigate and resolve issues related to data processing failures, delays, or quality anomalies.
Document system workflows, configurations, and troubleshooting runbooks for team use.
Provide support for platform users and participate in on-call rotations as needed.

Skills:
5+ years of hands-on experience in data integration, platform engineering, or infrastructure operations.
Proficiency in:
Designing and supporting batch and file-based data transfers
Python scripting and SQL for diagnostics, data movement, and automation
Terraform scripting and deployment of cloud infrastructure services
Working with GCP (preferred) or AWS data analytics services, such as:
GCP: Cloud Storage, BigQuery, Cloud Composer, Pub/Sub, Dataflow
AWS: S3, Glue, Redshift, Athena, Lambda, EventBridge, Step Functions
Cloud-native storage and compute optimization for data movement and processing
Infrastructure-as-code and CI/CD practices (e.g., Terraform, Ansible, Cloud Build, GitHub Actions)
Strong analytical and debugging skills for troubleshooting issues in distributed, high-volume environments.
Bachelor's degree in Computer Science, Information Systems, or a related technical field.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Platform Engineering & Enablement Services Lead / Sr Manager, Platform Engineer
As the Platform Engineering & Enablement Services Lead / Sr. Manager, you will be responsible for driving the reliability, performance, and enablement of platform services that support secure data operations and analytics workflows across the enterprise. You will lead a team that designs, builds, and supports the infrastructure and tools necessary for seamless data movement, integration, and user enablement—ensuring agility, compliance, and operational excellence at scale.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Engineering
Design, implement, and maintain high-performing batch processing and file transfer systems across the enterprise.
Monitor and tune platform performance to ensure availability, reliability, and cost-effectiveness.
Partner with IT and infrastructure teams to plan and coordinate system updates, patching, and platform upgrades.

Enablement & User Support
Develop platform enablement services including onboarding, documentation, knowledge bases, and office hours.
Facilitate training sessions and create self-service models to improve platform usability and adoption.
Provide Tier 2/3 support for platform-related issues and coordinate escalations where needed.

Security & Governance
Ensure platform services meet internal and external security, privacy, and compliance standards (e.g., SOC2, PCI, GDPR).
Manage secure data transfer protocols and implement robust access control mechanisms.

Cross-Functional Collaboration
Act as a liaison between platform engineering, data analytics, architecture, and security teams to ensure alignment.
Drive platform automation, CI/CD enablement, and developer productivity initiatives across teams.

Documentation & Standards
Maintain comprehensive documentation of platform configurations, best practices, and incident runbooks.
Standardize engineering processes to support consistency, observability, and ease of scaling across environments.

Skills:
8+ years of experience in platform engineering, infrastructure operations, or enterprise systems enablement.
Strong technical expertise in:
Batch processing frameworks and orchestration tools (e.g., Airflow, Composer, Dataflow)
Infrastructure scripting for cloud services (e.g., Terraform)
Secure data movement protocols (e.g., SFTP, APIs, messaging queues)
Infrastructure engineering, networking, and system integration
Proficiency in SQL and Python for troubleshooting, scripting, and platform automation
Demonstrated experience in designing and supporting platform services in cloud-native environments (GCP or AWS preferred).
Proven leadership in managing technical teams and platform support functions.
Excellent communication, cross-team collaboration, and stakeholder management skills.
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Support / Platform Engineer III
As a Data Platform Support Engineer, you will design, implement, and maintain scalable and secure platform services that enable efficient data transfers, API integrations, and platform operations across the enterprise. You will play a critical role in ensuring the performance, reliability, and security of data pipelines and platform services that power decision-making, reporting, and advanced data products—particularly in high-volume, fast-paced global environments.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Support:
Design, implement, and maintain secure and efficient platform solutions to support APIs, data transfers, and analytics workflows.
Monitor platform health, optimize for performance and uptime, and implement diagnostic and observability practices.
Provide Level 2/3 technical support for platform services to minimize disruption and accelerate issue resolution.
Support platform upgrades, releases, and maintenance activities to ensure seamless user experiences.

Security, Compliance & Governance:
Ensure secure handling of data across transfer and integration processes, adhering to enterprise security and compliance standards.
Implement and manage data transfer protocols (e.g., SFTP, API-based integrations) in a secure and auditable manner.

Enablement & Documentation:
Contribute to platform documentation, standards, and usage guidelines to drive user enablement and platform adoption.
Support training sessions and user onboarding for platform services and tools.

Collaboration & Delivery:
Partner with engineering, product, and analytics teams to support platform initiatives and enhance system integration.
Collaborate on continuous improvement initiatives to support analytics-ready data delivery across domains and markets.

Skills:
5+ years of experience in platform engineering, data infrastructure, or API/data transfer operations.
Deep expertise in:
Data orchestration and batch processing tools (e.g., Airflow, Dataflow, Composer preferred).
Secure data transfer protocols and integration patterns (e.g., SFTP, APIs, event-driven transfers).
SQL for platform diagnostics and performance analytics.
Python for automation, scripting, and platform utilities.
Hands-on experience with cloud-native environments (preferably GCP or AWS), infrastructure-as-code (e.g., Terraform, Ansible), and DevOps practices.
Strong problem-solving and analytical skills with a focus on operational excellence.
Demonstrated ability to collaborate across cross-functional technical teams.
Strong communication and stakeholder engagement skills.
Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Your contributions to the organisation's growth:
Maintain & develop data platforms based on Microsoft Fabric for Business Intelligence & Databricks for real-time data analytics.
Design, implement and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud.
Develop an enterprise-scale cloud-based Data Lake for business intelligence solutions.
Translate business and customer needs into data collection, preparation and processing requirements.
Optimize the performance of algorithms developed by Data Scientists.
General administration and monitoring of the data platforms.

Competencies:
Working with structured & unstructured data.
Experienced in various database technologies (RDBMS, OLAP, Timeseries, etc.).
Solid programming skills (Python, SQL; Scala is a plus).
Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark).
Proficient in Power BI.
Experienced working with APIs.
Proficient in security best practices.
Data-centered Azure know-how is a plus (Storage, Networking, Security, Billing).

Expertise you have to bring in:
Bachelor's or Master's degree in business informatics, computer science, or equivalent.
A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable.
Extensive experience in handling large data sets.
At least 5 years of experience working as a data engineer, preferably in an industrial company.
Analytical problem-solving skills and the ability to assimilate complex information.
Programming experience in modern data-oriented languages (SQL, Python).
Experience with Apache Spark and DevOps.
Proven ability to synthesize complex data; advanced technical skills related to data modelling, data mining, database design and performance tuning.
English language proficiency.

Special requirements:
High quality mindset paired with strong customer orientation, critical thinking, and attention to detail.
Understanding of data processing at scale.
Influence without authority.
Willingness to acquire additional system/technical knowledge as needed.
Problem solver.
Experience working in an international organization and in multi-cultural teams.
Proactive, creative and innovative.
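To make the pipeline work concrete, here is a small, hypothetical PySpark cleansing step of the kind a Databricks-based platform like this might standardize. The lake paths and column names are invented for illustration:

```python
# Hypothetical PySpark cleansing step: raw SAP orders -> curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap_orders_cleansing").getOrCreate()

raw = spark.read.parquet("abfss://lake@account.dfs.core.windows.net/raw/sap/orders")

cleansed = (
    raw.dropDuplicates(["order_id"])                            # one row per order
    .withColumn("order_date", F.to_date("order_date", "yyyyMMdd"))  # normalize dates
    .filter(F.col("amount") > 0)                                # drop invalid amounts
)

cleansed.write.mode("overwrite").format("delta").save(
    "abfss://lake@account.dfs.core.windows.net/curated/sap/orders"
)
```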
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About the Job:
Developers at Vendasta work in teams, working with Product Managers and Designers in the creation of new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other. The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer's problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.

Roles & Responsibilities:
Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work on them to completion.
Follow best practices to write clean, maintainable, scalable, and tested software.
Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
Provide inputs for a technical roadmap for the Product Area.
Ensure that the NFRs and technical debt get their due focus.
Work collaboratively with Product Managers to design solutions (including a technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
Analyze and improve current system integrations and migration strategies.
Interact and collaborate with our high-quality technical team across India and Canada.

Qualifications:
8+ years experience in a related field, with at least 5+ years as a full stack developer in an architect or senior development role.
Experience with, or a strong understanding of, highly scalable, data-intensive, distributed Internet applications.
Software development experience including building distributed, microservice-style and cloud-based application architectures.
Proficiency in a modern software language, and willingness to quickly learn our technology stack.
Preference will be given to candidates with strong Go (programming language) experience who can demonstrate the ability to build and adapt web applications using Angular.
Experience in designing, building and implementing cloud-native architectures (GCP preferred).
Experience working with the Scrum framework.

Technologies We Use:
Cloud Native Computing using Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies.
GoLang, TypeScript, Python, JavaScript, HTML, Angular, gRPC, Kubernetes
Elasticsearch, MySQL, PostgreSQL

About Vendasta:
So what do we do? We create an entire platform full of digital products & solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing … and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business. We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!
Perks:
Stock options (as per policy)
Benefits - health insurance, paid time off, public transport reimbursement, flex days
Training & Career Development - professional development plans, leadership workshops, mentorship programs, and more!
Free snacks, hot beverages, and catered lunches on Fridays
Culture - comprised of our core values: Drive, Innovation, Respect, and Agility
Provident Fund
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our organization is currently seeking a skilled Senior Data Engineer to become part of our dynamic team. In this role, as a Senior Data Engineer, you will focus on projects involving data integration and ETL for cloud-based platforms. Your responsibilities will include crafting and executing sophisticated data solutions to ensure the data remains precise, dependable, and readily available.

Responsibilities
Design and execute sophisticated data solutions on cloud-based platforms
Create ETL processes utilizing SQL, Python, and additional relevant technologies
Maintain data precision, reliability, and accessibility for all stakeholders
Work collaboratively with interdisciplinary teams to meet data integration needs and specifications
Produce and update documentation such as technical specifications, data flow diagrams, and data mappings
Enhance and oversee data integration processes to boost performance and efficiency while ensuring data accuracy and integrity

Requirements
Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
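As an illustration of the Python-plus-Snowflake ETL pattern this role describes — a hedged sketch with hypothetical credentials, stage, and table names, not a definitive implementation (production pipelines would use a tool like Glue, ADF, or Dataflow plus proper secret management):

```python
# Hypothetical Snowflake ETL step: bulk-load staged files, derive a clean table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # hypothetical credentials
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Extract/Load: copy staged CSV files into a raw landing table.
    cur.execute(
        "COPY INTO raw_orders FROM @landing_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform: derive a cleaned reporting table inside the warehouse.
    cur.execute("""
        CREATE OR REPLACE TABLE orders_clean AS
        SELECT order_id, TO_DATE(order_date) AS order_date, amount
        FROM raw_orders
        WHERE amount IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```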
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects involving data integration and ETL for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data.

Responsibilities
Create and execute sophisticated data solutions for cloud-based platforms
Construct ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Work with cross-functional teams to comprehend data integration needs and specifications
Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity

Requirements
Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Exceptional problem-solving abilities and meticulous attention to detail
Strong verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on data integration and ETL projects for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions while ensuring the integrity, reliability, and accessibility of data.
Responsibilities
Create and execute sophisticated data solutions for cloud-based platforms
Construct ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Work with cross-functional teams to understand data integration needs and specifications
Produce and maintain documentation such as technical specifications, data flow diagrams, and data mappings
Enhance data integration processes for performance and efficiency while upholding data accuracy and integrity
Requirements
Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Exceptional problem-solving abilities and meticulous attention to detail
Strong verbal and written communication skills in English at a B2 level
Nice to have
Background in ETL using Python
Posted 3 weeks ago
8.0 years
0 Lacs
India
Remote
🔍 We're Hiring! – ML Ops Engineer (Remote, India)
📍 Location: Remote (within India)
💼 Employment Type: Full-Time / Contractor
📅 Start Date: Immediate
🕒 Working Hours: 1:30 PM IST – 10:30 PM IST (aligned with US CST)
🚀 Join Madlabs Global LLC as we lead the charge in deploying cutting-edge ML and Generative AI solutions at scale! We're looking for a highly skilled ML Ops Engineer to lead the development, deployment, and lifecycle management of AI/ML models in cloud-native (preferably GCP) environments.
💼 Key Responsibilities
Build scalable ML pipelines covering ingestion, preprocessing, training, and serving
Collaborate with Data Scientists to turn prototypes into production-ready systems
Deploy and optimize LLM-based applications (instruction-tuned and fine-tuned models)
Own continuous learning pipelines: retraining, model drift detection, performance tuning
Automate workflows using CI/CD, MLflow, and orchestration tools (see the tracking sketch after this listing)
Leverage GCP services such as Vertex AI, BigQuery, Dataflow, Pub/Sub, and Cloud Functions
Use Docker and Kubernetes to containerize and orchestrate model deployments
Monitor model performance with Prometheus, TensorBoard, Grafana, etc.
Ensure security, fairness, and compliance across ML systems
🧠 Required Experience
8+ years in ML Engineering, MLOps, or AI Infrastructure roles
Strong coding skills in Python with frameworks like TensorFlow, PyTorch, and scikit-learn
Deep expertise in GCP-native ML stacks
Hands-on experience in Generative AI model deployment and optimization
Proficiency in Docker, Kubernetes, Jenkins, and GitLab CI/CD
Solid understanding of model monitoring, versioning, rollback, and governance
🕘 Work Hours
Fully remote (India-based)
Must overlap with the US CST time zone – working hours: 1:30 PM IST to 10:30 PM IST
💬 Interested or want to learn more?
📞 Contact: +91 98868 11767
📧 Email: naveed@madlabsinfotech.com
Apply now or DM us to explore this opportunity to work with a team pushing the boundaries of AI innovation!
#Hiring #MLOps #MachineLearning #GenerativeAI #LLM #VertexAI #RemoteJobsIndia #DataEngineering #AIJobs #GCP #DevOpsForAI #MLDeployment #LinkedInJobs
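Since this listing calls out MLflow for workflow automation, below is a minimal, hedged sketch of experiment tracking with MLflow's Python API. The experiment name, parameters, and toy dataset are invented for illustration; treat this as one plausible pattern, not Madlabs' actual pipeline.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy data standing in for a real feature pipeline.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Record parameters, metrics, and the model artifact for later comparison.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```

Runs logged this way can be compared in the MLflow UI and promoted through a model registry, which is one common way to support the versioning and rollback expectations listed above.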
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions give you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality
Manage sole project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve Equifax solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint retrospectives, and other team activities
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs
What Could Set You Apart
A self-starter who identifies and responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (a minimal Beam pipeline sketch follows this listing)
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
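The "what could set you apart" list above mentions Dataflow/Apache Beam. As a hedged illustration only, the sketch below shows the shape of a minimal Beam pipeline in Python; the bucket paths and record schema are hypothetical, and running it on Dataflow would additionally require pipeline options such as --runner=DataflowRunner plus a GCP project and region.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Naive CSV split for illustration; real pipelines should use a proper parser.
    record_id, amount = line.split(",")[:2]
    return {"id": record_id, "amount": float(amount)}

options = PipelineOptions()  # defaults to the local DirectRunner

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "KeepPositive" >> beam.Filter(lambda r: r["amount"] > 0)
        | "Format" >> beam.Map(lambda r: f"{r['id']},{r['amount']:.2f}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/part")
    )
```

The same pipeline code runs unchanged on the local DirectRunner for testing and on the managed Dataflow service in production, which is the main appeal of the Beam programming model.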
Posted 3 weeks ago
The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.
India's major tech hubs are known for their thriving ecosystems and are home to numerous companies actively hiring for dataflow roles.
Salaries for dataflow professionals in India vary with experience. Entry-level positions typically earn INR 4-6 lakhs per annum, while experienced professionals can command INR 12-15 lakhs per annum or more.
In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.
In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!