4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Software: fuel for mobility
We bring bold digital visions to life. So we're on the lookout for more curious and creative engineers who want to create change – one line of high-quality code at a time. Our transformation isn't for everyone, but if you're excited about solving the leading-edge technological challenges facing the auto industry, then let's talk about your next move.

Let's introduce ourselves
At Volvo Cars, curiosity, collaboration, and continuous learning define our culture. Join our mission to create sustainable transportation solutions that protect what matters most – people, communities, and the planet. As a Data Engineer, you will drive digital innovation, leading critical technology initiatives with global teams. You'll design and implement solutions impacting millions worldwide, supporting Volvo's vision for autonomous, electric, and connected vehicles.

What You'll Do
Technical Leadership & Development
- Lead development and implementation using Airflow, Amazon Web Services (AWS), Azure, Azure Data Factory (ADF), Big Data and Analytics, Core Data, Data Analysis, ETL/ELT, Power BI, SQL / SQL Script, and Snowflake
- Design, build, and maintain scalable solutions supporting global operations
- Collaborate closely with USA stakeholders across product management and engineering
- Promote technical excellence through code reviews, architecture decisions, and best practices
Cross-Functional Collaboration
- Partner internationally using Microsoft Teams, Slack, SharePoint, and Azure DevOps
- Participate in Agile processes and sprint planning
- Share knowledge and maintain technical documentation across regions
- Support 24/7 operations through on-call rotations and incident management
Innovation & Continuous Improvement
- Research emerging technologies to enhance platform capabilities
- Contribute to roadmap planning and architecture decisions
- Mentor junior team members and encourage knowledge sharing

What You'll Bring
Professional Experience
- 4-8 years of hands-on experience in software development, system administration, or related fields
- Deep expertise in Airflow, AWS, Azure, ADF, Big Data, Core Data, Data Analysis, ETL/ELT, Power BI, SQL, and Snowflake with proven implementation success
- Experience collaborating with global teams across time zones
- Preferred industry knowledge in automotive, manufacturing, or enterprise software
Technical Proficiency
- Advanced skills in core technologies: Airflow, AWS, Azure, ADF, Big Data, Core Data, Data Analysis, ETL/ELT, Power BI, SQL, Snowflake
- Strong grasp of cloud platforms, DevOps, and CI/CD pipelines
- Experience with enterprise integration and microservices architecture
- Skilled in database design and optimization with SQL and NoSQL
Essential Soft Skills
- Analytical Thinking, Collaboration, Communication Skills, Critical Thinking, Documentation Best Practices, Problem Solving, Written Communication
- Excellent communication, able to explain complex technical topics
- Adaptable in multicultural, globally distributed teams
- Strong problem-solving abilities
Additional Qualifications
- Business-level English fluency
- Flexibility to collaborate across USA time zones

Volvo Cars – driving change together
Volvo Cars' success is the result of a collaborative, diverse and inclusive working environment. Today, we're one of the most well-known and respected car brands, with around 43,000 employees across the globe. At Volvo Cars, your career is designed around your skills and aspirations, so you can reach your fullest potential.
And it’s so exciting – we’re well on our way on our journey towards full electrification. We have five fully electric cars already on the market, and five more on the way. Our fully-electric and plug-in hybrid cars combined make up almost 50 per cent of our sales. So come and join us in shaping the future of mobility. There’s never been a more rewarding time to play your part in our inspiring and creative teams!
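For illustration only, here is a minimal sketch of the kind of Airflow orchestration a role like this might involve: a daily extract-transform-load DAG. The task names, schedule, sample data, and Snowflake target are hypothetical and are not taken from the posting; it assumes Apache Airflow 2.4+ with the TaskFlow API.

```python
# Hypothetical Airflow 2.4+ DAG sketching a daily extract-transform-load toward Snowflake.
# Task logic, table names, and schedule are illustrative assumptions only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["example"])
def vehicle_telemetry_etl():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull from an API, S3/ADLS landing zone, etc.
        return [{"vehicle_id": 1, "battery_pct": 87}, {"vehicle_id": 2, "battery_pct": 54}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple wrangling step: flag vehicles with low battery.
        return [{**r, "low_battery": r["battery_pct"] < 60} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real task would write to Snowflake, e.g. via the Snowflake provider's hook.
        print(f"Would load {len(rows)} rows into ANALYTICS.TELEMETRY_DAILY")

    load(transform(extract()))


vehicle_telemetry_etl()
```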
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Noida
On-site
5 - 7 Years | 2 Openings | Noida

Role description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule / timelines
- Adherence to SLAs where applicable
- # of defects post delivery
- # of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Skills
- Cloud Platforms (AWS, MS Azure, GC, etc.)
- Containerization and Orchestration (Docker, Kubernetes, etc.)
- APIs - Change APIs to APIs development
- Data Pipeline construction using languages like Python, PySpark, and SQL
- Data Streaming (Kafka and Azure Event Hub, etc.)
- Data Parsing (Akka and MinIO, etc.)
- Database Management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.)
- Agile Methodology (Git, Jenkins, or Azure DevOps, etc.)
- JS-like connectors/frameworks for frontend/backend
- Collaboration and Communication Skills
- AWS Cloud, Azure Cloud, Docker, Kubernetes

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
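As a rough illustration of the pipeline work this role description calls for (ingesting, wrangling, transforming, and joining data with PySpark and SQL), here is a minimal, self-contained sketch. The file paths, schemas, and business rules are hypothetical and not part of the posting.

```python
# Minimal PySpark sketch: ingest two sources, wrangle, join, and write a curated table.
# Paths, column names, and the output location are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Ingest: raw CSV orders and a JSON customer dimension from a landing zone.
orders = spark.read.option("header", True).csv("/landing/orders/*.csv")
customers = spark.read.json("/landing/customers/")

# Wrangle/transform: cast types, drop bad rows, derive a revenue column.
orders_clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("quantity", F.col("quantity").cast("int"))
    .withColumn("unit_price", F.col("unit_price").cast("double"))
    .dropna(subset=["order_id", "customer_id"])
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# Join with the customer dimension and aggregate daily revenue per region.
daily_revenue = (
    orders_clean.join(customers, "customer_id", "left")
    .groupBy(F.to_date("order_ts").alias("order_date"), "region")
    .agg(F.sum("revenue").alias("total_revenue"))
)

# Load: write the curated result partitioned by date (Parquet here; Delta in a lakehouse).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("/curated/daily_revenue/")
```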
Posted 1 week ago
0 years
2 - 9 Lacs
Noida
On-site
The Senior Technical Lead in CRM / D365 CE OOTB, Configuration, Cust will be responsible for overseeing and leading technical teams to deliver high-quality CRM / D365 CE solutions. The main objective is to ensure the successful implementation, customization, and configuration of CRM / D365 CE Out-of-the-Box functionalities to meet the specific needs of the organization.

Key Responsibilities
1. Lead and manage technical teams in the design, development, and implementation of CRM / D365 CE solutions.
2. Define and implement best practices for CRM / D365 CE out-of-the-box configurations and customizations.
3. Collaborate with stakeholders to gather requirements and provide technical expertise in CRM / D365 CE solution design.
4. Perform system analysis, troubleshooting, and debugging to ensure smooth operation of CRM / D365 CE systems.
5. Provide guidance and mentorship to junior team members to enhance their technical skills and capabilities.
6. Stay updated on the latest CRM / D365 CE trends, updates, and features to propose innovative solutions.

Skill Requirements
1. Strong proficiency in CRM / D365 CE out-of-the-box functionalities, configurations, and customizations.
2. Extensive experience in leading technical teams and managing CRM / D365 CE implementation projects.
3. In-depth knowledge of CRM / D365 CE architecture, data models, and integration capabilities.
4. Excellent problem-solving skills and ability to analyze complex CRM / D365 CE issues.
5. Strong communication skills to effectively collaborate with cross-functional teams and stakeholders.
6. Ability to prioritize tasks, meet deadlines, and deliver high-quality CRM / D365 CE solutions.

Certifications: Microsoft Certified: Dynamics 365 Customer Service Functional Consultant Associate or similar certifications preferred.

No. of Positions: 1
Skill (Primary): Microsoft Dynamics (APPS)-Customer Engagement-Technical-MsD-Microsoft Dynamics 365
Auto req ID: 1589518BR
Skill Level 3 (Secondary Skill 1): Data Fabric-Azure-Azure Data Factory (ADF)
Skill Level 3 (Secondary Skill 2): Microsoft Dynamics (APPS)-MsD-General-Tools and Standards-SSIS/KingswaySoft
Skill Level 3 (Secondary Skill 3): Technical Skills (APPS)-Datawarehouse-Extract Transform Load (ETL) Automation
Skill Level 3 (Secondary Skill 4): Technical Skills (APPS)-Databases-RDBMS-Microsoft SQL Server
Posted 1 week ago
8.0 years
18 - 30 Lacs
Thiruvananthapuram, Kerala
On-site
Designation: Senior Dot Net Developer
Qualification: Any UG / PG Degree / Engineering Graduates
Experience: Minimum 8+ Years
Gender: Male / Female
Job Location: Trivandrum / Kochi (KERALA)
Job Type: Full Time | Day Shift | Sat & Sun Week Off
Working Time: 12:01 PM to 9:00 PM
Project: European client | Shift: Mid Shift (12:01 PM to 9:00 PM) | WFO

Job Description:
Candidates with 8+ years of experience in the IT industry and with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with xUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Develop and maintain frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps - CI/CD pipelines (Classic / YAML)

Job Types: Full-time, Permanent
Pay: ₹1,800,000.00 - ₹3,000,000.00 per year
Benefits: Cell phone reimbursement, food provided, health insurance, internet reimbursement, paid sick time, paid time off, Provident Fund
Location Type: In-person
Schedule: Day shift, Monday to Friday
Work Location: In person
Speak with the employer: +91 9489357211
Posted 1 week ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years (Total - 5 yrs and Relevant - 5 yrs)
Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF. Lead experience is an added advantage.
Interested candidates, please send your resume to: gigin.raj@greenbayit.com / 8943011666
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: The Metrics Insights and Analytics team is responsible for building dashboards and analytical solutions using AI/ML based on requirements from the business. Provide predictive and prescriptive analytics based on various delivery execution parameters and give actionable insights to users. Automate processes using new-age machine learning algorithms.

Key Roles and Responsibilities:
• Conceptualize, maintain, and automate dashboards as per the requirements
• Automate existing processes to improve productivity and time to market
• Enable decision making and action plan identification through metrics analytics
• Conduct training and presentations
• Connect with various stakeholders to understand business problems and provide solutions
• Bring new-age solutions and techniques into the way of working

Skills:
• Minimum 2-5 years of work experience on Power BI dashboards / Tableau and Python
• Minimum 2-5 years of work experience on AI/ML development
• Strong analytical skills, adept in solutioning & problem solving, inclination towards numbers
• Experience of working on text analytics, NLP
• Experienced in data cleansing, pre-processing data and exploratory data analysis
• Knowledge of Azure ADF, Excel macros, RPA will be an advantage
• Able to perform feature engineering, normalize data and build correlation maps
• Proficient in SQL
• Hands-on experience in model operationalization and pipeline management
• Capable of working with global teams
• Good presentation and training skills

LTIMindtree (https://www.ltimindtree.com/) is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree — a Larsen & Toubro Group company — combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit www.ltimindtree.com.

DEI Statement: LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender-identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability, or any other characteristic protected by applicable law.
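By way of illustration only, here is a minimal sketch of the kind of text-analytics workflow mentioned above (data cleansing, feature engineering, and a simple model) using scikit-learn; the sample remarks, labels, and column choices are made up and not part of the role description.

```python
# Hypothetical mini-example: cleanse free-text delivery remarks and train a simple classifier
# that flags records likely to need follow-up. Data and labels are illustrative only.
import re

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

remarks = [
    "delivery delayed due to vendor dependency",
    "all milestones completed on schedule",
    "rework required after failed regression cycle",
    "client signed off without issues",
]
needs_followup = [1, 0, 1, 0]

def cleanse(text: str) -> str:
    # Basic cleansing: lowercase and strip non-alphabetic characters.
    return re.sub(r"[^a-z ]", " ", text.lower())

model = Pipeline([
    ("tfidf", TfidfVectorizer(preprocessor=cleanse, ngram_range=(1, 2))),
    ("clf", LogisticRegression()),
])
model.fit(remarks, needs_followup)

print(model.predict(["schedule slipped, regression failures reported"]))  # e.g. [1]
```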
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles & Responsibilities
Design and develop end-to-end data solutions using PySpark, Python, SQL, and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements
- Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
- Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
- Healthcare domain knowledge is a plus but not mandatory.

Our Offering
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs & work-life balance – integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.

Let's grow together.
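As an illustrative sketch only: reading a Kafka topic with PySpark Structured Streaming and landing it in a lakehouse table, roughly the kind of pipeline the requirements describe. The broker address, topic, schema, and output paths are hypothetical; a Microsoft Fabric Lakehouse would use the workspace's own table locations rather than the paths shown.

```python
# Hypothetical PySpark Structured Streaming job: consume JSON events from Kafka
# and append them to a Delta table. Broker, topic, schema, and paths are assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka_to_lakehouse").getOrCreate()

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "device-events")               # placeholder topic
    .load()
)

events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withColumn("ingest_date", F.to_date("event_time"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/lakehouse/checkpoints/device_events")  # placeholder
    .outputMode("append")
    .start("/lakehouse/tables/device_events")            # placeholder table path
)
query.awaitTermination()
```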
Posted 1 week ago
8.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring for a Data Engineer role.
Experience: 8-14 Years
Locations: Pune, Chennai
Notice Period: Immediate Joiners
Mandatory Skills: Python, PySpark, Databricks, Unity Catalog, DLT (Delta Live Tables), Databricks Workflows, Azure/AWS cloud, ADF/Orchestrator, CI/CD.
Qualifications: B.Tech, M.Tech, B.E., B.Com, B.Sc, B.A, MBA
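For illustration, a minimal Delta Live Tables (DLT) sketch of the kind this skill set implies: a raw ingestion table and a cleaned table with a data-quality expectation. The table names and source path are hypothetical, and the code only runs when attached to a Databricks DLT pipeline (where the `spark` session is provided by the runtime), not as a standalone script.

```python
# Hypothetical Delta Live Tables definitions. Runs only inside a Databricks DLT pipeline;
# the landing path and column names are placeholders.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw orders ingested from a landing folder (placeholder path).")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")        # Auto Loader; `spark` is injected by DLT
        .option("cloudFiles.format", "json")
        .load("/Volumes/landing/orders/")             # placeholder location
    )


@dlt.table(comment="Orders with valid amounts, ready for downstream marts.")
@dlt.expect_or_drop("valid_amount", "amount > 0")     # rows failing the expectation are dropped
def orders_clean():
    return (
        dlt.read_stream("orders_raw")
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
    )
```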
Posted 1 week ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience
Start Date: Aug 6, 2025
Salary: 18 to 23 LPA

Job Purpose (both Onsite / Offshore)
Responsible for delivering senior-level innovative, compelling, coherent software solutions for our consumer, internal operations and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components.

Job Specification / Skills and Competencies
- Designs, develops and delivers solutions that meet business line and enterprise requirements.
- Participates in rapid prototyping and POC development efforts.
- Advances overall enterprise technical architecture and implementation best practices.
- Assists in efforts to develop and refine functional and non-functional requirements.
- Participates in iteration and release planning.
- Informs efforts to develop and refine functional and non-functional requirements.
- Demonstrates knowledge of, adherence to, monitoring and responsibility for compliance with state and federal regulations and laws as they pertain to this position.
- Strong ability to produce high-quality, properly functioning deliverables the first time.
- Delivers work product according to established deadlines.
- Estimates tasks with a level of granularity and accuracy commensurate with the information provided.
- Works collaboratively in a small team.
- Excels in a rapid iteration environment with short turnaround times.
- Deals positively with high levels of uncertainty, ambiguity, and shifting priorities.
- Accepts a wide variety of tasks and pitches in wherever needed.
- Constructively presents, discusses and debates alternatives. Takes shared ownership of the product.
- Communicates effectively both verbally and in writing.
- Takes direction from team leads and upper management.
- Ability to work with little to no supervision while performing duties.
- Proficient in SSIS & ADF.
- Strong in MS SQL.
- Hands-on experience in Data Lake.
- Hands-on experience in data marts and data warehousing, including variant schemas (Star, Snowflake).
- 5+ years of experience with advanced queries, stored procedures, views, triggers, etc.
- 5+ years of experience in performance tuning of queries.
- 5+ years of experience with both DDL and DML.
- 5+ years of experience designing enterprise database systems using Microsoft SQL Server/Azure SQL preferred.
- Experience in Lakehouse architecture preferred.
- Experience with cloud technologies – AWS, Snowflake is preferred.
- Deep understanding of one or more source/version control systems; develops branching and merging strategies.
- Working understanding of Web API, REST, JSON.
- Working understanding of unit test creation.
- Bachelor's Degree is required, and/or a minimum of four (4)+ years of related work experience.
- To adhere to the Information Security Management policies and procedures.
Posted 1 week ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Description Role: Senior Data Engineer Location: Kochi / Trivandrum /Bangalore Experience: 5+years Mandatory skills : Strong in MS SQL and SSIS , Data Lake, Azure SQL, SSIS, ADF , Lead experience Start Date: Aug 6 ,2025 Salary- 18 to 23 LPA Job Purpose (both Onsite / Offshore) Responsible for delivering senior-level innovative, compelling, coherent software solutions for our consumer, internal operations and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components. Job Specification / Skills and Competencies 1. Designs, develops and delivers solutions that meet business line and enterprise requirements. 2. Participates in rapid prototyping and POC development efforts. 3. Advances overall enterprise technical architecture and implementation best practices. 4. Assists in efforts to develop and refine functional and non-functional requirements. 5. Participates in iteration and release planning. 6. Informs efforts to develop and refine functional and non-functional requirements. 7. Demonstrates knowledge of, adherence to, monitoring and responsibility for compliance with state and federal regulations and laws as they pertain to this position. 8. Strong ability to produce high-quality, properly functioning deliverables the first time. 9. Delivers work product according to established deadlines. 10. Estimates tasks with a level of granularity and accuracy commensurate with the information provided. 11. Works collaboratively in a small team. 12. Excels in a rapid iteration environment with short turnaround times. 13. Deals positively with high levels of uncertainty, ambiguity, and shifting priorities. 14. Accepts a wide variety of tasks and pitches in wherever needed. 15. Constructively presents, discusses and debates alternatives. 16. Takes shared ownership of the product. 17. Communicates effectively both verbally and in writing. 18. Takes direction from team leads and upper management. 19. Ability to work with little to no supervision while performing duties. 20. Proficient in SSIS & ADF 21. Strong in MS SQL 22. Hands on experience in Data Lake 23. Hands-on experience in data mart and data warehousing including variant schemas (Star, Snowflake). 24. 5+ years of experience with advanced queries, stored procedures, views, triggers, etc. 25. 5+ years of experience of performance tuning queries. 26. 5+ years of experience of both DDL and DML. 27. 5+ years of experience designing enterprise database systems using Microsoft SQL Server/Azure SQL preferred. 28. Experience in Lakehouse architecture preferred. 29. Experience with Cloud technologies – AWS, Snowflake is preferred. 30. Deep understanding of one or more source/version control systems. Develops branching and merging strategies. 31. Working understanding of Web API, REST, JSON. 32. Working understanding of unit testing creation. 33. Bachelor’s Degree is required, and/or a minimum of four (4) + related work experience. 34. To adhere to the Information Security Management policies and procedures.
Posted 1 week ago
10.0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
HCLTech is looking for a passionate and experienced Azure Data Engineer to join our growing team. If you have strong hands-on experience with Azure Data Factory , Azure Databricks , and Oracle , and are excited to work on impactful data projects, we want to hear from you! 🔹 We're Hiring: Azure Data Engineer 📍 Location: Lucknow 🏢 Company: HCLTech 🕒 Shift: Rotational 💼 Project Type: Support / Development 📅 Experience: 5–10 Years 🎯 Customer Interview: Not Required Key Responsibilities Design, develop, and maintain data pipelines using Azure Data Factory and Azure Databricks. Work with Oracle databases for data extraction, transformation, and loading (ETL). Collaborate with cross-functional teams to support and enhance data solutions. Optimize and troubleshoot data workflows and performance issues. Participate in support and development activities across multiple projects. Why Join Us? Work on cutting-edge Azure technologies. Flexible work location with physical presence in Lucknow. Collaborative and growth-oriented environment. No customer interviews – quick onboarding process. 📩 Apply Now Ready to take the next step in your ADF data engineering career? 📧 Drop your resume on sushma-bisht@hcltech.com
Posted 1 week ago
0.0 - 40.0 years
0 Lacs
Gurugram, Haryana
On-site
Additional Locations: India-Haryana, Gurgaon

Diversity - Innovation - Caring - Global Collaboration - Winning Spirit - High Performance

At Boston Scientific, we'll give you the opportunity to harness all that's within you by working in teams of diverse and high-performing employees, tackling some of the most important health industry challenges. With access to the latest tools, information and training, we'll help you in advancing your skills and career. Here, you'll be supported in progressing – whatever your ambitions.

Senior Software Engineer - MLOps
We are looking for a highly skilled Senior Software Engineer – MLOps with deep expertise in building and managing production-grade ML pipelines in AWS and Azure cloud environments. This role requires a strong foundation in software engineering, DevOps principles, and ML model lifecycle automation to enable reliable and scalable machine learning operations across the organization.

Key Responsibilities include:
- Design and build robust MLOps pipelines for model training, validation, deployment, and monitoring
- Automate workflows using CI/CD tools such as GitLab Actions, Azure DevOps, Jenkins, or Argo Workflows
- Build and manage ML workloads on AWS (SageMaker Unified Studio, Bedrock, EKS, Lambda, S3, Athena) and Azure (Azure ML Foundry, AKS, ADF, Blob Storage)
- Design secure and cost-efficient ML architecture leveraging cloud-native services
- Manage infrastructure using IaC tools such as Terraform, Bicep, or CloudFormation
- Implement cost optimization and performance tuning for cloud workloads
- Package ML models using Docker, and orchestrate deployments with Kubernetes on EKS/AKS
- Ensure robust CI/CD pipelines and infrastructure as code (IaC) using tools like Terraform or CloudFormation
- Integrate observability tools for model performance, drift detection, and lineage tracking (e.g., Fiddler, MLflow, Prometheus, Grafana, Azure Monitor, CloudWatch)
- Ensure model reproducibility, versioning, and compliance with audit and regulatory requirements
- Collaborate with data scientists, software engineers, DevOps, and cloud architects to operationalize AI/ML use cases
- Mentor junior MLOps engineers and evangelize MLOps best practices across teams

Required Qualifications:
- Bachelor's/Master's in Computer Science, Engineering, or related discipline
- 10 years in DevOps, with 2+ years in MLOps
- Proficient with MLflow, Airflow, FastAPI, Docker, Kubernetes, and Git
- Experience with feature stores (e.g., Feast), model registries, and experiment tracking
- Proficiency in DevOps & MLOps automation: CloudFormation/Terraform/Bicep

Requisition ID: 610750

As a leader in medical science for more than 40 years, we are committed to solving the challenges that matter most – united by a deep caring for human life. Our mission to advance science for life is about transforming lives through innovative medical solutions that improve patient lives, create value for our customers, and support our employees and the communities in which we operate. Now more than ever, we have a responsibility to apply those values to everything we do – as a global business and as a global corporate citizen. So, choosing a career with Boston Scientific (NYSE: BSX) isn't just business, it's personal. And if you're a natural problem-solver with the imagination, determination, and spirit to make a meaningful difference to people worldwide, we encourage you to apply and look forward to connecting with you!
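Purely as an illustration of the experiment-tracking side of this work, here is a minimal MLflow sketch; the experiment name, parameters, and model are placeholders, and a production setup would point MLFLOW_TRACKING_URI at a shared tracking server or registry rather than local files.

```python
# Hypothetical MLflow tracking example: log parameters, metrics, and a model artifact
# for a toy classifier. Experiment name and data are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-mlops-pipeline")  # placeholder experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, artifact_path="model")

    print(f"Logged run with accuracy={accuracy:.3f}")
```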
Posted 1 week ago
0.0 - 40.0 years
0 Lacs
Gurugram, Haryana
On-site
Additional Locations: India-Haryana, Gurgaon Diversity - Innovation - Caring - Global Collaboration - Winning Spirit - High Performance At Boston Scientific, we’ll give you the opportunity to harness all that’s within you by working in teams of diverse and high-performing employees, tackling some of the most important health industry challenges. With access to the latest tools, information and training, we’ll help you in advancing your skills and career. Here, you’ll be supported in progressing – whatever your ambitions. Software Engineer-MLOps We are seeking an enthusiastic and detail-oriented MLOps Engineer to support the development, deployment, and monitoring of machine learning models in production environments. This is a hands-on role ideal for candidates looking to grow their skills at the intersection of data science, software engineering, and DevOps. You will work closely with senior MLOps engineers, data scientists, and software developers to build scalable, reliable, and automated ML workflows across cloud platforms like AWS and Azure Key Responsibilities include: Assist in building and maintaining ML pipelines for data preparation, training, testing, and deployment Support the automation of model lifecycle tasks including versioning, packaging, and monitoring Build and manage ML workloads on AWS (SageMaker Unified studio, Bedrock, EKS, Lambda, S3, Athena) and Azure (Azure ML Foundry, AKS, ADF, Blob Storage) Assist with containerizing ML models using Docker, and deploying using Kubernetes or cloud-native orchestrators Manage infrastructure using IaC tools such as Terraform, Bicep, or CloudFormation Participate in implementing CI/CD pipelines for ML workflows using GitHub Actions, Azure DevOps, or Jenkins Contribute to testing frameworks for ML models and data validation (e.g., pytest, Great Expectations). Ensure robust CI/CD pipelines and infrastructure as code (IaC) using tools like Terraform or CloudFormation Participate in diagnosing issues related to model accuracy, latency, or infrastructure bottlenecks Continuously improve knowledge of MLOps tools, ML frameworks, and cloud practices. Required Qualification: Bachelor's/Master’s in Computer Science, Engineering, or related discipline 7 years in Devops, with 2+ years in MLOps. Good Understanding of MLflow, Airflow, FastAPI, Docker, Kubernetes, and Git. Proficient in Python and familiar with bash scripting Exposure to MLOps platforms or tools such as SageMaker Studio, Azure ML, or GCP Vertex AI. Requisition ID: 610751 As a leader in medical science for more than 40 years, we are committed to solving the challenges that matter most – united by a deep caring for human life. Our mission to advance science for life is about transforming lives through innovative medical solutions that improve patient lives, create value for our customers, and support our employees and the communities in which we operate. Now more than ever, we have a responsibility to apply those values to everything we do – as a global business and as a global corporate citizen. So, choosing a career with Boston Scientific (NYSE: BSX) isn’t just business, it’s personal. And if you’re a natural problem-solver with the imagination, determination, and spirit to make a meaningful difference to people worldwide, we encourage you to apply and look forward to connecting with you!
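As a small, hedged illustration of the FastAPI/Docker side of this role, here is a sketch of a model-scoring endpoint; the model file, feature names, and route are hypothetical and not part of the posting.

```python
# Hypothetical FastAPI inference service: load a pickled model at startup and expose
# a /predict endpoint. Model path and feature schema are illustrative assumptions.
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-model-service")


class Features(BaseModel):
    age: float
    usage_hours: float
    error_count: int


# In a real service the artifact would come from a model registry (e.g. MLflow).
with open("model.pkl", "rb") as f:  # placeholder artifact path
    model = pickle.load(f)


@app.post("/predict")
def predict(features: Features) -> dict:
    row = [[features.age, features.usage_hours, features.error_count]]
    prediction = model.predict(row)[0]
    return {"prediction": int(prediction)}

# Run locally with: uvicorn app:app --reload   (then POST JSON to /predict)
```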
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a seasoned Senior ETL/DB Tester, you will be responsible for designing, developing, and executing comprehensive test plans to validate ETL and database processes. Your expertise in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI will be crucial in ensuring data integrity and accuracy across modern data platforms. Your analytical skills, attention to detail, and ability to collaborate with cross-functional teams in a fast-paced data engineering environment will be key in this role. Your main responsibilities will include validating data transformations and integrity, performing manual testing and defect tracking using tools like Zephyr or Tosca, analyzing business and data requirements for test coverage, and writing complex SQL queries for data reconciliation. You will also be expected to identify data-related issues, conduct root cause analysis in collaboration with developers, and track bugs and enhancements using appropriate tools. In addition, you will optimize testing strategies for performance, scalability, and accuracy in ETL processes. Your skills in ETL tools like Talend, ADF, data platforms like Snowflake, and reporting/analytics tools such as Power BI and VPI will be essential for success in this role. Your expertise in API testing and advanced features of Power BI like Dashboards, DAX, and Data Modelling will further strengthen your testing capabilities. Overall, your role as a Senior ETL/DB Tester will require a combination of technical skills, testing proficiency, and collaboration with various teams to ensure the reliability and accuracy of data processes across different data platforms and tools.,
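To make the data-reconciliation idea above concrete, here is a small, self-contained sketch using pandas; in practice the two frames would come from SQL queries against the source system and the Snowflake/SQL target, and the key and column names here are invented.

```python
# Hypothetical source-vs-target reconciliation: compare row counts and flag value mismatches.
# In a real test the DataFrames would be loaded via SQL (e.g. pandas.read_sql) from both systems.
import pandas as pd

source = pd.DataFrame({"order_id": [1, 2, 3], "amount": [100.0, 250.0, 75.5]})
target = pd.DataFrame({"order_id": [1, 2, 4], "amount": [100.0, 260.0, 80.0]})

# 1. Row-count check.
print(f"source rows={len(source)}, target rows={len(target)}")

# 2. Keys present on one side only.
merged = source.merge(target, on="order_id", how="outer", suffixes=("_src", "_tgt"), indicator=True)
missing = merged[merged["_merge"] != "both"]
print("keys missing on one side:\n", missing[["order_id", "_merge"]])

# 3. Value mismatches for keys present in both.
both = merged[merged["_merge"] == "both"]
mismatches = both[both["amount_src"] != both["amount_tgt"]]
print("value mismatches:\n", mismatches[["order_id", "amount_src", "amount_tgt"]])
```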
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Power BI Developer at InfoBeans, you will play a crucial role in assisting Business analysts in developing, maintaining, and supporting operational/live reports, dashboards, and scorecards using Microsoft Power BI. Your responsibilities will include the implementation of Row Level Security by defining various constraints for each defined ROLE. In this role, you will have the opportunity to work in a dynamic environment alongside smart and pragmatic team members. You will be part of a learning culture that values teamwork, collaboration, diversity, and rewards excellence, compassion, openness, and ownership. Furthermore, you can expect ever-growing opportunities for professional and personal growth. To excel in this role, we expect you to have expertise in Power BI Desktop, mobile, and service development, along with proficiency in MSBI (SSIS, Tabular SSAS, SSRS) with DAX. Your knowledge should also include SQL Server 2012/2014/2016 - TSQL development, Snowflake, Microstrategy, Informatica Power Center, ADF, and other Azure BI Technologies. Your experience in creating dashboards, volume reports, operating summaries, presentations, and graphs will be highly beneficial. Additionally, you should be skilled in SSRS Integration to Power BI, SSAS, Data Gateway for data refreshing, content Pack Library, Managing Embed Codes, Power BI Mobile, and SQL Server versions. As a proficient data visualization expert, you should have strong application development skills and be knowledgeable in Azure. Your expertise in creating calculated measures and columns with DAX in MS Power BI Desktop, Custom Visuals, Groups usage, publishing reports to app.powerbi.com, and setting up necessary connection details will be essential. You should be an expert in connecting Microsoft Power BI Desktop to various data sources, using advanced calculations, and creating different visualizations using a range of tools such as Slicers, Lines, Pies, Histograms, Maps, Scatter, Bullets, Heat Maps, Tree maps, among others. Your proficiency in these areas will be key to your success in this role.,
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
Are you a creative engineer who loves a challenge? Solve the complex puzzles you've been dreaming of as our Customer Success Engineer. If you have a passion for innovation in tech, we want you on our team!

Oracle is a technology leader that's changing how the world does business, and our Customer Success Services (CSS) team supports over 6,000 companies around the world. We're looking for a talented and self-motivated engineer to work on-site in our Oracle ME offices. Join the team of highly skilled technical experts who build and maintain our clients' technical landscapes through tailored support services.

We are looking for a Principal Fusion HCM Techno-Functional Consultant who will be responsible for providing consultancy, working with customers, translating ideas and concepts into implementable, supportable designs, and who has experience in providing technical solutions aligned with Oracle standards. You will also have experience in maintaining and supporting customers' eBusiness Suite applications and Fusion SaaS, either on-site or remotely. This role plays a direct part in building, maintenance, technical support, documentation, and administration of Oracle Cloud applications.

**What You Will Do**
As a Principal Fusion HCM Techno-Functional Consultant in Oracle CSS, you will:
- Act as a technical team leader and coach team members in relevant skills, finding ways to recognize the contributions of others in the team.
- Assess and analyze customers' business needs to make sure that Oracle solutions meet the customer's objectives.
- Assist customers in their overall Journey to Cloud.
- Ensure Oracle cloud technologies are leveraged appropriately using best practices.
- Be the Oracle Solution Delivery authority to ensure that customers make informed decisions regarding scope to achieve beneficial solutions cost-effectiveness, quality, and reusability.
- Provide technical guidance on Oracle cloud and/or on-premise solutions to customers and other Oracle team members to underpin successful delivery.
- Support solutions around multi-cloud and hybrid cloud setups.
- Ensure successful handover from implementation toward operations, making sure the implemented solution will fit the customer requirements.
- Maintain the Oracle solution to make sure the customer's demands and needs are met. Platforms for Oracle solutions are on-premise, cloud, or hybrid, running various workloads (application, middleware, database, and infrastructure).
- Work closely with the Technical Account Manager to ensure that the individual work streams are technically well managed.
- Be the main contact for new business opportunities by supporting our presales team; identify and promote opportunities for sales of Oracle products and services to support business growth.
- Actively lead and contribute to strategic programs and initiatives.
- To summarize: help customers use and take the best advantage of all the value our company offers.

**What We Are Looking For**
- 10+ years of relevant professional experience. Bachelor's degree in computer science, information systems, software engineering, or a related field preferred.
- Strong experience in implementing Fusion Applications, with at least 4 full cycles of successful implementations.
- A good understanding of the Fusion quarterly update process and best practices for new feature adoption, testing, and change management.
- Strong knowledge of roles and security.
- Proven experience with Oracle Transactional Business Intelligence (OTBI), dashboards, all types of data loaders, extracts, fast formula, error handling, SOAP services, BPM, personalization, sandboxes, page composer, etc.
- Experience designing and developing customizations using Visual Builder, ADF, and Process Builder in OIC to Oracle ERP Cloud is a plus.

For this position, we are looking for a creative, innovative, and motivated professional with an open and flexible mindset who will work closely with the customer to ensure alignment between business change, IT architecture, technical solutions, business resources, and processes. As an integral part of a global organization, the Principal HCM Engineer will be working within an international environment with colleagues around the globe and will contribute to global technology-driven initiatives and innovation programs for continuous service improvements.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
Dear candidates, ValueLabs is currently looking for a BI Lead with a strong background in Power BI and SQL to join our team at the earliest. The ideal candidate should possess 8-12 years of experience and expertise in Power BI, SQL queries, and Azure Data Factory (ADF). As a Technical Lead, your key responsibilities will include creating engaging and interactive reports and dashboards using Power BI Desktop. You will also be tasked with designing and implementing Power BI data models that seamlessly integrate with various data sources. Additionally, you will be expected to automate report delivery and scheduling using tools like Power Automate. The role will also require you to have experience in team management, excellent communication skills, and the ability to collaborate with business stakeholders to understand their reporting requirements and translate them into actionable insights. You will be responsible for developing and maintaining ETL processes using Azure Data Factory and data warehouses using Azure Synapse Analytics. As a part of the role, you will oversee and manage a team of data engineers, ensuring they meet project deadlines and deliver high-quality work. You will also be responsible for developing and implementing team guidelines, policies, and procedures to enhance productivity and performance. Mentoring and coaching team members to improve their skills and career development will be crucial, along with conducting regular one-on-one meetings to discuss progress, address concerns, and set goals. If you are interested in this position, please submit your resume to deepika.malisetti@valuelabs.com. We encourage you to share this job opportunity with anyone who might benefit from it, and references are highly appreciated. Best regards, ValueLabs Team,
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
What will you do:
- Improve and maintain Azure-based data warehouse solutions.
- Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks.
- Manage relationships with IT vendors to ensure optimal service delivery and performance.
- Offer best practices, advice and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions.
- Act as the liaison between technical teams and business stakeholders to ensure effective service delivery.
- Collaborate with cloud architects and engineers to optimize cost, performance, and security.
- Assist with onboarding new Azure services and integrating them into existing operations.
- Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions.

Operations
Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances and conduct long-term improvement operations to ensure compatibility with evolving mission requirements.

What you need:
- Bachelor's degree required; Master's degree in Computer Science or Business Administration preferred.
- 2 to 5 years of experience on the Azure platform (Synapse, ADF, Databricks, Power BI).
- Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305).
- Strong understanding of Azure services including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager.
- Experience in IT Service Management (ITSM), data analysis, and business process automation.
- Ability to develop good working relationships with technical and business teams, using strong communication and team-building skills.
- Ability to analyze numbers, trends, and data to draw new conclusions based on findings.
- Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control.

Stryker is a global leader in medical technologies and, together with its customers, is driven to make healthcare better. The company offers innovative products and services in MedSurg, Neurotechnology, Orthopaedics and Spine that help improve patient and healthcare outcomes. Alongside its customers around the world, Stryker impacts more than 150 million patients annually.
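For illustration, here is a hedged sketch of triggering and polling an Azure Data Factory pipeline run with the azure-mgmt-datafactory SDK, roughly the kind of workflow monitoring the role mentions; the subscription, resource group, factory, and pipeline names are placeholders, and exact method behavior can vary slightly between SDK versions.

```python
# Hypothetical ADF monitoring snippet: start a pipeline run and poll its status.
# All resource identifiers below are placeholders; requires azure-identity and
# azure-mgmt-datafactory, plus credentials with access to the factory.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<subscription-id>"        # placeholder
resource_group = "rg-data-platform"          # placeholder
factory_name = "adf-demo-factory"            # placeholder
pipeline_name = "pl_daily_load"              # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

run = client.pipelines.create_run(resource_group, factory_name, pipeline_name)
print(f"Started run {run.run_id}")

while True:
    status = client.pipeline_runs.get(resource_group, factory_name, run.run_id)
    print(f"Status: {status.status}")
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
```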
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes
- Adherence to engineering processes and standards
- Adherence to schedule / timelines
- Adherence to SLAs where applicable
- # of defects post delivery
- # of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments
Skills
- Cloud Platforms (AWS, MS Azure, GC, etc.)
- Containerization and Orchestration (Docker, Kubernetes, etc.)
- APIs - Change APIs to APIs development
- Data Pipeline construction using languages like Python, PySpark, and SQL
- Data Streaming (Kafka and Azure Event Hub, etc.)
- Data Parsing (Akka and MinIO, etc.)
- Database Management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.)
- Agile Methodology (Git, Jenkins, or Azure DevOps, etc.)
- JS-like connectors/frameworks for frontend/backend
- Collaboration and Communication Skills
- AWS Cloud, Azure Cloud, Docker, Kubernetes
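The skill list above calls out SQL analytics with windowing functions; as a small, self-contained illustration (using Python's built-in sqlite3 purely so the snippet runs anywhere, with made-up data), here is the kind of ROW_NUMBER query that skill implies. Note that window functions require SQLite 3.25 or newer.

```python
# Illustrative windowing-function query: latest order per customer via ROW_NUMBER().
# Uses the standard-library sqlite3 module with toy data; requires SQLite >= 3.25.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 101, 120.0, '2025-01-05'),
        (2, 101, 300.0, '2025-02-10'),
        (3, 102,  75.0, '2025-01-20');
""")

latest_per_customer = """
    SELECT customer_id, order_id, amount, order_date
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn
        FROM orders
    )
    WHERE rn = 1
"""
for row in conn.execute(latest_per_customer):
    print(row)   # one row per customer: their most recent order
```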
Posted 1 week ago
10.0 - 12.0 years
14 - 20 Lacs
Noida
Work from Office
Role & responsibilities
We are looking for a candidate with database management system knowledge, data modelling for ETL/Snowflake, solid SQL scripting experience, ADF, and data masking.
Experience: 10+ years total, 5+ years relevant
Job Location: Noida
Mode: Work From Office
Technical Skills:
10+ years of experience with Database Management Systems; configures database parameters and prototypes designs against logical data models.
Defines data repository requirements, data dictionaries, and warehousing requirements.
Optimizes database access and allocates/re-allocates database resources for optimum configuration, database performance, and cost.
ADF build knowledge and data masking; experience/knowledge with Microsoft SQL Server replication and CDC technology is a must.
Experience setting up and configuring High Availability (HA)/Replication/AlwaysOn, preferably having researched and fine-tuned such a setup, so that the person understands the what and the why and can design, implement, and configure a setup that meets client needs.
Solid SQL scripting experience in general.
Typical DBA skills and experience, including DB maintenance, table/index maintenance, backups, monitoring, security, data dictionary, integrity checks, configuration, patching, and statistics.
Experience with SQL Server 2008/2012/2014/2016, preferably across multiple editions (Standard, Enterprise, etc.), and with installing and configuring SQL Server instances in a similar capacity.
Thorough understanding of performance tuning as both a System DBA and an Application DBA (an Application DBA focuses on query and application performance tuning; a System DBA focuses on tuning and configuring the database itself).
Strong SQL skills.
Strong data modeling skills, both logical and physical.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills.
Strong experience with Microsoft tools and software (Visual Studio 2012/2015, SQL Server Management Studio, Microsoft Office, etc.).
Experience with data warehousing and OLTP database management.
Measure, track, and meet SLA metrics (analytics cycle time, schedule, accuracy, rework, etc.).
Assist in database performance tuning and in troubleshooting database issues in both OLTP and EDW environments.
Install, configure, and maintain critical SQL Server databases, both dimensional and relational, supporting internal and customer-facing applications.
Assist in the full range of SQL Server maintenance activities, for example: backups, restores, recovery models, database shrink operations, DBCC commands, and replication; table and index design, creation, and maintenance; ongoing maintenance activities, planned and automated as much as possible, to increase availability and performance while reducing manual support time.
Assist, as part of the team, in designing, building, and maintaining the future state of Merchants database platforms: SQL Server versions, editions, and components; database server configuration; database and data model standards; business continuity and high availability strategy; overall data architecture; upgrade and patching strategy.
Work on automation using PowerShell and T-SQL scripts (a minimal maintenance-automation sketch follows this listing).
Knowledge of Python, ARM, and Bicep is preferred.
Knowledge of Snowflake is preferred.
Knowledge of the ETL integration layer, SSIS, and SSRS.
Troubleshoot and fix issues with JAMS jobs running SSIS, PowerShell, T-SQL, and batch files.
Knowledge of DevOps and the Azure environment; migrate SQL Server and work with application support to support all CDC and ETL integration-layer components.
Installation and configuration of SSRS, SSIS, and SSAS.
Good to have: deep Azure experience.
Preferred candidate profile
Process Skills: general SDLC processes; understanding of Agile and Scrum software development methodologies; skill in gathering and documenting user requirements and writing technical specifications.
Behavioral Skills: good attitude and quick learner; well-developed analytical and problem-solving skills; strong oral and written communication skills; excellent team player, able to work with virtual teams; excellent leadership skills with the ability to lead, guide, and groom the team; self-motivated and capable of working independently with minimal management supervision; able to talk to clients directly and report.
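The posting above asks for automation of routine maintenance with PowerShell and T-SQL. As a rough illustration of the same idea in Python (one of the preferred skills listed, and a deliberate swap from PowerShell), the sketch below checks index fragmentation via sys.dm_db_index_physical_stats and reorganizes or rebuilds indexes past a threshold. The connection string, thresholds, and table filter are illustrative assumptions, not details from the role.

```python
# Hypothetical sketch: nightly index-maintenance helper for a SQL Server instance.
# Server, database, and thresholds are placeholders, not values from the posting.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-prod-01;DATABASE=SalesDB;Trusted_Connection=yes;"
)

FRAGMENTATION_QUERY = """
SELECT OBJECT_SCHEMA_NAME(ips.object_id)  AS schema_name,
       OBJECT_NAME(ips.object_id)         AS table_name,
       i.name                             AS index_name,
       ips.avg_fragmentation_in_percent   AS frag_pct
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE i.name IS NOT NULL AND ips.page_count > 100;
"""

def maintain_indexes(reorg_threshold: float = 10.0, rebuild_threshold: float = 30.0) -> None:
    """Reorganize moderately fragmented indexes and rebuild heavily fragmented ones."""
    with pyodbc.connect(CONN_STR, autocommit=True) as conn:
        cur = conn.cursor()
        for schema, table, index, frag in cur.execute(FRAGMENTATION_QUERY).fetchall():
            if frag >= rebuild_threshold:
                action = f"ALTER INDEX [{index}] ON [{schema}].[{table}] REBUILD"
            elif frag >= reorg_threshold:
                action = f"ALTER INDEX [{index}] ON [{schema}].[{table}] REORGANIZE"
            else:
                continue
            print(f"{schema}.{table}.{index}: {frag:.1f}% fragmented -> {action}")
            cur.execute(action)

if __name__ == "__main__":
    maintain_indexes()
```

In practice a job like this would typically run from a scheduler (e.g. a JAMS or Agent job) and write its actions to a log table rather than stdout; the thresholds here simply mirror the common 10%/30% reorganize-versus-rebuild rule of thumb.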
Posted 1 week ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Salary: ₹18 to ₹23 LPA
Notice period: Immediate joiners
Job Summary: We are looking for an experienced Senior Data Engineer to join our team and lead the design, development, and deployment of enterprise data solutions. The ideal candidate is proficient in MS SQL, SSIS, Azure Data Factory, Azure SQL, and data lakes, and has a strong background in data warehousing, performance tuning, and cloud technologies.
Key Responsibilities:
Design and implement scalable data solutions using SQL Server, SSIS, ADF, and Azure
Lead development of data pipelines, data marts, and Lakehouse architectures
Collaborate on rapid prototyping, POCs, and cross-functional project planning
Ensure delivery of high-quality, optimized queries, views, and stored procedures (a minimal star-schema query sketch follows this listing)
Provide mentoring and technical leadership to junior engineers
Work in Agile teams to deliver features within deadlines
Required Skills:
Strong hands-on experience in MS SQL, SSIS, ADF, Azure Data Lake, and Azure SQL
Expertise in Data Mart/Data Warehouse modeling (Star, Snowflake)
5+ years of writing complex queries, stored procedures, and performance tuning
Experience with cloud platforms (Azure mandatory; AWS/Snowflake preferred)
Familiarity with source control, REST APIs, JSON, and unit testing
Education: Bachelor's Degree in Computer Science or a related field, or equivalent practical experience
Apply now if you're ready to take the next step in your data engineering career and lead critical data initiatives in a dynamic, tech-driven environment.
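Since the role above centres on star-schema data marts and optimized queries, here is a minimal, hypothetical example of the kind of fact-to-dimension aggregation involved, run from Python against Azure SQL. The server, table, and column names (FactSales, DimDate, DimProduct, etc.) are invented for illustration and are not taken from the posting.

```python
# Hypothetical star-schema reporting query: a fact table joined to date and product
# dimensions. All identifiers are illustrative; swap in the warehouse's real model.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=SalesDW;"
    "UID=etl_user;PWD=<secret>;Encrypt=yes;"
)

MONTHLY_REVENUE = """
SELECT d.CalendarYear,
       d.CalendarMonth,
       p.Category,
       SUM(f.SalesAmount) AS revenue,
       COUNT_BIG(*)       AS order_lines
FROM dbo.FactSales AS f
JOIN dbo.DimDate    AS d ON d.DateKey    = f.OrderDateKey
JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
WHERE d.CalendarYear = ?
GROUP BY d.CalendarYear, d.CalendarMonth, p.Category
ORDER BY d.CalendarMonth, p.Category;
"""

def monthly_revenue(year: int):
    """Return (year, month, category, revenue, order_lines) rows for one year."""
    with pyodbc.connect(CONN_STR) as conn:
        return conn.cursor().execute(MONTHLY_REVENUE, year).fetchall()

if __name__ == "__main__":
    for row in monthly_revenue(2024):
        print(row)
```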
Posted 1 week ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.
Your Key Responsibilities
Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads.
Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
Perform ETL operations and ensure timely and accurate data movement between systems.
Write and debug intermediate-level SQL queries for data validation and issue analysis (a minimal PySpark validation sketch follows this listing).
Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed.
Support deployment activities using Azure DevOps pipelines.
Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
Participate in incident management and contribute to resolution and knowledge sharing.
Skills And Attributes For Success
Strong understanding of cloud-based data workflows, especially in Azure environments.
Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
Comfortable working with large datasets and navigating both structured and semi-structured data.
Ability to follow runbooks and SOPs, and to collaborate effectively with other technical teams.
Willingness to learn new technologies and adapt in a dynamic environment.
Good communication skills to interact with stakeholders, document findings, and share updates.
Discipline to work independently, manage priorities, and escalate issues responsibly.
To qualify for the role, you must have
2–3 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfortable working in a remote/hybrid and cross-functional team setup
Technologies and Tools
Must haves:
Working knowledge of Azure Data Factory, Data Lake, and Synapse
Exposure to Azure Databricks – ability to understand and run existing notebooks
Understanding of ETL processes and data flow concepts
Good to have:
Experience with Power BI or Tableau for basic reporting and data visualization
Exposure to Informatica CDI or any other data integration platform
Basic scripting knowledge in Python or PySpark for data processing or automation tasks
Proficiency in writing SQL for querying and analyzing structured data
Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
Experience supporting DevOps deployments or familiarity with Azure DevOps concepts.
What We Look For
Enthusiastic learners with a passion for DataOps practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
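To make the data-validation responsibility in the listing above concrete, here is a small, hypothetical PySpark check of the sort that might run in a Databricks notebook after an ingestion: it counts rows, null business keys, and duplicate keys in a Data Lake folder. The storage path and column name are placeholders, not details from the posting.

```python
# Hypothetical PySpark validation step for a Data Lake ingestion. The abfss path and
# the order_id column are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-validation").getOrCreate()

RAW_PATH = "abfss://raw@<storage_account>.dfs.core.windows.net/sales/2024/"

df = spark.read.parquet(RAW_PATH)

# Basic data-quality checks: volume, key completeness, duplicate business keys.
total_rows = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicate_keys = df.groupBy("order_id").count().filter(F.col("count") > 1).count()

print(f"rows={total_rows}, null order_id={null_keys}, duplicate order_id={duplicate_keys}")

# Fail the run loudly if the data does not meet the expected bar, so the issue is
# caught before downstream Synapse or reporting loads consume it.
if total_rows == 0 or null_keys > 0 or duplicate_keys > 0:
    raise ValueError("Validation failed: see counts above")
```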
Posted 1 week ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.
Your Key Responsibilities
Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads.
Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
Perform ETL operations and ensure timely and accurate data movement between systems.
Write and debug intermediate-level SQL queries for data validation and issue analysis.
Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed (a hedged Log Analytics query sketch follows this listing).
Support deployment activities using Azure DevOps pipelines.
Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
Participate in incident management and contribute to resolution and knowledge sharing.
Skills And Attributes For Success
Strong understanding of cloud-based data workflows, especially in Azure environments.
Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
Comfortable working with large datasets and navigating both structured and semi-structured data.
Ability to follow runbooks and SOPs, and to collaborate effectively with other technical teams.
Willingness to learn new technologies and adapt in a dynamic environment.
Good communication skills to interact with stakeholders, document findings, and share updates.
Discipline to work independently, manage priorities, and escalate issues responsibly.
To qualify for the role, you must have
2–3 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfortable working in a remote/hybrid and cross-functional team setup
Technologies and Tools
Must haves:
Working knowledge of Azure Data Factory, Data Lake, and Synapse
Exposure to Azure Databricks – ability to understand and run existing notebooks
Understanding of ETL processes and data flow concepts
Good to have:
Experience with Power BI or Tableau for basic reporting and data visualization
Exposure to Informatica CDI or any other data integration platform
Basic scripting knowledge in Python or PySpark for data processing or automation tasks
Proficiency in writing SQL for querying and analyzing structured data
Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
Experience supporting DevOps deployments or familiarity with Azure DevOps concepts.
What We Look For
Enthusiastic learners with a passion for DataOps practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
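For the pipeline-monitoring responsibility in the listing above, the sketch below queries a Log Analytics workspace for failed ADF runs over the last day. It assumes the azure-identity and azure-monitor-query Python packages and that ADF diagnostic logs land in the ADFPipelineRun table; the workspace ID and the KQL itself are illustrative, not details from the posting.

```python
# Hypothetical pipeline-health check against a Log Analytics workspace. The workspace ID
# and the ADFPipelineRun diagnostics table are assumptions about the target environment.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

WORKSPACE_ID = "<log-analytics-workspace-id>"

KQL = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(WORKSPACE_ID, KQL, timespan=timedelta(days=1))

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for pipeline_name, failures in table.rows:
            print(f"{pipeline_name}: {failures} failed run(s) in the last 24h")
else:
    # Partial or failed queries are surfaced so the on-call engineer can escalate.
    print("Query returned partial results or failed; investigate before relying on it.")
```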
Posted 1 week ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.
Your Key Responsibilities
Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads (a minimal run-and-poll sketch follows this listing).
Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
Perform ETL operations and ensure timely and accurate data movement between systems.
Write and debug intermediate-level SQL queries for data validation and issue analysis.
Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed.
Support deployment activities using Azure DevOps pipelines.
Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
Participate in incident management and contribute to resolution and knowledge sharing.
Skills And Attributes For Success
Strong understanding of cloud-based data workflows, especially in Azure environments.
Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
Comfortable working with large datasets and navigating both structured and semi-structured data.
Ability to follow runbooks and SOPs, and to collaborate effectively with other technical teams.
Willingness to learn new technologies and adapt in a dynamic environment.
Good communication skills to interact with stakeholders, document findings, and share updates.
Discipline to work independently, manage priorities, and escalate issues responsibly.
To qualify for the role, you must have
2–3 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfortable working in a remote/hybrid and cross-functional team setup
Technologies and Tools
Must haves:
Working knowledge of Azure Data Factory, Data Lake, and Synapse
Exposure to Azure Databricks – ability to understand and run existing notebooks
Understanding of ETL processes and data flow concepts
Good to have:
Experience with Power BI or Tableau for basic reporting and data visualization
Exposure to Informatica CDI or any other data integration platform
Basic scripting knowledge in Python or PySpark for data processing or automation tasks
Proficiency in writing SQL for querying and analyzing structured data
Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
Experience supporting DevOps deployments or familiarity with Azure DevOps concepts.
What We Look For
Enthusiastic learners with a passion for DataOps practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
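As a rough illustration of executing and monitoring an ADF pipeline programmatically, the sketch below triggers a run and polls it to completion using the azure-mgmt-datafactory and azure-identity packages. The subscription, resource group, factory, pipeline name, and parameter are placeholders, and the polling loop is a simplification of what a real runbook would do.

```python
# Hypothetical sketch: kick off an ADF pipeline run and poll its status. All resource
# names and the load_date parameter are invented for the example.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-dataops"
FACTORY_NAME = "adf-dataops-dev"
PIPELINE_NAME = "pl_ingest_sales"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger the pipeline; parameters would normally come from the orchestration layer.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={"load_date": "2024-06-01"}
)
print("Started run:", run.run_id)

# Poll until ADF reports a terminal state; on failure, escalate per the runbook.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    print("Current status:", status)
    if status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(30)
```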
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.
Your Key Responsibilities
Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads.
Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
Perform ETL operations and ensure timely and accurate data movement between systems (a row-count reconciliation sketch follows this listing).
Write and debug intermediate-level SQL queries for data validation and issue analysis.
Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed.
Support deployment activities using Azure DevOps pipelines.
Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
Participate in incident management and contribute to resolution and knowledge sharing.
Skills And Attributes For Success
Strong understanding of cloud-based data workflows, especially in Azure environments.
Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
Comfortable working with large datasets and navigating both structured and semi-structured data.
Ability to follow runbooks and SOPs, and to collaborate effectively with other technical teams.
Willingness to learn new technologies and adapt in a dynamic environment.
Good communication skills to interact with stakeholders, document findings, and share updates.
Discipline to work independently, manage priorities, and escalate issues responsibly.
To qualify for the role, you must have
2–3 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfortable working in a remote/hybrid and cross-functional team setup
Technologies and Tools
Must haves:
Working knowledge of Azure Data Factory, Data Lake, and Synapse
Exposure to Azure Databricks – ability to understand and run existing notebooks
Understanding of ETL processes and data flow concepts
Good to have:
Experience with Power BI or Tableau for basic reporting and data visualization
Exposure to Informatica CDI or any other data integration platform
Basic scripting knowledge in Python or PySpark for data processing or automation tasks
Proficiency in writing SQL for querying and analyzing structured data
Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
Experience supporting DevOps deployments or familiarity with Azure DevOps concepts.
What We Look For
Enthusiastic learners with a passion for DataOps practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
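To illustrate the ETL-accuracy responsibility in the listing above, here is a hypothetical post-load reconciliation that compares staging and warehouse row counts for a load date via pyodbc. The connection strings, table names, and the load_date column are assumptions for the example only.

```python
# Hypothetical post-load reconciliation: compare row counts between a staging table and
# its warehouse target for one load date. All identifiers are placeholders.
import pyodbc

SOURCE_CONN = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-staging;DATABASE=Staging;Trusted_Connection=yes;"
)
TARGET_CONN = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=EDW;"
    "UID=etl_user;PWD=<secret>;Encrypt=yes;"
)

# Table name is interpolated for the sketch; in production this would be a fixed,
# vetted identifier rather than arbitrary input.
COUNT_SQL = "SELECT COUNT_BIG(*) FROM {table} WHERE load_date = ?"

def row_count(conn_str: str, table: str, load_date: str) -> int:
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(COUNT_SQL.format(table=table), load_date).fetchone()[0]

def reconcile(load_date: str) -> None:
    src = row_count(SOURCE_CONN, "stg.Orders", load_date)
    tgt = row_count(TARGET_CONN, "dw.FactOrders", load_date)
    print(f"{load_date}: staging={src}, warehouse={tgt}")
    if src != tgt:
        # A mismatch is raised so it flows into the incident process described above.
        raise ValueError(f"Row-count mismatch for {load_date}: {src} vs {tgt}")

if __name__ == "__main__":
    reconcile("2024-06-01")
```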
Posted 1 week ago