5.0 - 8.0 years
5 - 8 Lacs
Hyderabad, Telangana, India
On-site
We are looking for an experienced Azure Data Engineer with strong expertise in Azure Databricks to join our data engineering team.
Key Responsibilities:
• Design and build robust data pipelines and ETL/ELT workflows, primarily using Azure Databricks and Azure Data Factory.
• Ingest, clean, transform, and process large datasets from a wide array of sources, encompassing both structured and unstructured data.
• Implement Delta Lake solutions to enhance data reliability and performance, and optimize Spark jobs for efficiency and reliability.
• Integrate Azure Databricks seamlessly with other critical Azure services, including Azure Data Lake Storage, Azure Synapse Analytics, and Azure Event Hubs.
• Collaborate with data scientists, analysts, and other engineering teams to understand data requirements and deliver scalable data solutions.
• Monitor, troubleshoot, and optimize data pipeline performance to ensure data accuracy and timely delivery.
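By way of illustration, here is a minimal sketch of the kind of Delta Lake ingest-and-optimize step this role describes. It assumes a Databricks runtime where `spark` is preconfigured; the mount paths and column names (`event_id`, `event_ts`, `event_type`) are hypothetical.

```python
# Minimal Delta Lake ingest sketch for Azure Databricks.
# Assumes `spark` is provided by the cluster; paths/columns are placeholders.
from pyspark.sql import functions as F

raw = spark.read.format("json").load("/mnt/raw/events/")  # hypothetical landing zone

cleaned = (
    raw.dropDuplicates(["event_id"])
    .filter(F.col("event_type").isNotNull())
    .withColumn("ingested_at", F.current_timestamp())
    .withColumn("event_date", F.to_date(F.col("event_ts")))  # assumes an event_ts column
)

# Delta Lake adds ACID transactions and scalable metadata on top of the lake.
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("/mnt/curated/events_delta")
)

# Compact small files so downstream Spark jobs stay efficient.
spark.sql("OPTIMIZE delta.`/mnt/curated/events_delta`")
```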
Posted 1 month ago
4.0 - 5.0 years
4 - 5 Lacs
Bengaluru, Karnataka, India
On-site
Must have 3+ years of IT experience, with at least 1 year of relevant experience in Snowflake.
• In-depth understanding of data warehousing, ETL concepts, and data modeling principles
• Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stages and file-upload features, Time Travel, and Fail-safe
• Solid knowledge of Snowflake architecture
• Experience in SQL is a must
• Expertise in engineering platform components such as data pipelines, data orchestration, data quality, and data governance analytics
• Hands-on experience implementing large-scale data intelligence solutions around a Snowflake data warehouse
• Experience in a scripting language such as Python or Scala is a must
• Good experience with streaming services such as Kafka
• Experience working with semi-structured data
Required Skills: Snowflake, Snowflake SQL, Snowpipe, SQL
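For illustration, a minimal sketch of querying Snowflake from Python, including a Time Travel read of the kind this posting mentions. The connection parameters, table name, and warehouse are placeholders.

```python
# Sketch: Python + Snowflake connector, with a Time Travel query.
# All connection values below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="...",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Time Travel: read the table as it looked one hour (3600 s) ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print(cur.fetchone())
finally:
    conn.close()
```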
Posted 1 month ago
4.0 - 5.0 years
4 - 5 Lacs
Hyderabad, Telangana, India
On-site
Must have 3+ years of IT experience, with at least 1 year of relevant experience in Snowflake.
• In-depth understanding of data warehousing, ETL concepts, and data modeling principles
• Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stages and file-upload features, Time Travel, and Fail-safe
• Solid knowledge of Snowflake architecture
• Experience in SQL is a must
• Expertise in engineering platform components such as data pipelines, data orchestration, data quality, and data governance analytics
• Hands-on experience implementing large-scale data intelligence solutions around a Snowflake data warehouse
• Experience in a scripting language such as Python or Scala is a must
• Good experience with streaming services such as Kafka
• Experience working with semi-structured data
Required Skills: Snowflake, Snowflake SQL, Snowpipe, SQL
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are seeking an experienced Azure Data Engineer with 3-6 years of experience for a 6-month remote contract. The candidate will be responsible for developing and supporting IT solutions using technologies such as Azure Data Factory, Azure Databricks, Azure Synapse, Python, PySpark, Teradata, and Snowflake. The role involves designing ETL pipelines, developing Databricks notebooks, handling CI/CD pipelines via Azure DevOps, and working on data warehouse modeling and integration. Strong skills in SQL, data lake storage, and deployment/monitoring are required. Prior experience with Power BI and DP-203 certification are a plus. Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 1 month ago
7.0 - 10.0 years
1 - 3 Lacs
Pune
Work from Office
Job Description:
• Migration Projects: Lead the migration of legacy applications from ASP.NET 4.7 to ASP.NET Core, .NET Framework 4.7 to .NET Core 8, and EF 6 to EF Core. Good to have: VB6.0, Crystal Reports, Classic ASP, Git, and Jira.
• Full Stack Development: Design, develop, and maintain both front-end and back-end components using .NET technologies.
• Blazor Development: Utilize Blazor for building interactive web UIs.
• Technical Leadership: Mentor and guide a team of developers, ensuring best practices and high standards of code quality.
• System Architecture: Collaborate with architects and other stakeholders to design scalable and robust system architectures.
• Code Reviews: Conduct code reviews to ensure adherence to coding standards and best practices.
• Project Management: Manage project timelines and deliverables, and ensure successful project completion.
• Documentation: Create and maintain comprehensive documentation for all development and migration activities.
Preferred Qualifications:
• Agile Methodologies: Familiarity with Agile development practices.
• Cloud Experience: Experience with cloud platforms such as Azure or AWS.
• Database Management: Proficiency in SQL Server or other relational databases.
Posted 1 month ago
5.0 - 10.0 years
2 - 12 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities:
• Collaborate with multiple stakeholders to deeply understand the needs of data practitioners and deliver at scale
• Lead data engineers to define, build, and maintain the data platform
• Build a data lake in Azure Fabric, processing data from multiple sources
• Migrate the existing data store from Azure Synapse to Azure Fabric
• Implement data governance and access control
• Drive development efforts end-to-end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision, and comply with all applicable standards
• Present technical solutions, capabilities, considerations, and features in business terms
• Effectively communicate status, issues, and risks in a precise and timely manner
• Further develop critical initiatives such as data discovery, data lineage, and data quality
• Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
• Build data systems, pipelines, analytical tools, and programs
• Conduct complex data analysis and report on results
Qualifications: 5+ years of experience as a data engineer or in a similar role with Azure Synapse and ADF, or relevant experience in Azure Fabric. Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.
Posted 1 month ago
2.0 - 3.0 years
8 - 10 Lacs
Bengaluru
Hybrid
Role & Responsibilities:
• Design, develop, and maintain Tableau dashboards and reports
• Collaborate with business stakeholders to gather and understand requirements, and translate them into effective visualizations that provide actionable insights
• Create wireframes and beta dashboards with a focus on user experience, correctness, and visibility
• Optimize Tableau dashboards for performance and usability
• Develop and maintain documentation related to Tableau solutions
Preferred Candidate Profile - Skills & Requirements (Must Have):
• 2-3 years of experience developing, publishing, maintaining, and managing Tableau dashboards
• Working knowledge of Tableau administration/architecture
• Experience creating wireframes and beta dashboards with a focus on user experience, correctness, and visibility
• Strong proficiency with SQL and data modelling for analysis and for building end-to-end data pipelines
• Ability to write complex queries and a solid understanding of database concepts
• Ability to be effective in virtual as well as in-person setups
• Strong at turning data discoveries into analytical insights that drive business outcomes
• Strong verbal and written communication skills
Nice to Have:
• Experience working with clickstream data and web analytics tools such as Adobe Omniture or Google Analytics
• Experience with programming languages like Python and Unix shell for data pipeline automation and analysis
Education: Bachelor's degree with at least 2 years of relevant experience in a Business Intelligence team
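As an illustration of the publish-and-maintain side of this role, here is a minimal sketch using the tableauserverclient library to publish a workbook. The server URL, token, site, project id, and file name are all placeholders.

```python
# Sketch: publishing a Tableau workbook with tableauserverclient (TSC).
# All credentials and identifiers below are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="my-site")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    workbook = TSC.WorkbookItem(project_id="abc123")  # hypothetical project
    published = server.workbooks.publish(
        workbook, "sales_dashboard.twbx", mode=TSC.Server.PublishMode.Overwrite
    )
    print(f"Published workbook id: {published.id}")
```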
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing, developing internal tools to enhance platform usability, implementing monitoring and observability, collaborating with software engineering teams for seamless integration, and driving capacity planning and cost optimization initiatives.
Posted 1 month ago
7.0 - 15.0 years
18 - 30 Lacs
Bengaluru, Karnataka, India
On-site
Java Backend (MSB) Engineer with Azure and Kafka - Pan India. Contract-to-hire with Cognizant, hybrid mode (3 days in office). A minimum of 6.6 years of experience is required.
• Design, develop, and maintain robust backend systems using Java
• Implement microservices (MSB) architecture principles
• Build scalable and reliable data pipelines with Kafka
• Deploy and manage applications on the Azure cloud platform
• Collaborate with cross-functional teams to deliver high-quality software
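The role itself is Java-centric, but for brevity the Kafka produce-and-flush flow it describes can be sketched in Python with the confluent-kafka client. Broker address, topic, and payload are placeholders.

```python
# Sketch of the Kafka produce flow (Python stand-in for the Java role).
# Broker and topic names are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})

def delivery_report(err, msg):
    # Invoked once per message to confirm delivery or surface errors.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

event = {"order_id": 42, "status": "CREATED"}  # hypothetical payload
producer.produce("orders", json.dumps(event).encode("utf-8"), callback=delivery_report)
producer.flush()  # block until all queued messages are delivered
```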
Posted 1 month ago
5.0 - 10.0 years
15 - 27 Lacs
Bengaluru
Work from Office
Develop digital reservoir modeling tools and Petrel plugins. Integrate geological and geophysical data, apply ML and data engineering, and support forecasting through advanced cloud-based workflows.
Required Candidate Profile: Earth scientist with experience in Petrel, Python, and Ocean plugin development. Strong background in reservoir modeling, digital workflows, and cloud-based tools (Azure, Power BI).
Posted 1 month ago
10.0 - 15.0 years
40 - 65 Lacs
Bengaluru
Work from Office
Design and lead scalable data architectures, cloud solutions, and analytics platforms using Azure. Drive data governance, pipeline optimization, and team leadership to enable business-aligned data strategies in the Oil & Gas sector.
Required Candidate Profile: Experienced data architect or leader with 10-15+ years in Azure, big data, and solution design. Strong in stakeholder management, data governance, and Oil & Gas analytics.
Posted 1 month ago
10.0 - 12.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Job Description:
• Design, develop, and maintain Tableau dashboards and reports
• Collaborate with business stakeholders to gather and understand requirements, and translate them into effective visualizations that provide actionable insights
• Create wireframes and beta dashboards with a focus on user experience, correctness, and visibility
• Optimize Tableau dashboards for performance and usability
• Develop and maintain documentation related to Tableau solutions
Requirements:
• 2-3 years of experience developing, publishing, maintaining, and managing Tableau dashboards
• Working knowledge of Tableau administration/architecture
• Experience creating wireframes and beta dashboards with a focus on user experience, correctness, and visibility
• Strong proficiency with SQL and data modelling for analysis and for building end-to-end data pipelines
• Ability to write complex queries and a solid understanding of database concepts
• Ability to be effective in virtual as well as in-person setups
• Strong at turning data discoveries into analytical insights that drive business outcomes
• Strong verbal and written communication skills
Nice to Have:
• Experience working with clickstream data and web analytics tools such as Adobe Omniture or Google Analytics
• Experience with programming languages like Python and Unix shell for data pipeline automation and analysis
Education: Bachelor's degree with at least 2 years of relevant experience in a Business Intelligence team.
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.
Inviting applications for the role of Principal Consultant - Lead MLOps Engineer! In this role, you will define, implement, and oversee the MLOps strategy for scalable, compliant, and cost-efficient deployment of AI/GenAI models across the enterprise. This role combines deep DevOps knowledge, infrastructure architecture, and AI platform design to guide how teams build and ship ML models securely and reliably. You will establish governance, reuse, and automation frameworks for AI infrastructure, including Terraform-first cloud automation, multi-environment CI/CD, and observability pipelines.
Responsibilities:
• Architect secure, reusable, modular IaC frameworks across clouds and regions for MLOps
• Lead the development of CI/CD pipelines and standardize deployment frameworks
• Design observability and monitoring systems for ML/GenAI workloads
• Collaborate with platform, data science, compliance, and enterprise architecture teams to ensure scalable ML operations
• Define enterprise-wide MLOps architecture and standards (build → deploy → monitor)
• Lead the design of a GenAI/LLMOps platform (Bedrock/OpenAI/Hugging Face + RAG stack)
• Integrate governance controls (approvals, drift detection, rollback strategies)
• Define model metadata standards, monitoring SLAs, and re-training workflows
• Influence tooling, hiring, and roadmap decisions for AI/ML delivery
• Engage in the design, development, and maintenance of data pipelines for various AI use cases
• Actively contribute to key deliverables as part of an agile development team
Qualifications we seek in you!
Minimum Qualifications:
• Several years of experience in DevOps or MLOps roles
• Degree/qualification in Computer Science or a related field, or equivalent work experience
• Strong Python programming skills
• Hands-on experience with containerised deployment
• Proficient with AWS (SageMaker, Lambda, ECR), Terraform, and Python
• Demonstrated experience deploying multiple GenAI systems into production
• Hands-on experience deploying 3-4 ML/GenAI models in AWS
• Deep understanding of the ML model lifecycle: train → test → deploy → monitor → retrain
• Experience in developing, testing, and deploying data pipelines using a public cloud
• Clear and effective communication skills to interact with team members, stakeholders, and end users
• Knowledge of governance and compliance policies, standards, and procedures
• Exposure to RAG/LLM workloads and model deployment infrastructure
Preferred Qualifications/Skills:
• Experience designing model governance frameworks and CI/CD pipelines
• Advanced understanding of platform security, cost optimization, and ML observability
Why join Genpact?
• Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation
• Make an impact - drive change for global enterprises and solve business challenges that matter
• Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
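To make the deploy-and-serve end of the ML lifecycle above concrete, here is a minimal sketch of invoking a deployed SageMaker endpoint from Python with boto3. The endpoint name, region, and payload shape are hypothetical.

```python
# Sketch: calling a deployed SageMaker endpoint (the "deploy -> monitor"
# end of the lifecycle). Endpoint name and payload are placeholders.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

payload = {"inputs": "What is the refund policy?"}  # hypothetical request
response = runtime.invoke_endpoint(
    EndpointName="genai-rag-endpoint",  # hypothetical endpoint
    ContentType="application/json",
    Body=json.dumps(payload),
)
print(json.loads(response["Body"].read()))
```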
Posted 1 month ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About VOIS: VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.
About VOIS India: In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.
Job Description - Role Purpose:
• Create detailed data architecture documentation, including data models, data flow diagrams, and technical specifications
• Create and maintain data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis
• Design and implement data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems
• Collaborate with business stakeholders to define the overall data strategy, aligning data needs with business requirements
• Support migration of new and changed software; elaborate and perform production checks
• Effectively communicate complex data concepts to both technical and non-technical stakeholders
• GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions
• Strong communicator, experienced in leading and negotiating decisions and effective outcomes
• Strong overarching data architecture knowledge and experience, with the ability to govern the application of architecture principles within projects
VOIS Equal Opportunity Employer Commitment, India: VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024.
These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you become part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
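As a small illustration of the GCP stack this role names (BigQuery, Pub/Sub, Cloud Composer), here is a minimal BigQuery read with the google-cloud-bigquery client. The project, dataset, and table names are placeholders, and default application credentials are assumed.

```python
# Sketch: a BigQuery aggregation query from Python.
# Project/dataset/table names are placeholders; assumes default credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.sales.transactions`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```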
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Senior Data Engineer (Remote, Contract - 6 Months): Databricks, ADF, and PySpark.
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.
Key Responsibilities:
• Build scalable ETL pipelines and implement robust data solutions in Azure
• Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults
• Design and maintain a secure and efficient data lake architecture
• Work with stakeholders to gather data requirements and translate them into technical specs
• Implement CI/CD pipelines for seamless data deployment using Azure DevOps
• Monitor data quality, performance bottlenecks, and scalability issues
• Write clean, organized, reusable PySpark code in an Agile environment
• Document pipelines, architectures, and best practices for reuse
Must-Have Skills:
• Experience: 6+ years in Data Engineering
• Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
• Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
• Agile, SDLC, Containerization (Docker), clean coding practices
Good-to-Have Skills:
• Event Hubs, Logic Apps
• Power BI
• Strong logic building and competitive programming background
Contract Details - Role: Senior Data Engineer | Mode: Remote | Duration: 6 Months | Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune
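One recurring step in this stack is pulling a credential from Azure Key Vault before wiring it into an ADF or Databricks pipeline. A minimal sketch follows; the vault URL and secret name are placeholders, and it assumes the azure-identity and azure-keyvault-secrets packages are installed.

```python
# Sketch: fetching a pipeline credential from Azure Key Vault.
# Vault URL and secret name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()  # picks up env/managed identity
client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",  # placeholder
    credential=credential,
)

secret = client.get_secret("adls-connection-string")  # hypothetical secret
print(f"Retrieved secret '{secret.name}' (value intentionally not printed)")
```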
Posted 1 month ago
12.0 - 15.0 years
2 - 13 Lacs
Pune, Maharashtra, India
On-site
Manage, mentor, and guide a team of data engineers and analysts, ensuring their professional development and optimizing team performance. Foster a culture of collaboration, accountability, and continuous learning within the team. Lead performance reviews, provide career guidance, and handle resource planning.
Data Engineering & Analytics: Design and implement data pipelines, data models, and architectures that are robust, scalable, and efficient. Develop and enforce data quality frameworks to ensure accuracy, consistency, and reliability of data assets. Establish and maintain data lineage processes to track the flow and transformation of data across systems. Ensure the design and maintenance of robust data warehousing solutions to support analytics and reporting needs.
Collaboration and Stakeholder Management: Collaborate with stakeholders, including functional owners, analysts, and business leaders, to understand business needs and translate them into technical requirements. Work closely with these stakeholders to ensure the data infrastructure supports organizational goals and provides reliable data for business decisions. Build and foster relationships with major stakeholders to ensure management perspectives on data strategy align with business objectives.
Project Management: Drive end-to-end delivery of analytics projects, ensuring quality and timeliness. Manage project roadmaps, prioritize tasks, and allocate resources effectively. Manage project timelines and mitigate risks to ensure timely delivery of high-quality data engineering projects.
Technology and Infrastructure: Evaluate and implement new tools, technologies, and best practices to improve the efficiency of data engineering processes. Oversee the design, development, and maintenance of data pipelines, ensuring that data is collected, cleaned, and stored efficiently. Ensure there are no data pipeline leaks and monitor production pipelines to maintain their integrity. Familiarity with reporting tools such as Superset and Tableau is beneficial for creating intuitive data visualizations and reports.
Machine Learning and GenAI Integration: Knowledge of machine learning concepts and their integration with data pipelines is a plus, including how machine learning models can be used to enhance data quality, predict data trends, and automate decision-making processes. Familiarity with Generative AI (GenAI) concepts is advantageous, particularly in enabling GenAI features on new datasets; leveraging GenAI with data pipelines to automate tasks, streamline workflows, and uncover deeper insights is beneficial.
What You'll Bring:
• 12+ years of experience in data engineering, with at least 3 years in a managerial role.
• Technical Expertise: Strong knowledge of data engineering concepts, including data warehousing, ETL processes, and data pipeline design. Proficiency in Azure Synapse or Data Factory, SQL, Python, and other data engineering tools.
• Data Modeling: Expertise in data modeling is essential, with the ability to design and implement robust, scalable data models that support complex analytics and reporting needs. Experience with data modeling frameworks and tools is highly valued.
• Leadership Skills: Proven ability to lead and motivate a team of engineers while managing cross-functional collaborations.
• Problem-Solving: Strong analytical and troubleshooting skills to address complex data-related challenges.
• Communication: Excellent verbal and written communication skills to effectively interact with technical and non-technical stakeholders, including the ability to motivate team members, provide regular constructive feedback, and facilitate open communication channels to ensure team alignment and success.
• Data Architecture: Experience designing scalable, high-performance data systems and understanding of cloud platforms such as Azure and Databricks.
• Machine Learning and GenAI: Knowledge of machine learning concepts and integration with data pipelines, as well as familiarity with GenAI, is a plus.
• Data Governance: Experience with data governance best practices is desirable.
• Open Mindset: An open mindset with a willingness to learn new technologies, processes, and methodologies is essential, along with the ability to adapt quickly to the evolving data engineering landscape.
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.
We are looking forward to hiring Data Engineering professionals in the following areas:
Experience: 10-12 years. Location: Pune.
Job Summary: We are seeking a detail-oriented and technically proficient Technical Project Manager (TPM) with a strong background in data engineering, analytics, or data science. The TPM will be responsible for leading cross-functional teams to deliver data-centric projects on time, within scope, and within budget. This role bridges the gap between business needs and technical execution, ensuring alignment across stakeholders.
Key Responsibilities:
• Lead end-to-end project management for data and engineering initiatives, including planning, execution, and delivery.
• Lead the planning, execution, and delivery of data-related projects (e.g., data platform migrations, analytics implementations, ML model deployments).
• Collaborate with data engineers, analysts, and business stakeholders to define project scope, goals, and deliverables.
• Develop detailed project plans, timelines, and resource allocations.
• Manage project risks, issues, and changes to ensure successful delivery.
• Ensure data quality, governance, and compliance standards are met.
• Facilitate communication across technical and non-technical teams.
• Track project performance using appropriate tools and techniques.
• Conduct post-project evaluations and implement lessons learned.
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
• 5+ years of experience in project management, with at least 2 years managing data-focused projects.
• Strong understanding of data pipelines, ETL processes, cloud platforms (e.g., AWS, Azure), and data governance.
• Proficiency with project management tools (e.g., Jira, MS Project).
• Excellent communication, leadership, and stakeholder management skills.
• Familiarity with BI tools (e.g., Power BI, Tableau).
• PMP or Agile/Scrum certification is a plus.
Required Technical/Functional Competencies:
• Change Management: Specialized in overcoming resistance to change and helping organizations achieve their Agile goals. Able to guide teams in driving change management projects or requirements.
• Customer Management: Specialized knowledge of the customer's business domain and technology suite. Use the latest technology, communicate effectively, demonstrate leadership, present technical offerings, and proactively suggest solutions.
• Delivery Management: Specialized knowledge of deal modeling and commercial and pricing models. Create an integrated pricing model across service lines. Guide team members in applying pricing techniques. Grow the account, forecast revenues, and analyze complex internal reports. Manage at least one complex account (>10M) or multiple small accounts independently.
• Domain/Industry Knowledge: Specialized knowledge of the customer's business processes and relevant technology platform or product. Able to forecast business requirements and market trends, manage project issues, and validate the customer's strategy roadmap.
• Product/Technology Knowledge: In-depth knowledge of the platform/product and associated technologies. Review various product-specific solutions for a specific project/client/organization and conduct product demos, walkthroughs, and presentations to prospects if required.
• Profitability Management: Demonstrate competence in applying profitability and cost management techniques. Can develop project budgets, monitor actual costs against the budget, and identify potential cost overruns or deviations. Use established processes and tools to track and control project expenses.
• Project Management: Extensive experience in managing projects; can handle complex projects with minimal supervision. Deep understanding of project management concepts and methodologies and the ability to apply them effectively to achieve project goals.
• Scheduling and Resource Planning: Prepare independent global delivery models covering skill levels, skill mix, and onsite/offshore work allocation. Create an accurate resource plan for people, space, and infrastructure for the given requirements. Forecast people and skill requirements to align with plans. Optimize the schedule for complex projects.
• Service Support and Maintenance: Plan and execute transitions for large/complex activities. Define standards in transition management based on industry trends and contribute to building tools and accelerators for the KT process. Optimize resource utilization based on demand from customers. Select and define SLAs, track service levels, and analyze the impact of SLAs on complex processes and deliverables.
• Risk Management: Good understanding of risk management principles and techniques. Identify, assess, and document risks independently, as well as prioritize risks based on their potential impact. Assist in developing risk mitigation plans and monitoring risk responses.
Required Behavioral Competencies:
• Accountability: Being a role model for taking initiative and ensuring others take initiative, removing obstacles for others, taking ownership of results and deadlines for self and others, and acting as a role model for being responsible.
• Agility: Works with a diverse set of situations, people, and groups, and adapts and motivates self and team to thrive in a changing environment.
• Collaboration: Reaches out to others in the team to ensure connections are made and team members are working together. Looks for ways to integrate work with other teams, identifying similarities and opportunities and making necessary changes in work to ensure successful integration.
• Customer Focus: Engages in executive customer discovery to predict future needs of customers, drives customer relationships with a long-term focus, and takes actions to enhance customer loyalty.
• Communication: Communicates and presents complex ideas, information, and data to multiple, broad, and demanding stakeholders internal and/or external to the organization. Helps others communicate better with their audience. Demonstrates honest, direct, and transparent communication and facilitates conversations within the team and its close collaborators.
• Drives Results: Proactively seeks challenging and differentiated opportunities and drives and motivates team members to take on more responsibility.
• Resolves Conflict: Balances the business interests of all stakeholders and manages any conflicts by offering mutually beneficial options.
Certifications: PMP (Project Management Professional), PRINCE2 (Projects in Controlled Environments).
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 1 month ago
6.0 - 8.0 years
6 - 8 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
6+ years of total IT experience in development projects, including 4+ years of experience in cloud-based solutions and 4+ years of solid hands-on Snowflake development.
• Hands-on experience designing and building data pipelines on cloud-based infrastructure, having worked extensively with AWS and Snowflake, including end-to-end builds covering ingestion, transformation, and extract generation in Snowflake
• Strong hands-on experience writing complex SQL queries
• Good understanding of and experience with Azure cloud services
• Optimize and tune Snowflake performance, including query optimization, with experience in scaling strategies
• Address data issues, root cause analysis, and production support
• Experience working in the financial industry
• Understanding of Agile methodologies
• Certifications in Snowflake and Azure are an added advantage
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.
Inviting applications for the role of Principal Consultant - Lead AWS Cloud Engineer! In this role, you will own the vision, architecture, and governance of cloud infrastructure supporting scalable, secure, and high-performing AI/GenAI platforms across the enterprise. Your mandate includes building resilient, compliant, and cost-efficient cloud ecosystems, primarily on AWS, but with a strong foundation for multi-cloud operability.
Responsibilities:
• Define and maintain cloud infrastructure architecture across AWS accounts, environments, and regions
• Architect multi-tenant, secure VPC and networking models, supporting cross-account and hybrid integrations
• Standardize the Infrastructure-as-Code (Terraform) strategy for AI/ML/GenAI workloads across teams
• Govern security frameworks, including encryption, IAM boundary enforcement, secrets management, and logging
• Oversee cloud automation in CI/CD pipelines and support deployment of GenAI workloads (LLM APIs, vector DBs)
• Design, review, and implement disaster recovery, backup, and high-availability strategies
• Optimize cloud cost and performance with tagging, resource planning, and usage analytics
• Define and support multi-cloud readiness, including network peering, SSO/SAML, and logging across clouds
• Collaborate with MLOps, Compliance, InfoSec, and Architecture teams to align infrastructure with enterprise goals
• Engage in the design, development, and maintenance of data pipelines for various AI use cases
• Actively contribute to key deliverables as part of an agile development team
• Collaborate with others to source, analyse, test, and deploy data processes
Qualifications we seek in you!
Minimum Qualifications:
• Hands-on AWS infrastructure experience in production environments
• Experience developing, testing, and deploying data pipelines
• Clear and effective communication skills to interact with team members, stakeholders, and end users
• Degree/qualification in Computer Science or a related field, or equivalent work experience
• Knowledge of governance and compliance policies, standards, and procedures
• Proven ability to manage enterprise-wide IaC, the AWS CLI, Python or Bash scripting, and versioning strategy
• Expert in IAM, S3, DevOps, VPC, ECS/EKS, Lambda, and serverless computing
• Experience supporting AI/ML or GenAI pipelines in AWS (especially for compute and networking)
• Exposure to multi-cloud architecture basics (e.g., SSO, networking, blob exchange, shared VPC setups)
• Deep understanding of CI/CD automation, AI workload optimization, and infrastructure governance
• Hands-on experience designing or managing infrastructure in at least one other cloud (Azure or GCP)
• Hands-on experience with multiple AI/ML/RAG/LLM workloads and model deployment infrastructure
• AWS Certified Solutions Architect - Professional or Advanced Networking Specialty
Preferred Qualifications/Skills:
• Experience deploying infrastructure in both AWS and another major cloud provider (Azure or GCP)
• Experience designing or migrating enterprise workloads to multi-cloud or hybrid setups
• Experience with cross-cloud monitoring, networking (VPNs, Transit Gateways), and DR policies
• Familiarity with multi-cloud tools (e.g., HashiCorp Vault, Kubernetes with cross-cloud clusters)
• Strong understanding of DevSecOps best practices and compliance requirements
• In-depth exposure to regulated industries (BFSI, healthcare) requiring auditability and compliance
Why join Genpact?
• Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation
• Make an impact - drive change for global enterprises and solve business challenges that matter
• Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
0.0 - 5.0 years
0 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are seeking a skilled Big Data Engineer to design, build, and maintain scalable big data processing pipelines. The ideal candidate will be proficient in Python and PySpark, have hands-on experience with cloud platforms, and be familiar with CI/CD processes to automate deployments and workflows.
Responsibilities:
• Develop and maintain data pipelines using Python and PySpark
• Design and implement scalable big data solutions on cloud platforms (AWS, Azure, or GCP)
• Build and manage CI/CD pipelines for automated deployment and testing
• Collaborate with data scientists, analysts, and other engineers to deliver robust data infrastructure
• Optimize data processing workflows for performance and reliability
Key Skills:
• Python
• PySpark
• Cloud platforms: AWS / Azure / GCP (any one or more)
• CI/CD pipelines and tools
Posted 1 month ago
5.0 - 7.0 years
5 - 7 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are seeking an experienced AWS Data Engineer to support the on-ground client team in delivering high-quality data engineering solutions. The ideal candidate will have hands-on experience with AWS data services and strong programming skills.
Key Responsibilities:
• Collaborate with the client's onsite team to ensure timely completion of deliverables
• Design, develop, and maintain data pipelines using AWS services such as Lambda, Glue, and Redshift
• Write efficient, reusable, and optimized code in Python and SQL for data processing and transformation
• Monitor, troubleshoot, and optimize data workflows for performance and reliability
• Implement best practices in data engineering and cloud infrastructure management
• Participate in code reviews, testing, and documentation activities
Required Skills:
• Proven experience with AWS data engineering services: Lambda, Glue, Redshift
• Strong programming skills in Python and SQL
• Experience working in collaboration with onsite client teams to meet project goals
• Good problem-solving and communication skills
• Knowledge of data pipeline design, ETL/ELT processes, and cloud data management
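As a small illustration of the orchestration side of this role, here is a minimal sketch that starts an AWS Glue job with boto3 and polls until it finishes. The job name and region are hypothetical, and error handling is kept deliberately minimal.

```python
# Sketch: triggering and polling an AWS Glue job run with boto3.
# Job name and region are placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="nightly-orders-etl")  # hypothetical job
run_id = run["JobRunId"]

while True:
    status = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    print(f"Glue run {run_id}: {state}")
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # poll every 30 seconds
```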
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Opening: Senior Data Engineer (Remote, Contract - 6 Months) | Experience: 6-8 Years
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.
Key Responsibilities:
• Build scalable ETL pipelines and implement robust data solutions in Azure
• Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults
• Design and maintain a secure and efficient data lake architecture
• Work with stakeholders to gather data requirements and translate them into technical specs
• Implement CI/CD pipelines for seamless data deployment using Azure DevOps
• Monitor data quality, performance bottlenecks, and scalability issues
• Write clean, organized, reusable PySpark code in an Agile environment
• Document pipelines, architectures, and best practices for reuse
Must-Have Skills:
• Experience: 6+ years in Data Engineering
• Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
• Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
• Agile, SDLC, Containerization (Docker), clean coding practices
Good-to-Have Skills:
• Event Hubs, Logic Apps
• Power BI
• Strong logic building and competitive programming background
Mode: Remote | Duration: 6 Months | Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 month ago
3.0 - 8.0 years
3 - 7 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Key Responsibilities:
• Develop, design, and build scalable and modular software components for quantitative analysis, financial modeling, and investment analytics for risk models, as well as high-volume, complex data processing pipelines
• Ensure the quality and performance of the research data pipeline and curate new premium content using different approaches
• Conduct research on additional content/data that can add value to models and/or investment processes
• Perform and implement complex quantitative calculations with high accuracy and performance
• Collaborate with modelers and content experts to develop content expertise and implement optimized, performant solutions
• Apply statistical methods to real-world financial and climate data to derive business insights
• Optimize algorithms for time-series data analysis and financial computations
Skills & Qualifications:
• Bachelor's or Master's level education in Computer Science, Engineering, or a related discipline
• Minimum 3+ years of experience in Python-based, full-scale production software development and design
• Formidable analytical, problem-solving, and production troubleshooting skills
• Understanding of climate/ESG vendors, climate datasets, and standards
• A passion for providing fundamental software solutions for highly available, performant full-stack applications, with a "student of technology" attitude
• Passion for working in a team environment, multitasking, and effective communication skills
• Knowledge of software development methodologies (analysis, design, development, testing) and a basic understanding of Agile/Scrum methodology and practices
• Ability and willingness to learn fast, multitask, self-motivate, and pick up new things easily
• Ability to work independently and efficiently in a fast-paced, team-oriented environment
Good to Have:
• Understanding of Agile work environments, including knowledge of Git and CI/CD
• Knowledge of the investment process and climate risk, particularly transition risk and decarbonization analytics
• Exposure to curating unstructured data using NLP/GenAI/LLMs
• CFA/FRM preferred
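To make the time-series side of this role concrete, here is a minimal pandas sketch computing daily returns and a rolling 21-day annualized volatility, a standard building block in financial analytics. The CSV path and column names are hypothetical.

```python
# Sketch: daily returns and rolling annualized volatility with pandas.
# File path and column names are placeholders.
import numpy as np
import pandas as pd

prices = pd.read_csv("prices.csv", parse_dates=["date"], index_col="date")

returns = prices["close"].pct_change().dropna()          # simple daily returns
vol_21d = returns.rolling(window=21).std() * np.sqrt(252)  # annualize with ~252 trading days

summary = pd.DataFrame({"daily_return": returns, "vol_21d_ann": vol_21d})
print(summary.tail())
```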
Posted 1 month ago
4.0 - 7.0 years
4 - 7 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Roles & Responsibilities:
• Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data management tools
• Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation, with the goal of reducing data proliferation
• Break down features into work that aligns with the architectural direction runway
• Participate hands-on in pilots and proofs-of-concept for new patterns
• Create robust documentation from data analysis and profiling, and from proposed designs and data logic
• Develop advanced SQL queries to profile and unify data
• Develop data processing code in SQL, along with semantic views, to prepare data for reporting
• Develop Power BI models and reporting packages
• Design robust data models and processing layers that support both analytical processing and operational reporting needs
• Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
• Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
• Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
• Collaborate with partners to define data requirements, functional specifications, and project goals
• Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions
What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.
Basic Qualifications: Master's degree with 1 to 3 years of experience in Data Engineering OR Bachelor's degree with 1 to 3 years of experience in Data Engineering.
Must-Have Skills:
• Minimum of 1 year of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
• Minimum of 1 year of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and builds, and enterprise-level data management
• Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
• Experience using cloud platforms (AWS), data lakes, and data warehouses
• Working knowledge of ETL processes, data pipelines, and integration technologies
• Good communication and collaboration skills to work with cross-functional teams and senior leadership
• Ability to assess business needs and design solutions that align with organizational goals
• Exceptional hands-on capabilities with data profiling and data analysis
Good-to-Have Skills:
• Experience with human data, ideally human healthcare data
• Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management
Professional Certifications:
• ITIL Foundation or other relevant certifications (preferred)
• SAFe Agile Practitioner (6.0)
• Microsoft Certified Data Analyst Associate (Power BI) or related certification
• Databricks Certified Professional or similar certification
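To illustrate the CDC pipelines this posting calls out, here is a minimal sketch applying a change batch to a Delta table with MERGE on Databricks. It assumes a runtime where `spark` is preconfigured; the paths, the `customer_id` key, and the `op` change-type column are hypothetical.

```python
# Sketch: applying a CDC batch to a Delta table via MERGE (Databricks).
# Paths and columns are placeholders; `spark` comes from the runtime.
from delta.tables import DeltaTable

target = DeltaTable.forPath(spark, "/mnt/curated/customers")
changes = spark.read.format("parquet").load("/mnt/raw/customers_cdc/")

(
    target.alias("t")
    .merge(changes.alias("c"), "t.customer_id = c.customer_id")
    .whenMatchedDelete(condition="c.op = 'DELETE'")   # deletes checked first
    .whenMatchedUpdateAll(condition="c.op = 'UPDATE'")
    .whenNotMatchedInsertAll()                         # new keys become inserts
    .execute()
)
```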
Posted 1 month ago
1.0 - 3.0 years
1 - 3 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Roles & Responsibilities:
• Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data management tools
• Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation, with the goal of reducing data proliferation
• Break down features into work that aligns with the architectural direction runway
• Participate hands-on in pilots and proofs-of-concept for new patterns
• Create robust documentation from data analysis and profiling, and from proposed designs and data logic
• Develop advanced SQL queries to profile and unify data
• Develop data processing code in SQL, along with semantic views, to prepare data for reporting
• Develop Power BI models and reporting packages
• Design robust data models and processing layers that support both analytical processing and operational reporting needs
• Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
• Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
• Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
• Collaborate with partners to define data requirements, functional specifications, and project goals
• Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions
What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.
Basic Qualifications: Master's degree with 1 to 3 years of experience in Data Engineering OR Bachelor's degree with 1 to 3 years of experience in Data Engineering.
Must-Have Skills:
• Minimum of 1 year of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
• Minimum of 1 year of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and builds, and enterprise-level data management
• Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
• Experience using cloud platforms (AWS), data lakes, and data warehouses
• Working knowledge of ETL processes, data pipelines, and integration technologies
• Good communication and collaboration skills to work with cross-functional teams and senior leadership
• Ability to assess business needs and design solutions that align with organizational goals
• Exceptional hands-on capabilities with data profiling and data analysis
Posted 1 month ago