7.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Senior Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analysts, data engineering teams, application development, end users, and management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming (a minimal sketch follows this posting)
- Architect and design data infrastructure on cloud using Infrastructure-as-Code tools
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable, data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Lead a team of engineers to deliver impactful results at scale
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using DBT and Power BI is a plus
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and key vault
- Strong programming/scripting experience using SQL, Python, and Spark
- Strong grasp of data modeling and data lakehouse concepts
- Knowledge of software configuration management environments and tools such as Jira, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
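To make the pipeline work above concrete, here is a minimal PySpark Structured Streaming sketch of the batch-plus-streaming pattern the role describes. It is an illustration only, not part of the posting: the Kafka broker, topic name, schema, and ADLS paths are hypothetical placeholders.

```python
# A minimal sketch of a resilient streaming ingest on Databricks.
# Broker, topic, schema, and storage paths below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read the stream from Kafka (requires the spark-sql-kafka package on the cluster).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Parse the JSON payload into typed columns.
parsed = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write to a Delta table; the checkpoint makes the stream restart-safe.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "abfss://lake@account.dfs.core.windows.net/_chk/events")
    .outputMode("append")
    .start("abfss://lake@account.dfs.core.windows.net/bronze/events")
)
```

The checkpoint location is what makes the stream restart-safe, which is the core of "resilient" in the requirement above; the same parsed DataFrame logic can be reused for batch backfills by swapping readStream for read.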
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analysts, data engineering teams, application development, end users, and management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable, data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using DBT and Power BI is a plus
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and key vault
- Strong programming/scripting experience using SQL, Python, and Spark
- Knowledge of software configuration management environments and tools such as Jira, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Senior Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analysts, data engineering teams, application development, end users, and management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Architect and design data infrastructure on cloud using Infrastructure-as-Code tools
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable, data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Lead a team of engineers to deliver impactful results at scale
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using DBT and Power BI is a plus
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and key vault
- Strong programming/scripting experience using SQL, Python, and Spark
- Strong grasp of data modeling and data lakehouse concepts
- Knowledge of software configuration management environments and tools such as Jira, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

About Warner Bros. Discovery
Warner Bros. Discovery, a premier global media and entertainment company, offers audiences the world's most differentiated and complete portfolio of content, brands and franchises across television, film, streaming and gaming. The new company combines Warner Media’s premium entertainment, sports and news assets with Discovery's leading non-fiction and international entertainment and sports businesses. For more information, please visit www.wbd.com.

Meet Our Team
The Direct-to-Consumer (DTC) global tech organization has many software engineering teams building applications for the web, mobile, tablets, connected TVs, consoles, and other streaming devices. Our platform covers everything from search, catalogue, video transcoding, and personalization to global subscriptions, and much more. Every customer starts their journey into the world of WBD through DTC’s Identity and Growth teams. We ensure customers can seamlessly authenticate and authorize across all WBD brands. We are a fast-growing, global engineering group crucial to WBD’s future.

Senior Site Reliability Engineer – Roles and Responsibilities:
- Serve as a subject matter expert on platform functionality; shield engineering teams by navigating and troubleshooting production issues efficiently with quick turnaround time, reviewing source code, logs, operational metrics, stack traces, etc. to pinpoint a specific problem and then resolve it
- Identify learnings from operations to improve platform engineering excellence, and continuously seek to break down barriers between engineering and operations teams
- Drive the reliability and scalability of cloud-based systems while identifying and implementing improvements for operational efficiency and proactive monitoring
- Automation and tool development: continuously seek opportunities to automate workflows, develop self-sustaining tools, and improve operational efficiency
- Incident management: facilitate partner inquiries and production incidents, ensuring compliance with internal SLAs; respond to, investigate, and mitigate customer impact
- Partner with engineering teams and other stakeholders to support product launches and other initiatives
- Drive observability and monitoring, and always champion engineering and operational excellence
- Bring results-driven, creative thinking that drives innovation and produces delightful experiences for our customers
- Demonstrate data-driven, open-minded decision making; have an insatiable curiosity; love to invent and innovate to solve difficult challenges
- Take ownership of your work and consistently deliver results in a fast-paced environment
- Actively support hyper-care and watch-party events, providing real-time operational metrics and insights
- Perform health checks on critical applications and services, ensuring uptime and availability (a minimal sketch follows this posting)
- Write complex queries and scripts, analyze datasets, and pinpoint issues efficiently
- Effectively communicate with global partners and stakeholders
- Exercise good judgment when balancing immediate and long-term business needs

What To Bring
- Bachelor’s degree with 5–8 years of experience as a software developer
- Proficiency in Golang (preferred) or Java
- Monitoring and alerting: experience implementing alerting, metrics, and logging using tools like Prometheus, CloudWatch, Elastic, and PagerDuty
- Direct experience with at least one cloud provider (AWS, GCP, Azure, or other)
- Strong expertise in SQL with 6–8 years of hands-on experience working with databases
- Experience building dashboards using tools like Databricks and Grafana
- Familiarity with the OAuth 2.0 authentication framework
- Experience with tools such as PagerDuty and ServiceNow is a plus
- Proficiency with Docker, Kubernetes, and AWS is a plus
- Ability to work flexible shifts to provide global operational coverage and collaborate effectively with remote peers across disparate geographies and time zones

This role offers an opportunity to innovate and work on cutting-edge technologies, making a meaningful impact on the reliability and efficiency of partner integrations. If you are a proactive problem-solver with a passion for customer success, we encourage you to apply!

What We Offer
- A great place to work
- An equal opportunity employer
- Fast-track growth opportunities

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
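As a flavor of the health-check duty listed above, here is a minimal Python sketch; the service name, endpoint URL, and latency budget are hypothetical placeholders, and a real deployment would page through PagerDuty rather than merely log.

```python
# A minimal health-check sketch: poll an HTTP endpoint and flag failures.
# The service registry below is a hypothetical placeholder.
import logging
import time

import requests

logging.basicConfig(level=logging.INFO)
SERVICES = {"auth-api": "https://auth.example.com/healthz"}  # hypothetical endpoint

def check(name: str, url: str, timeout_s: float = 2.0) -> bool:
    """Return True if the service answers 200 within the latency budget."""
    try:
        start = time.monotonic()
        resp = requests.get(url, timeout=timeout_s)
        latency_ms = (time.monotonic() - start) * 1000
        logging.info("%s status=%s latency=%.0fms", name, resp.status_code, latency_ms)
        return resp.status_code == 200
    except requests.RequestException as exc:
        logging.error("%s unreachable: %s", name, exc)
        return False

if __name__ == "__main__":
    for name, url in SERVICES.items():
        if not check(name, url):
            # In production this would trigger a PagerDuty incident, not just a log line.
            logging.error("ALERT: %s failed health check", name)
```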
Posted 1 week ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Title: Ignition Application Administrator

Position: We are seeking a highly motivated Ignition Application Administrator to join the Enterprise Services – Data team. Working very closely with peer platform administrators, developers, Product/Project Seniors and customers, you will play an active role in administering the existing analytics platforms. You will join a team of platform administrators who are specialized in one tool but cross-trained on others. While you will focus on Ignition, administration knowledge of these other platforms is beneficial: Qlik Sense, Tableau, Power BI, SAP BusinessObjects, Matillion, Snowflake, Informatica (EDC, IDQ, Axon), Alteryx, HVR or Databricks. This role requires a willingness to dive into complex problems to help the team find elegant solutions. How you communicate and approach problems is important to us. We are looking for team players who are willing to bring people across the disciplines together.

This position will provide the unique opportunity to operate in a start-up-like environment within a Fortune 50 company. Our digital focus is geared towards releasing the insights inherent to our best-in-class products and services. Together we aim to achieve new levels of productivity by changing the way we work and identifying new sources of growth for our customers.

Responsibilities include, but are not limited to, the following:
- Install and configure Ignition
- Monitor the Ignition platform, including integration with observability and alerting solutions, and recommend platform improvements
- Troubleshoot and resolve Ignition platform issues
- Configure data source connections and manage asset libraries
- Identify and raise system capacity related issues (storage, licenses, performance thresholds)
- Define best practices for Ignition deployment
- Integrate Ignition with other ES Data platforms and Business Unit installations of Ignition
- Participate in overall data platform architecture and strategy
- Research and recommend alternative actions for problem resolution based on best practices and application functionality, with minimal direction

Knowledge and Skills:
- 3+ years working in customer success or in a customer-facing engineering capacity is required
- Large-scale implementation experience with complex solutions environments
- Experience in customer-facing positions, preferably industry experience in technology-based solutions
- Experience navigating, escalating and leading efforts on complex customer/partner requests or projects
- Experience with the Linux command line
- An aptitude for both analysing technical concepts and translating them into business terms, as well as for mapping business requirements into technical features
- Knowledge of the software development process and of software design methodologies is helpful
- 3+ years’ experience in a cloud ops / Kubernetes application deployment and management role, working with an enterprise software or data product
- Experience with Attribute-based Access Control (ABAC), Virtual Directory Services (VDS), PingFederate or Azure Active Directory (AAD) is helpful
- Cloud platform architecture, administration and programming experience desired
- Experience with Helm, Argo CD, Docker, and cloud networking
- Excellent communication skills: interpersonal, written, and verbal

Education and Work Experience: This position requires a minimum of a BA/BS degree (or equivalent) in technology, computing or another related field of study. Experience in lieu of education may be considered if the individual has ten (10) or more years of relevant experience.

Hours: Normal work schedule hours may vary, Monday through Friday. May be required to work flexible hours and/or weekends, as needed, to meet deadlines or to fulfil application administration obligations.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
8.0 - 13.0 years
9 - 19 Lacs
Pune
Hybrid
** Mandate Skills: API development, Java, Databricks and AWS **

Technical:
- Two or more years of API development experience (specifically REST APIs using Java, Spring Boot, Hibernate)
- Two or more years of data engineering experience with the respective tools and technologies (e.g., Apache Spark, Databricks, SQL databases, NoSQL databases, Data Lake concepts)
- Working knowledge of test-driven development
- Working knowledge of DevOps and lean development principles, such as Continuous Integration and Continuous Delivery/Deployment, using tools like Git
- Working knowledge of ETL, data modeling, data warehousing, and working with large-scale datasets
- Working knowledge of AWS services such as Lambda, RDS, ECS, DynamoDB, API Gateway, S3, etc. (a minimal sketch follows this posting)

Good to have:
- AWS Developer certification or working experience in AWS cloud or other cloud technologies
- Passionate, creative, and eager to learn new, complex technical areas
- Accountable, curious, and collaborative, with an intense focus on product quality
- Skilled in interpersonal communication, with the ability to explain complex topics to non-technical audiences
- Experience working in an agile team environment
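The posting asks for Java/Spring Boot on the API side; purely as an illustration of the AWS services named above, here is a minimal Python (boto3) sketch of reading and writing a DynamoDB table. The table name, key, and region are hypothetical placeholders.

```python
# A minimal DynamoDB sketch with boto3; table name, key, and region are hypothetical.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("orders")  # assumes a table with partition key "order_id"

def put_order(order_id: str, status: str) -> None:
    """Write one item; put_item upserts on the partition key."""
    table.put_item(Item={"order_id": order_id, "status": status})

def get_order(order_id: str):
    """Fetch one item by key; returns None if the item is absent."""
    resp = table.get_item(Key={"order_id": order_id})
    return resp.get("Item")

if __name__ == "__main__":
    put_order("o-123", "CREATED")
    print(get_order("o-123"))
```

Because put_item upserts on the partition key, no separate update call is needed for this simple case; conditional writes would be the next step when concurrent updates matter.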
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Tarento
Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you’ll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Role Overview
An Azure Data Engineer specializing in Databricks is responsible for designing, building, and maintaining scalable data solutions on the Azure cloud platform, with a focus on leveraging Databricks and related big data technologies. The role involves close collaboration with data scientists, analysts, and software engineers to ensure efficient data processing, integration, and delivery for analytics and business intelligence needs.

Key Responsibilities
- Design, develop, and maintain robust and scalable data pipelines using Azure Databricks, Azure Data Factory, and other Azure services
- Build and optimize data architectures to support large-scale data processing and analytics
- Collaborate with cross-functional teams to gather requirements and deliver data solutions tailored to business needs
- Ensure data quality, integrity, and security across various data sources and pipelines (a minimal sketch follows this posting)
- Implement data governance, compliance, and best practices for data security (e.g., encryption, RBAC)
- Monitor, troubleshoot, and optimize data pipeline performance, ensuring reliability and scalability
- Document technical specifications, data pipeline processes, and architectural decisions
- Support and troubleshoot data workflows, ensuring consistent data delivery and availability for analytics and reporting
- Automate data tasks and deploy production-ready code using CI/CD practices
- Stay updated with the latest Azure and Databricks features, recommending improvements and adopting new tools as appropriate

Required Skills and Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 5+ years of experience in data engineering, with hands-on expertise in Azure and Databricks environments
- Proficiency in Databricks, Apache Spark, and Spark SQL
- Strong programming skills in Python and/or Scala
- Advanced SQL skills and experience with relational and NoSQL databases
- Experience with ETL processes, data warehousing concepts, and big data technologies (e.g., Hadoop, Kafka)
- Familiarity with Azure services: Azure Data Lake Storage (ADLS), Azure Data Factory, Azure SQL Data Warehouse, Cosmos DB, Azure Stream Analytics, Azure Functions
- Understanding of data modeling, schema design, and data integration best practices
- Strong analytical, problem-solving, and troubleshooting abilities
- Experience with source code control systems (e.g., Git) and technical documentation tools
- Excellent communication and collaboration skills; ability to work both independently and as part of a team

Preferred Skills
- Experience with automation, unit testing, and CI/CD pipelines
- Certifications in Azure Data Engineering or Databricks are advantageous

Soft Skills
- Flexible, self-starting, and proactive in learning and adopting new technologies
- Ability to manage multiple priorities and work to tight deadlines
- Strong stakeholder management and teamwork capabilities
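A minimal sketch of the data-quality checks mentioned above, expressed as PySpark assertions over a Delta table; the table and column names are hypothetical placeholders, and a production pipeline might use a framework such as Great Expectations or Delta Live Tables expectations instead.

```python
# Minimal data-quality gates on a Databricks table; names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("silver.bookings")  # hypothetical table

# Rule 1: the primary key must be unique and non-null.
dupes = df.groupBy("booking_id").count().filter(F.col("count") > 1).count()
nulls = df.filter(F.col("booking_id").isNull()).count()

# Rule 2: monetary amounts must be non-negative.
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {"duplicate_keys": dupes, "null_keys": nulls, "negative_amounts": bad_amounts}
if any(v > 0 for v in failures.values()):
    # Failing fast keeps bad records from flowing into downstream marts.
    raise ValueError(f"Data-quality checks failed: {failures}")
```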
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Bengaluru
Work from Office
RSM is looking for an experienced hands-on Technical Manager with expertise in big data technologies and multi-cloud platforms to lead our technical team for the financial services industry. The ideal candidate will possess a strong background in big data architecture, cloud computing, and a deep understanding of the financial services industry. As a Technical Manager, you will be responsible for leading technical projects, hands-on development, delivery management, and sales, and for ensuring the successful implementation of data solutions across multiple cloud platforms. This role requires a unique blend of technical proficiency, sales acumen, and presales experience to drive business growth and deliver innovative data solutions to our clients.

Responsibilities:
- Provide technical expertise and guidance on the selection, hands-on implementation, and optimization of big data platforms, tools, and technologies across multiple cloud environments (e.g., AWS, Azure, GCP, Snowflake)
- Architect and build scalable and secure data pipelines, data lakes, and data warehouses to support the storage, processing, and analysis of large volumes of structured and unstructured data
- Lead and mentor a team of technical professionals in the design, development, and implementation of big data solutions and data analytics projects within the financial services domain
- Stay abreast of emerging trends, technologies, and industry developments in big data, cloud computing, and financial services, and assess their potential impact on the organization
- Develop and maintain best practices, standards, and guidelines for data management, data governance, and data security in alignment with regulatory requirements and industry standards
- Collaborate with the sales and business development teams to identify customer needs, develop solution proposals, and present technical demonstrations and presentations to prospective clients
- Collaborate with cross-functional teams including data scientists, engineers, business analysts, and stakeholders to define project requirements, objectives, and timelines

Basic Qualifications:
- Bachelor's degree or higher in Computer Science, Information Technology, Business Administration, Engineering or a related field
- Minimum of ten years of overall technical experience in solution architecture, design, and hands-on development, with a focus on big data technologies and multi-cloud platforms, including at least 5 years specifically in financial services
- Strong understanding of the financial services industry: capital markets, retail and business banking, asset management, insurance, etc.
- In-depth knowledge of big data technologies such as Hadoop, Spark, and Kafka, and cloud platforms such as AWS, Azure, GCP, Snowflake, and Databricks
- Experience with SQL, Python, PySpark or other programming languages used for data transformation, analysis, and automation
- Excellent communication, presentation, and interpersonal skills, with the ability to articulate technical concepts to both technical and non-technical audiences
- Hands-on experience extracting (ETL using CDC, transaction logs, incremental loads) and processing large data sets for streaming and batch data loads
- Ability to work from our Hyderabad, India office at least twice a week

Preferred Qualifications:
- Professional certifications in cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified Azure Solutions Architect, Azure Data Engineer, SnowPro Core) and/or big data technologies
- Experience with Power BI, Tableau or other reporting and data visualization tools
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code tools

Education/Experience: Bachelor's degree in MIS, CS, Engineering or an equivalent field; a Master's degree in CS or an MBA is preferred. Advanced data and cloud certifications are a plus.
Posted 1 week ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Overview: We’re looking for a hands-on Data Lead to architect and deliver end-to-end data features that power our booking and content management systems. You’ll lead a team, own the data pipeline lifecycle, and play a critical role in shaping and scaling our data infrastructure — spanning batch, real-time, and NoSQL environments.

ShyftLabs is a growing data product company founded in early 2020 that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:
- Lead the development and implementation of booking and CMS data features according to the roadmap
- Build, optimize, and manage robust data pipelines and ETL/ELT processes using tools like Airflow, DBT, and Databricks (a minimal sketch follows this posting)
- Oversee infrastructure and storage layers across distributed systems (e.g., Cassandra, MongoDB, Postgres), ensuring scalability, availability, and performance
- Support and partner with data PMs to deliver clean, actionable data for reporting, analytics, and experimentation
- Handle client data queries, investigate anomalies, and proactively improve data quality and debugging workflows
- Manage a team of data engineers and analysts: provide architectural direction, review code, and support career growth
- Collaborate with DevOps/platform teams on deployment, monitoring, and performance optimization of data services
- Champion best practices in data governance, version control, security, and documentation

Basic Qualifications:
- 8+ years of experience in data engineering or analytics engineering, with a strong focus on both data modeling and infrastructure
- 2+ years of experience managing and guiding a data team to deliver complex data features
- Proficient in SQL, Python, and modern data stack tools (e.g., DBT, Databricks, Airflow)
- Experience managing distributed and NoSQL databases (e.g., Cassandra, MongoDB) and cloud data warehouses (e.g., Snowflake, BigQuery)
- Strong understanding of scalable data architecture; real-time streaming pipeline experience is a plus
- Experience leading teams, setting code standards, and mentoring junior developers
- Ability to translate business requirements into scalable, maintainable data systems
- Familiarity with booking platforms, CMS architectures, or event-based tracking systems is a plus!

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
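A minimal sketch of the orchestration work described above: a daily Airflow DAG with an extract task feeding a transform task. The DAG id and task bodies are hypothetical placeholders (Airflow 2.4+ is assumed for the schedule argument).

```python
# A minimal daily Airflow DAG; dag_id and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_bookings(**context):
    # Placeholder: pull the day's bookings from the source system.
    print("extracting bookings for", context["ds"])

def transform_bookings(**context):
    # Placeholder: in practice this might run a DBT model or a Databricks job.
    print("transforming bookings for", context["ds"])

with DAG(
    dag_id="bookings_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule" replaces "schedule_interval" in Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_bookings)
    transform = PythonOperator(task_id="transform", python_callable=transform_bookings)
    extract >> transform  # transform runs only after extract succeeds
```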
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities
- Manage data: extract, clean, and structure both structured and unstructured data
- Coordinate pipelines: utilize tools such as Airflow, Step Functions, or Azure Data Factory to orchestrate data workflows
- Deploy models: develop, fine-tune, and deploy models using platforms like SageMaker, Azure ML, or Vertex AI (a minimal tracking sketch follows this posting)
- Scale solutions: leverage Spark or Databricks to handle large-scale data processing tasks
- Automate processes: implement automation using tools like Docker, Kubernetes, CI/CD pipelines, MLflow, Seldon, and Kubeflow
- Collaborate effectively: work alongside engineers, architects, and business stakeholders to address and resolve real-world problems efficiently

Qualifications
- 3+ years of hands-on experience in MLOps (4-5 years of overall software development experience)
- Extensive experience with at least one major cloud provider (AWS, Azure, or GCP)
- Proficiency with Databricks, Spark, Python, SQL, TensorFlow, PyTorch, and scikit-learn
- Expertise in debugging Kubernetes and creating efficient Dockerfiles
- Experience prototyping with open-source tools and scaling solutions effectively
- Strong analytical skills, humility, and a proactive approach to problem-solving

Preferred Qualifications
- Experience with SageMaker, Azure ML, or Vertex AI in a production environment
- Commitment to writing clean code, creating clear documentation, and maintaining concise pull requests

Skills: SQL, Kubeflow, Spark, Docker, Databricks, ML, GCP, MLflow, Kubernetes, AWS, PyTorch, Azure, CI/CD, TensorFlow, scikit-learn, Seldon, Python, MLOps
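A minimal sketch of the experiment-tracking step that sits at the heart of the MLOps workflow above, using MLflow with scikit-learn; the experiment name, model, and data are placeholders, not a prescribed stack.

```python
# Minimal MLflow tracking sketch; experiment name, model, and data are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-model")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    # Params, metrics, and the model artifact are all versioned together,
    # which is what makes later deployment and rollback reproducible.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```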
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a skilled Data Engineer with hands-on Azure SQL, Databricks, and Data Factory experience to join our dynamic team. The successful candidate will play a crucial role in developing, maintaining, and optimizing data-driven applications and solutions.

Responsibilities
- Design, develop, test, and deploy high-performance and scalable data solutions
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications
- Implement efficient and maintainable code using best practices and coding standards
- Utilize expertise in SQL to design, optimize, and maintain relational databases
- Write complex SQL queries for data retrieval, manipulation, and analysis (a minimal sketch follows this posting)
- Perform database performance tuning and optimization
- Work with the Databricks platform for big data processing and analysis
- Develop and maintain ETL processes using Databricks notebooks
- Implement and optimize data pipelines for data transformation and integration
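A minimal sketch of the kind of complex SQL the role calls for: a window function that keeps each customer's most recent order, run through Spark SQL on Databricks. The table and columns are hypothetical placeholders.

```python
# A window-function query via Spark SQL; table and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

latest_orders = spark.sql("""
    SELECT customer_id, order_id, order_ts
    FROM (
        SELECT customer_id, order_id, order_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_ts DESC
               ) AS rn
        FROM sales.orders        -- hypothetical table
    )
    WHERE rn = 1                 -- keep only each customer's latest order
""")
latest_orders.show()
```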
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Adobe is seeking dedicated Product Analytics Experts to join our growing team in Noida. In this role, you will play a key part in driving the success of Adobe's Document Cloud products by using your expertise to understand user behavior, identify growth opportunities, and help drive data-driven decisions.

Responsibilities:
- Analyze large datasets to identify trends, patterns, and key performance indicators
- Develop and maintain SQL queries to extract, transform, and load data from various sources, including Hadoop and cloud-based platforms like Databricks
- Develop compelling data visualizations using Power BI and Tableau to communicate insights seamlessly to PMs, engineering, and leadership
- Conduct A/B testing and campaign analysis, using statistical methods to measure and evaluate the impact of product changes (a minimal sketch follows this posting)
- Partner with cross-functional teams (product managers, engineers, marketers) to translate data into actionable insights and drive strategic decision-making
- Independently own and manage projects from inception to completion, ensuring timely delivery and high-quality results
- Effectively communicate analytical findings to stakeholders at all levels, both verbally and in writing

Qualifications:
- 8-12 years of relevant experience in solving deep analytical challenges within a product or data-driven environment
- Strong proficiency in advanced SQL, with experience working with large-scale datasets
- Expertise in data visualization tools such as Power BI and Tableau
- Hands-on experience in A/B testing, campaign analysis, and statistical methodologies
- Working knowledge of scripting languages like Python or R, with a foundational understanding of machine learning concepts
- Experience with Adobe Analytics is a significant plus
- Good communication, presentation, and interpersonal skills
- A collaborative mindset with the ability to work effectively within cross-functional teams
- Strong analytical and problem-solving skills with a passion for data-driven decision making

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
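A minimal sketch of the A/B-test evaluation mentioned above: a two-proportion z-test on conversion counts using statsmodels. The counts are made-up placeholders; a real analysis would also check statistical power, guardrail metrics, and experiment health.

```python
# Two-proportion z-test for an A/B test; the counts below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [520, 580]      # control, treatment successes (hypothetical)
exposures = [10_000, 10_000]  # users per arm (hypothetical)

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Treatment lift is statistically significant at the 5% level.")
else:
    print("No significant difference detected; keep collecting data.")
```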
Posted 1 week ago
6.0 - 10.0 years
10 - 17 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Job Description:
We are looking for a skilled Data / Analytics Engineer with hands-on experience in vector databases and search optimization techniques. You will help build scalable, high-performance infrastructure to support AI-powered applications like semantic search, recommendation systems, and RAG pipelines.

Key Responsibilities:
- Optimize vector search algorithms for performance and scalability
- Build pipelines to process high-dimensional embeddings (e.g., BERT, CLIP, OpenAI)
- Implement ANN indexing techniques like HNSW, IVF, PQ (a minimal sketch follows this posting)
- Integrate vector search with data platforms and APIs
- Collaborate with cross-functional teams (data scientists, engineers, product)
- Monitor and resolve latency, throughput, and scaling issues

Must-Have Skills:
- Python
- AWS
- Vector databases (e.g., Elasticsearch, FAISS, Pinecone)
- Vector search / similarity search
- ANN search algorithms: HNSW, IVF, PQ
- Snowflake / Databricks
- Embedding models: BERT, CLIP, OpenAI
- Kafka / Flink for real-time data pipelines
- REST APIs, GraphQL, or gRPC for integration

Good to Have:
- Knowledge of semantic caching and hybrid retrieval
- Experience with distributed systems and high-performance computing
- Familiarity with RAG (Retrieval-Augmented Generation) workflows

Apply Now if You:
- Enjoy solving performance bottlenecks in AI infrastructure
- Love working with cutting-edge ML models and search technologies
- Thrive in collaborative, fast-paced environments
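A minimal sketch of the ANN indexing work named above, building a FAISS HNSW index over random vectors that stand in for real embeddings (e.g., from BERT or CLIP); M, efConstruction, and efSearch are the usual accuracy/speed knobs.

```python
# Minimal FAISS HNSW sketch; random vectors stand in for real embeddings.
import faiss
import numpy as np

dim, n = 384, 10_000                        # embedding size and corpus size
rng = np.random.default_rng(0)
corpus = rng.random((n, dim)).astype("float32")  # placeholder embeddings

index = faiss.IndexHNSWFlat(dim, 32)        # 32 = neighbors per graph node (M)
index.hnsw.efConstruction = 200             # build-time accuracy/speed trade-off
index.add(corpus)

query = rng.random((1, dim)).astype("float32")
index.hnsw.efSearch = 64                    # query-time accuracy/speed trade-off
distances, ids = index.search(query, 5)     # top-5 approximate neighbors
print(ids[0], distances[0])
```

Raising efSearch improves recall at the cost of latency, which is exactly the latency/throughput trade-off the responsibilities above ask you to monitor and tune.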
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: We are seeking a developer to design, develop, and maintain data ingestion processes to a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills. Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. A B.Tech degree and 5+ years of ETL development experience in the Microsoft data track are required, along with demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.

Mandatory Skill Sets: ETL Development
Preferred Skill Sets: Microsoft Stack
Years of Experience Required: 4+
Education Qualification: B.Tech/B.E./MCA
Degrees/Field of Study required: Bachelor of Engineering
Required Skills: ETL Development
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, API Management, Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity, Debugging, Embracing Change, Emotional Regulation {+ 30 more}
Available for Work Visa Sponsorship?
Government Clearance Required?
Posted 1 week ago
10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Key Responsibilities:

Project Management:
- Lead the end-to-end delivery of data projects, including Data Warehouse, Data Lake, and Lakehouse solutions
- Develop detailed project plans, allocate resources, and monitor project progress to ensure timely and within-budget delivery
- Identify and mitigate risks, ensuring successful project outcomes

Technical Leadership:
- Provide technical oversight and guidance on best practices in data engineering, cloud architecture, and data management
- Ensure solutions are scalable, robust, and aligned with industry standards and client requirements
- Oversee the design, development, and implementation of data solutions using Azure or AWS and Databricks

Client Engagement:
- Engage with clients to understand their business needs and translate them into technical requirements
- Build and maintain strong relationships with key client stakeholders
- Present complex technical concepts and solutions in a clear and concise manner to non-technical stakeholders

Team Leadership:
- Lead and mentor a team of data engineers, fostering a collaborative and high-performance culture
- Provide guidance and support to team members in their professional development and project delivery
- Ensure the team is equipped with the necessary tools and resources to succeed

Solution Development:
- Develop and implement data pipelines, ETL processes, and data integration solutions using Azure Data Factory, AWS Glue, Databricks, and other relevant tools
- Optimize data storage and retrieval performance, ensuring data quality and integrity
- Leverage advanced analytics and machine learning capabilities within Databricks to drive business insights

Continuous Improvement:
- Stay up to date with the latest advancements in Azure, AWS, Databricks, and data engineering technologies
- Implement best practices and continuous improvement initiatives to enhance the efficiency and effectiveness of data engineering processes
- Foster a culture of innovation and experimentation within the team

Skills & Competencies:
- Strong problem-solving and analytical skills
- Deep technical expertise in Azure, Google Cloud or AWS, and Databricks
- Exceptional project management and organizational abilities
- High level of emotional intelligence and client empathy

Proficiency in:
- Concepts of data warehousing and data lakes (e.g., SCD1, SCD2, dimensional modeling, KPIs and measures, data catalogs, star and snowflake schemas, Delta tables and Delta Live Tables); an SCD2 sketch follows this posting
- Data warehousing solutions (e.g., Azure Synapse, Azure SQL, ADLS Gen2, and Blob Storage for Azure; Redshift, S3, AWS Glue, and AWS Lambda for AWS; Google data management technologies)
- Data Lake solutions (e.g., MS Fabric, Purview, AWS Lakehouse, BigQuery and Bigtable)
- Lakehouse solutions (e.g., Databricks, Unity Catalog, Python and PySpark)
- Data visualization tools (e.g., Power BI, Tableau) is a plus

Mandatory Skill Sets: Project Management, Azure, AWS
Preferred Skill Sets: Project Management, Azure, AWS
Years of Experience Required: 10+ years
Education Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Required Skills: AWS DevOps, Microsoft Azure, Waterfall Model
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
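A minimal sketch of the SCD2 concept listed above, expressed as a Delta Lake MERGE on Databricks. Table and column names are hypothetical placeholders; note that the full SCD2 pattern also re-inserts the new version of changed rows, typically via a staged union or a second pass.

```python
# SCD2 via Delta MERGE; dim_customer and staged_customers are hypothetical tables.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    MERGE INTO dim_customer AS d
    USING staged_customers AS s
      ON d.customer_id = s.customer_id AND d.is_current = true
    -- Expire the current row when a tracked attribute changed.
    WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
      d.is_current = false,
      d.end_date = current_date()
    -- Insert brand-new customers (changed rows get their new version
    -- inserted by a second pass, keeping full history intact).
    WHEN NOT MATCHED THEN INSERT
      (customer_id, address, start_date, end_date, is_current)
      VALUES (s.customer_id, s.address, current_date(), null, true)
""")
```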
Posted 1 week ago
0 years
3 - 5 Lacs
Cochin
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Gig – Databricks Administrator
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in sectors like Banking, Insurance, Manufacturing, Government, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
Design, provision, and operate Azure Databricks workspaces; enforce security, cost, and performance policies; and serve as the escalation point for data engineering and ML teams. Success in this position means delivering a reliable, well-governed platform that accelerates product teams while meeting enterprise compliance obligations.

Your key responsibilities
Platform Administration & Governance – Create and manage clusters, cluster policies, and pools; schedule DBR upgrades; configure VNet-injected workspaces, SCIM user/group provisioning, and secret scopes; monitor jobs, audit logs, and system tables for health and cost anomalies.
Unity Catalog Stewardship – Stand up metastores, catalogs, and schemas; implement object ownership, row/column security, data lineage, and privilege delegation.
ETL & Delta Live Tables Support – Partner with data engineers to build and optimize Spark/Delta pipelines, streaming jobs, and orchestration in Databricks Workflows.
Security & Data Privacy – Integrate AAD SSO and network security controls.
Automation & IaC – Implement Terraform (with the Databricks Terraform provider), the Databricks CLI, and CI/CD pipelines for workspace and Unity Catalog objects.
Cost Optimisation & Observability – Tag resources, set auto-termination, and right-size clusters.
Documentation & Knowledge Transfer – Maintain runbooks and SLA playbooks, and deliver platform onboarding sessions.

Skills and attributes for success
Databricks Platform Administrator Accreditation
Databricks Certified Data Engineer Professional
Databricks Lakehouse Fundamentals Accreditation
Microsoft Azure Administrator Associate (AZ-104)
(Preferred) Microsoft Azure Data Engineer Associate (DP-203) or successor certification
Nice-to-have: Immuta implementation or comparable data-privacy platform experience.

To qualify for the role, you must have:
5+ years administering cloud big-data platforms, including 2+ years on Azure Databricks
A proven track record supporting Spark ETL and ML workloads at scale
Deep knowledge of Apache Spark 3.x, Delta Lake, and the Databricks REST API
Azure core services (ADLS Gen2, Key Vault, VNets)
Scripting/IaC: Python, PowerShell/Bash, Terraform
Hands-on experience with the Unity Catalog privilege model, lineage, and audit tables
Excellent incident-management, documentation, and stakeholder-communication skills

What working at EY offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan.
You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
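As an illustration of the cluster-policy and cost-control duties this posting describes, here is a minimal sketch using the Databricks SDK for Python to create a policy enforcing auto-termination and a worker cap. The policy name, limits, and pinned runtime version are assumptions, not values from the posting.

```python
# Sketch: a cost-guardrail cluster policy via the databricks-sdk for Python.
# Assumes credentials are configured via environment or .databrickscfg.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

policy_definition = {
    # Force idle clusters to shut down after 30 minutes.
    "autotermination_minutes": {"type": "fixed", "value": 30},
    # Cap cluster size to keep spend predictable.
    "num_workers": {"type": "range", "maxValue": 10},
    # Pin an approved runtime (illustrative version string).
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
}

policy = w.cluster_policies.create(
    name="team-standard-policy",          # hypothetical policy name
    definition=json.dumps(policy_definition),
)
print(f"Created cluster policy {policy.policy_id}")
```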
Posted 1 week ago
8.0 - 10.0 years
2 - 8 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager Technology – US Commercial Data & Analytics

What you will do
Let’s do this. Let’s change the world. In this vital role you will lead the engagement model between Amgen’s Technology organization and our global business partners in Commercial Data & Analytics. We seek a technology leader with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders. Are you interested in building a team that consistently delivers business value in an agile model using technologies such as AWS, Databricks, Airflow, and Tableau? Come join our team!

Roles & Responsibilities:
Establish an effective engagement model to collaborate with senior leaders on the Sales Insights product team within the Commercial Data & Analytics organization, focused on operations within the United States
Serve as the technology product owner for an agile product team committed to delivering business value to Commercial stakeholders via data pipeline buildout for sales data
Lead and mentor junior team members to deliver on the needs of the business
Interact with business clients and technology management to create technology roadmaps, build business cases, and drive DevOps to achieve the roadmaps
Help mature Agile operating principles through deployment of creative and consistent practices for user story development, robust testing and quality oversight, and a focus on user experience
Connect and transform our vast array of Commercial and other functional data sources (including Sales, Activity, and Digital data) into consumable and user-friendly modes (e.g., dashboards, reports, mobile) for key decision makers such as executives, brand leads, account managers, and field representatives
Become the lead subject matter expert in reporting technology capabilities by researching and implementing new tools, features, and internal and external methodologies

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master’s degree with 8-10 years of Information Systems experience OR Bachelor’s degree with 10-14 years of Information Systems experience OR Diploma with 14-18 years of Information Systems experience

Must-Have Skills:
Excellent problem-solving skills and a passion for tackling complex challenges in data and analytics with technology
Experience leading data and analytics teams in a Scaled Agile Framework (SAFe)
Excellent interpersonal skills, strong attention to detail, and ability to influence based on data and business value
Ability to build compelling business cases with accurate cost and effort estimations
Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
Ability to explain sophisticated technical concepts to non-technical clients
Strong understanding of sales and incentive compensation value streams

Preferred Qualifications:
Jira Align & Confluence experience
Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
Understanding of software systems strategy, governance, and infrastructure
Experience in managing product features for PI planning and developing product roadmaps and user journeys
Familiarity with low-code/no-code test automation software
Technical thought leadership

Soft Skills:
Able to work effectively across multiple geographies (primarily India, Portugal, and the United States) under minimal supervision
Demonstrated proficiency in written and verbal communication in the English language
Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively
Intellectual curiosity and the ability to question partners across functions
Ability to prioritize successfully based on business value
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully across virtual teams
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills

Technical Skills:
ETL tools: experience with ETL tools such as Databricks, Redshift, or an equivalent cloud-based database
Big Data, Analytics, Reporting, Data Lake, and Data Integration technologies
S3 or an equivalent storage system
AWS or similar cloud-based platforms
BI tools (Tableau and Power BI preferred)

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 years
7 - 10 Lacs
Hyderābād
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Corporate Title: Senior Associate
Functional Title: Senior Snowflake Data Engineer (Exempt)
Business Unit: Information Technology
Department: Application Development

Business Unit Description:
The Information Technology group delivers secure, reliable technology solutions that enable DTCC to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include development of essential systems, building infrastructure capabilities to meet client needs, and implementing data standards and governance.

Position Summary
Provides technical expertise and may coordinate some day-to-day deliverables for a team. Assists in the technical design of large business systems; builds applications and interfaces between applications; understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions. Contributes to technology-specific best practices and standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability and scalability; contributes expertise on significant application components, vendor products, program languages, databases, operating systems, etc., and guides less experienced staff during the build and test phases.

Specific Responsibilities
Act as a technical guide on one or more applications utilized by DTCC.
Work with the Business System Analyst to ensure designs satisfy functional requirements.
Work with large, complex data sets and high-throughput data pipelines that meet business requirements.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
Build data and analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs.
Collaborate with data scientists and architects on several projects.
Solve various complex problems.

Key skills:
5+ years of experience as a Data Engineer or in a similar role
4+ years of cloud data warehouse experience with Snowflake, including Streams, Snowpipe, Tasks, Snowpark, etc.
4+ years of Python development experience
Experience with distributed processing frameworks and patterns such as Spark, Databricks, Apache Iceberg, and data lakehouse architectures
Experience in a cloud-based environment
Experience with asynchronous processing using Python
Hands-on experience with database technologies (e.g., SQL and NoSQL), including performance tuning
Technical expertise with data technologies and/or machine learning techniques
Strong numerical and analytical skills
Ability to write reusable code components
Open-minded toward new technologies and frameworks

Qualifications
Minimum of 6 years of related experience
Bachelor’s degree preferred or equivalent experience

With 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry.
From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency and driving efficiency for thousands of broker/dealers, custodian banks and asset managers. Industry owned and governed, the firm simplifies the complexities of clearing, settlement, asset servicing, data management, data reporting and information services across asset classes, bringing increased security and soundness to financial markets. In 2022, DTCC’s subsidiaries processed securities transactions valued at U.S. $2.5 quadrillion. Its depository provides custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $72 trillion. DTCC’s Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 17.5 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn, Twitter, YouTube and Facebook. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
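For context on the Snowflake Streams, Snowpipe, and Tasks skills this role asks for, here is a minimal sketch that wires a Stream and a scheduled Task through snowflake-connector-python. All object names, credentials, and the schedule are hypothetical, and real code would use key-pair authentication rather than an inline password.

```python
# Sketch: Stream + Task setup for incremental loads, issued through the
# Snowflake Python connector. Object names and credentials are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # use key-pair auth in practice
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Capture row-level changes on the landing table.
cur.execute("CREATE STREAM IF NOT EXISTS trades_stream ON TABLE raw_trades")

# A task that loads captured changes every 5 minutes, but only runs
# when the stream actually has data.
cur.execute("""
    CREATE TASK IF NOT EXISTS load_trades
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('TRADES_STREAM')
    AS
      INSERT INTO curated.trades SELECT * FROM trades_stream
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK load_trades RESUME")
conn.close()
```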
Posted 1 week ago
6.0 years
15 Lacs
Hyderābād
On-site
Experience: 6+ years
JD:
Experience in design, development & deployment using Azure services (Data Factory, Databricks, PySpark, SQL)
Minimum 2 years of project experience in Azure Databricks
Minimum 2 years of experience in ADF
Minimum 2 years of experience in PySpark
Develop and maintain scalable data pipelines and build new data source integrations to support increasing data volume and complexity
Experience in creating Technical Specification Design and Application Interface Design
Developing Modern Data Warehouse solutions using the Azure stack (Azure Data Lake, Azure Databricks) and PySpark
Develop batch processing and integration solutions and process structured and unstructured data
Demonstrated in-depth skills with Azure Databricks, PySpark, and SQL
Collaborate and engage with BI & analytics and the business team
Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Fixed shift
Application Question(s): How soon can you join? What is your current CTC? What is your expected CTC? What is your current location? Are you comfortable with the Hyderabad location? How many years of experience do you have in Azure Databricks? How many years of experience do you have in Azure Data Factory? How many years of experience do you have in PySpark?
Experience: total work: 6 years (Required)
Work Location: In person
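To give a flavour of the ADF-triggered Databricks work described above, here is a minimal PySpark batch job sketch: read raw CSV from ADLS Gen2, clean and type the data, and write a partitioned Delta table. The storage paths, container names, and columns are assumptions for illustration only.

```python
# Sketch: a batch PySpark job of the kind an ADF pipeline would trigger on
# Databricks. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@mydatalake.dfs.core.windows.net/orders/2025/06/"))

curated = (raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])          # basic data-quality guard
    .filter(F.col("amount") > 0))

(curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@mydatalake.dfs.core.windows.net/orders/"))
```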
Posted 1 week ago
55.0 years
6 - 9 Lacs
Pune
Remote
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what’s next for their businesses.

Your Role
Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
Experience with cloud storage, cloud databases, cloud data warehousing and Data Lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, S3.
Good knowledge of cloud compute services and load balancing.
Good knowledge of cloud identity management, authentication and authorization.
Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions.
Experience in using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.

Your Profile
Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs performance and scaling.
Able to contribute to making architectural choices using various cloud services and solution methodologies.
Expertise in programming using Python.
Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
Must understand networking, security, design principles and best practices in cloud.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
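To illustrate the "cloud utility functions" this profile mentions, here is a minimal AWS Lambda handler sketch in Python that reacts to an S3 object-created event and starts a Glue job. The job name, bucket, and argument key are hypothetical; the S3 event shape and boto3 call are standard.

```python
# Sketch: an event-driven utility function. An S3 put event triggers a
# (hypothetical) Glue ETL job on the newly landed object.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 notifications carry the bucket and key under Records[].s3
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="curate-landing-data",                     # hypothetical job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
    return {"status": "ok"}
```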
Posted 1 week ago
10.0 years
5 - 10 Lacs
Bengaluru
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025

Description & Requirements
Introduction: A Career at HARMAN
HARMAN Technology Services (HTS): We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions.
Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs
Work at the convergence of cross-channel UX, cloud, insightful data, IoT and mobility
Empower companies to create new digital business models, enter new markets, and improve customer experiences

About the Role
We are seeking an experienced Azure Data Architect to develop and implement data engineering projects, including enterprise data hubs, data lakehouses, and big data platforms.

What You Will Do
Create data pipelines for more efficient and repeatable data science projects
Design and implement data architecture solutions that support business requirements and meet organizational needs
Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams
Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems
Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently
Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes
Ensure compliance with regulatory and industry standards for data management and security
Develop and maintain data models, data warehouses, data lakes and data marts to support data analysis and reporting
Ensure data quality, accuracy, and consistency across all data sources
Knowledge of ETL and data integration tools such as Informatica, Qlik, Talend, and Apache NiFi
Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio
Knowledge of data governance, data quality, and data security best practices
Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform
Familiarity with programming languages such as Python, Java, or Scala
Experience with data visualization tools such as Tableau, Power BI, or QlikView
Understanding of analytics and machine learning concepts and tools
Knowledge of project management methodologies and tools to manage and deliver complex data projects
Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra
Strong expertise in cloud-based databases such as AWS S3, AWS Glue, AWS Redshift, and the Iceberg/Parquet file formats
Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data
Proficient in data integration techniques to combine data from various sources into a centralized location
Strong data modeling, data warehousing, and data integration skills
What You Need
10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead
8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects
Experience in working on RFPs/proposals, presales activities, business development and overseeing delivery of data projects is highly desired
A master’s or bachelor’s degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics
Demonstrated ability to manage data projects and diverse teams
Experience in creating data and analytics solutions
Experience in building data solutions in one or more domains – Industrial, Healthcare, Retail, Communication
Problem-solving, communication, and collaboration skills
Good knowledge of data visualization and reporting tools
Ability to normalize and standardize data as per key KPIs and metrics
Develop and implement data engineering projects, including data lakehouses or big data platforms

What is Nice to Have
Knowledge of Azure Purview (a must)
Knowledge of Azure Data Fabric
Ability to define reference data architecture
SnowPro Advanced certification (Snowflake)
Cloud-native data platform experience in the AWS or Microsoft stack
Knowledge of the latest data trends, including data fabric and data mesh
Robust knowledge of ETL, data transformation and data standardization approaches
Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions
Lead the selection, deployment, and management of data tools, platforms, and infrastructure
Ability to technically guide a team of data engineers
Oversee the design, development, and deployment of data solutions
Define, differentiate & strategize new data services/offerings and create reference architecture assets
Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
Guide and inspire the organization about the business potential and opportunities around data
Network with domain experts
Collaborate with client teams to understand their business challenges and needs
Develop and propose data solutions tailored to client-specific requirements
Influence client revenues through innovative solutions and thought leadership
Lead client engagements from project initiation to deployment
Build and maintain strong relationships with key clients and stakeholders
Build re-usable methodologies, pipelines & models

What Makes You Eligible
Build and manage a high-performing team of data engineers and other specialists.
Foster a culture of innovation and collaboration within the data team and across the organization.
Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment.
Candidates should be confident, energetic self-starters with strong communication skills.
Candidates should exhibit superior presentation skills and the ability to present compelling solutions that guide and inspire.
Provide technical guidance and mentorship to the Data team Collaborate with other stakeholders across the company to align the vision and goals Communicate and present the Data capabilities and achievements to clients and partners Stay updated on the latest trends and developments in the Data domain What We Offer Access to employee discounts on world class HARMAN/Samsung products (JBL, Harman Kardon, AKG etc.). Professional development opportunities through HARMAN University’s business and leadership academies. An inclusive and diverse work environment that fosters and encourages professional and personal development. “Be Brilliant” employee recognition and rewards program. You Belong Here HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want. About HARMAN: Where Innovation Unleashes Next-Level Technology Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today! Important Notice: Recruitment Scams Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com. HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Posted 1 week ago
0 years
6 - 9 Lacs
Bengaluru
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant – Databricks!
We are on the lookout for an Analytics Engineer to join our team, tasked with integrating Putnam analytics solutions into FT systems. The ideal candidate should have a genuine passion for maintaining, optimizing, and enhancing analytics solutions.

Responsibilities
Develop and sustain optimal analytics solutions by sourcing data from multiple sources.
Create and maintain analytical processes that utilize processed data to offer actionable insights based on customer requirements.
Collaborate with tech leaders to support, maintain, and enhance existing analytics solutions.
The successful candidate will be responsible for supporting and enhancing current analytics engineering solutions developed using Python and implemented via Databricks.

Qualifications we seek in you!
Minimum Qualifications
Proven skill sets in AWS Data Lake services such as AWS Glue, S3, Lambda, SNS, and IAM, and skills in Spark, Python, and SQL.
Proven experience working with diverse data sources, writing queries (SQL), and preparing data using Python and data frames.
Experience in developing analytics solutions using Databricks.
Demonstrated ability to support and collaborate with cross-functional teams in a dynamic work environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 9, 2025, 4:05:07 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
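As an example of the Spark/SQL data-preparation work this role describes, here is a minimal sketch that reads Parquet from S3, registers a temp view, and shapes the data with Spark SQL. The bucket paths and columns are illustrative, and reading s3a:// paths assumes the hadoop-aws package is available on the cluster.

```python
# Sketch: read from S3, prepare a DataFrame with SQL, write curated output.
# Bucket names and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("prep").getOrCreate()

df = spark.read.parquet("s3a://analytics-landing/positions/")
df.createOrReplaceTempView("positions")

# Aggregate market value per portfolio per day.
daily = spark.sql("""
    SELECT as_of_date,
           portfolio_id,
           SUM(market_value) AS total_mv
    FROM positions
    GROUP BY as_of_date, portfolio_id
""")

daily.write.mode("overwrite").parquet("s3a://analytics-curated/daily_mv/")
```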
Posted 1 week ago
0 years
6 - 9 Lacs
Bengaluru
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant – Databricks!
We are on the lookout for an Analytics Engineer to join our team, tasked with integrating Putnam analytics solutions into FT systems. The ideal candidate should have a genuine passion for maintaining, optimizing, and enhancing analytics solutions.

Responsibilities
Develop and sustain optimal analytics solutions by sourcing data from multiple sources.
Create and maintain analytical processes that utilize processed data to offer actionable insights based on customer requirements.
Collaborate with tech leaders to support, maintain, and enhance existing analytics solutions.
The successful candidate will be responsible for supporting and enhancing current analytics engineering solutions developed using Python and implemented via Databricks.

Qualifications we seek in you!
Minimum Qualifications
Proven skill sets in AWS Data Lake services such as AWS Glue, S3, Lambda, SNS, and IAM, and skills in Spark, Python, and SQL.
Proven experience working with diverse data sources, writing queries (SQL), and preparing data using Python and data frames.
Experience in developing analytics solutions using Databricks.
Demonstrated ability to support and collaborate with cross-functional teams in a dynamic work environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 9, 2025, 3:58:24 AM
Unposting Date: Jun 14, 2025, 1:29:00 PM
Master Skills List: Digital
Job Category: Full Time
Posted 1 week ago
15.0 years
6 - 7 Lacs
Bengaluru
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Position Summary:
In Assurance, there is a huge focus on data-driven audit. Audit teams are moving away from sample-based audits to full-data audits by leveraging EY audit platforms. The GDS Assurance Data Analytics team plays a crucial role in helping EY audit teams leverage data for audit. This team works with onshore audit team members and clients to identify the right data required for audit, extract data from client ERP systems, transform it, and make it ready for audit teams to analyze. This team requires a Data Analytics Leader who has both Assurance and technology skill sets and who can work with Area/Global Data Analytics Leaders to bring standardization, drive automation, centralize data delivery from GDS, and drive growth in Data Analytics delivery.

Essential Functions of the Job:
The EY GDS Assurance Data Analytics Leader will:
Partner with Area and Regional Data Analytics Delivery Leaders to identify opportunities for data delivery from GDS and grow the business.
Standardize the data delivery process across areas by identifying best practices from each area, identifying opportunities for improvement, and putting together a consistent way of delivering.
Manage stakeholders across areas/regions, deliver with quality, ensure proper reviews, meet TAT requirements, and gain customer satisfaction.
Lead and manage a growing team; ensure proper business and technical training; manage career aspirations and progression.
Automate repeatable manual activities by developing automation solutions, reducing turnaround time and driving efficiencies.
Transform Data Analytics delivery by reimagining processes and developing new solutions using AI, GenAI and cloud-based technologies.
Identify and grow Data Analytics opportunities in SSLs like FAAS, CCaSS and Forensics.

Analytical/Decision Making Responsibilities:
This role leads a large, distributed team across geographies and time zones and manages stakeholder expectations. It requires a growth mindset to identify new opportunities for growth, explore the art of the possible in new areas, and set up teams to deliver. Data delivery is seasonal work; during busy seasons we can receive huge volumes of orders in a short span of 2-3 weeks. The Data Analytics Leader needs to make sure the team is properly trained for the busy season, appropriately motivated, and fully equipped to turn the data around quickly.

Knowledge and Skills Requirements:
This role requires a combination of technical and audit experience. It requires experience in technologies like Alteryx, SQL, Power BI, ERPs, Azure, and AI/GenAI, as well as experience handling large volumes of Data Analytics delivery, managing senior-level stakeholders, meeting SLAs, and ensuring delivery.

Supervision Responsibilities:
The Data Analytics Leader manages a large, globally distributed Data Analytics team with multiple Senior Managers/Managers as direct reports.
It requires the individual to grow the business, hire talent from the market, ensure proper training, deliver Data Analytics projects, and manage stakeholders. This position reports to the GDS Assurance Digital Leader in GDS Assurance.

Job Requirements:
Education: B.E/B.Tech/MCA
Experience:
15+ years of experience (demonstrated competence, depth, and breadth) in leadership roles in technology consulting/delivery/product development
3+ years of experience in a Data Analytics delivery role, especially leading large Data Analytics delivery teams
Strong technical experience in AI, Azure, Databricks, Alteryx, SQL and Power BI

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru
On-site
Company Description
The Bosch Group is a leading global supplier of technology and services in the areas of Automotive Technology, Industrial Technology, Consumer Goods, and Energy and Building Technology. In India, the Group operates through nine companies with a combined strength of over 30,000 associates, which includes around 14,000 research and development associates.
Bosch Automotive Electronics India Pvt. Ltd. (RBAI) is a 100% subsidiary of Robert Bosch GmbH, established at the right time to cater to the demands of the future Indian market. Established in 2009, it started out manufacturing Electronic Control Units and has, on average, added one new product every year: Antenna and Immobilizer in 2011, a wide range of BCMs since 2012, Electronic Power Steering control units from 2013, and Voltage Regulators in 2014. Over the last 7 years of its existence, the company has grown at over 44% CAGR, which is remarkable considering it was established during the peak of a recession. The product portfolio of Bosch Automotive Electronics Pvt. Ltd. spans both Automotive and Non-Automotive business, catering to local as well as global demands. Products from RBAI fulfil 94% of local demand, and 72% of our sales are exports covering most of the global market.
We invite promising and dynamic professionals for a long-term and rewarding career with Bosch.

Job Description
As a Data Engineer in Operations, you will work on the operational management, monitoring, and support of scalable data pipelines running in Azure Databricks, Hadoop and Radium. You will ensure the reliability, performance, and availability of data workflows and maintain production environments. You will collaborate closely with data engineers, architects, and platform teams to implement best practices in data pipeline operations and incident management to ensure data availability and data completeness.

Primary responsibilities:
Operational support and incident management for Azure Databricks, Hadoop, and Radium data pipelines.
Collaborating with data engineering and platform teams to define and enforce operational standards, SLAs, and best practices.
Designing and implementing monitoring, alerting, and logging solutions for Azure Databricks pipelines.
Coordinating with central teams to ensure compliance with organizational operational standards and security policies.
Developing and maintaining runbooks, SOPs, and troubleshooting guides for pipeline issues.
Managing the end-to-end lifecycle of data pipeline incidents, including root cause analysis and remediation.
Overseeing pipeline deployments, rollbacks, and change management using CI/CD tools such as Azure DevOps.
Ensuring data quality and validation checks are effectively monitored in production.
Working closely with platform and infrastructure teams to address pipeline and environment-related issues.
Providing technical feedback and mentoring junior operations engineers.
Conducting peer reviews of operational scripts and automation code.
Automating manual operational tasks using Scala and Python scripts.
Managing escalations and coordinating critical production issue resolution.
Participating in post-mortem reviews and continuous improvement initiatives for data pipeline operations.

Qualifications
Bachelor’s degree in Computer Science, Computer Engineering, or a relevant technical field
3+ years’ experience in data engineering, ETL tools, and working with large-scale data sets in operations.
Proven experience with cloud platforms, particularly Azure Databricks.
Minimum 3 years of hands-on experience working with distributed cluster environments (e.g., Spark clusters).
Strong operational experience in managing and supporting data pipelines in production environments.

Additional Information
Key Competencies:
Experience in Azure Databricks operations or data pipeline support.
Understanding of Scala/Python programming for troubleshooting in Spark environments.
Hands-on experience with Delta Lake, Azure Data Lake Storage (ADLS), DBFS, and Azure Data Factory (ADF).
Solid understanding of distributed data processing frameworks and streaming data operations.
Understanding and hands-on usage of Kafka as a message broker.
Experience with Azure SQL Database and cloud-based data services.
Strong skills in monitoring tools like Splunk, ELK and Grafana, alerting frameworks, and incident management.
Experience working with CI/CD pipelines using Azure DevOps or equivalent.
Excellent problem-solving, investigative, and troubleshooting skills in large-scale data environments.
Experience defining operational SLAs and implementing proactive monitoring solutions.
Familiarity with data governance, security, and compliance best practices in cloud data platforms.
Strong communication skills and ability to work independently under pressure.

Soft Skills:
Good communication skills; extensive use of MS Teams
Experience using Azure Boards and JIRA
Decent level of English as a business language
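As a small illustration of the job-monitoring and alerting duties above, here is a sketch that polls the Databricks Jobs REST API (2.1) for recently completed runs and flags failures. The host/token environment variables and the alert action are assumptions; a real setup would forward alerts to Splunk/ELK/Grafana or an on-call channel rather than print them.

```python
# Sketch: a lightweight health check over Databricks job runs.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-xxxx.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]

def failed_runs(limit: int = 25) -> list[dict]:
    """Return recently completed runs whose result state is not SUCCESS."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"completed_only": "true", "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    runs = resp.json().get("runs", [])
    return [r for r in runs
            if r.get("state", {}).get("result_state") not in (None, "SUCCESS")]

if __name__ == "__main__":
    for run in failed_runs():
        # In practice, page an on-call channel or push to the monitoring stack.
        print(f"ALERT: run {run['run_id']} ended {run['state']['result_state']}")
```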
Posted 1 week ago
Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.
The average salary range for Databricks professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
In the field of Databricks, a typical career path may include:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
In addition to Databricks expertise, other skills that are often expected or helpful include:
- Apache Spark
- Python/Scala programming
- Data modeling
- SQL
- Data visualization tools
As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!