
6639 Databricks Jobs - Page 11

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Utilizing expertise in Power Apps, Power Pages, Power Automate, and Power Virtual Agents development.
Designing and creating custom business apps, such as Canvas Apps, SharePoint Form Apps, Model-Driven Apps, and Portals/Power Pages portals.
Implementing various Power Automate flows, including Automated, Instant, Business Process Flow, and UI Flows.
Collaborating with backend teams to integrate Power Platform solutions with SQL Server and SharePoint Online (SPO).
Demonstrating strong knowledge of Dataverse, including security and permission levels.
Developing and utilizing custom connectors in Power Platform solutions.
Creating and consuming functions/APIs to retrieve and update data from the database.
Managing managed solutions to ensure seamless deployment and version control.
Experience with Azure DevOps CI/CD deployment pipelines.
Monitoring and troubleshooting any performance bottlenecks.
Any coding/programming experience is a plus.
Excellent communication skills.

Requirements
6-9 years of relevant experience.
Strong hands-on work experience with Power Pages and Model-Driven Apps with Dataverse.
Experience with Azure DevOps CI/CD deployment pipelines.
Good communication skills.

Mandatory Skill Sets: Strong hands-on work experience with Power Pages and Model-Driven Apps with Dataverse.
Preferred Skill Sets: Experience with Azure DevOps CI/CD deployment pipelines.
Years of Experience Required: 5 to 9 years
Education Qualification: Bachelor's degree in Computer Science, Engineering, or a related field.
Degrees/Field of Study Required: Bachelor of Engineering, Bachelor of Technology
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: Microsoft Power Apps
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date: (not specified)

Posted 3 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: AI/ML Engineer
Location: TechM Blr ECITY
Years of Experience: 10+ years

Job Summary
Chevron invites applications for the role of AI/ML Engineer within our Enterprise AI team in India. This position is integral to designing and developing AI/ML models that significantly accelerate the delivery of business value. We are looking for a Machine Learning Engineer who brings expertise, an innovative attitude, and excitement for solving complex problems with modern technologies and approaches. We seek individuals with a passion for exploring, innovating, and delivering Data Science solutions that provide immense value to our business.

Responsibilities
Design and develop AI/ML models to enhance business processes and deliver actionable insights.
Implement machine learning frameworks and libraries, ensuring robust model performance and scalability.
Collaborate with cross-functional teams to integrate AI solutions into existing workflows.
Manage the lifecycle of machine learning models, including training, validation, deployment, and monitoring.
Develop and maintain custom APIs for machine learning models to facilitate training and inference.
Utilize Azure services to build and deploy machine learning pipelines effectively.
Engage with technical experts to identify opportunities for applying machine learning and analytics.
Communicate findings and insights clearly to stakeholders at all levels.

Mandatory Skills
Minimum 5 years of experience in object-oriented programming in Python.
Proven experience with Azure IaaS services, particularly in building machine learning pipelines using Azure Machine Learning and/or Fabric.
Strong understanding of software engineering principles, including source control, architecture, and testing methodologies.
Experience with containers and container management (Docker, Kubernetes).
Proficiency in orchestrating large-scale ML/DL jobs and leveraging modern data platform tooling.
Experience in designing custom APIs for machine learning models.
Knowledge of mathematics (linear algebra, probability, statistics) and algorithms.
Ability to communicate effectively in both oral and written forms.

Preferred Skills
Experience implementing machine learning frameworks such as MLflow (a brief sketch follows this listing).
Familiarity with data engineering and transformation tools like Azure Databricks, Spark, and Azure ADF.
History of working with large-scale model optimization and neural network hyperparameter tuning.
Experience with unstructured data using Azure Cognitive Services and/or Computer Vision.
Understanding of enterprise SaaS complexities, including security, scalability, and production support.

Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
7-10 years of relevant experience in AI/ML engineering.
Strong problem-solving skills and a methodical approach to software design and development.
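Since MLflow appears in the preferred skills above, here is a minimal, hedged sketch of the kind of experiment tracking such a role involves. The experiment name, toy dataset, and hyperparameters are invented for illustration, and it assumes MLflow's default local tracking store rather than any particular workspace:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative toy data standing in for a real business dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-model-demo")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    # Track parameters, the held-out metric, and the fitted model for later comparison.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")
```

Runs logged this way can then be compared in the MLflow UI, which is the usual starting point for the model lifecycle management the posting describes.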

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities
Designs, implements and maintains reliable and scalable data infrastructure.
Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data.
Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud.
Mentors and shares knowledge with the team through design reviews, discussions and prototypes.
Works with customers to deploy, manage, and audit standard processes for cloud products.
Adheres to and advocates for software and data engineering standard processes (e.g. technical design and review, unit testing, monitoring, alerting, source control, code review and documentation).
Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI/CD pipelines.
Follows site-reliability engineering standard processes for service reliability: on-call rotations for services they maintain, and responsibility for defining and maintaining SLAs.
Designs, builds, deploys and maintains infrastructure as code. Containerizes server deployments.
Works as part of a cross-disciplinary team, closely with other data engineers, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup.

Job Requirements
Education: Bachelor's or higher degree in Computer Science, Engineering, Information Systems or other quantitative fields.
Experience: 8 to 12 years of relevant experience, with deep, hands-on experience designing, planning, productionizing, maintaining and documenting reliable and scalable data infrastructure and data products in complex environments.
Hands-on experience with:
Spark for data processing (batch and/or real-time).
Configuring Delta Lake on Azure Databricks (illustrated in the sketch after this listing).
Languages: SQL, PySpark, Python.
Cloud platforms: Azure, including Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have).
Designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery and high availability.
Azure DevOps (or similar tools) for source control and building CI/CD pipelines.
Experience designing and implementing large-scale distributed systems.
Customer management and front-ending skills, with the ability to lead large organizations through influence.

Desirable Criteria
Strong customer management: owns the delivery for the Data track with customer stakeholders.
Continuous learning and improvement attitude.

Key Behaviors
Empathetic: cares about our people, our community and our planet.
Curious: seeks to explore and excel.
Creative: imagines the extraordinary.
Inclusive: brings out the best in each other.

Mandatory Skill Sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark SQL.
Preferred Skill Sets ('good to have'): Cosmos DB, data modeling, Databricks, Power BI, experience building analytics solutions with SAP as a data source for ingestion pipelines.
Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities. He/she should have sound knowledge of optimizing workloads.
Years of Experience Required: 8 to 12 years of relevant experience.
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Field of Study Required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration, Master of Engineering
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: Apache Synapse
Optional Skills: Microsoft Power Business Intelligence (BI)
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date: (not specified)
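The hands-on requirements above center on batch ingestion with Spark and Delta Lake on Azure Databricks. A minimal, hedged PySpark sketch of that pattern follows; the storage paths and columns are hypothetical, and it assumes a Databricks cluster where Delta support and ADLS Gen2 access are preconfigured:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-ingest").getOrCreate()

# Hypothetical ADLS Gen2 paths; on Databricks these are typically abfss:// URIs.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales_delta/"

# Batch ingestion: read raw CSV, standardize types, derive a partition column.
sales = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("order_year", F.year("order_date"))
)

# Write a partitioned Delta table for downstream Synapse/Databricks consumers.
(sales.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_year")
      .save(curated_path))
```

Partitioning by a coarse column like year is one common way to address the data-distribution and cost-management concerns the posting raises, though the right partition key depends on query patterns.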

Posted 3 days ago

Apply

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Sigmoid:
Sigmoid empowers enterprises to make smarter, data-driven decisions by blending advanced data engineering with AI consulting. We collaborate with some of the world's leading data-rich organizations across sectors such as CPG-retail, BFSI, life sciences, manufacturing, and more to solve complex business challenges. Our global team specializes in cloud data modernization, predictive analytics, generative AI, and DataOps, supported by 10+ delivery centers and innovation hubs, including a major global presence in Bengaluru and operations across the USA, Canada, UK, Netherlands, Poland, Singapore, and India.

Recognized as a leader in the data and analytics space, Sigmoid is backed by Peak XV Partners and has consistently received accolades for innovation and rapid growth. Highlights include being named a ‘Leader’ in ISG’s Specialty Analytics Services for Supply Chain (2024), a two-time ‘India Future Unicorn’ by Hurun India, and a four-time honoree on both the Inc. 500 and Deloitte Technology Fast 500 lists.

Director - Data Analytics:
This role is a leadership position in the data science group at Sigmoid. The ideal person will come from a services industry background with a good mix of experience in solving complex business intelligence and data analytics problems, team management, delivery management and customer handling. This position offers an immense opportunity to work on challenging business problems faced by Fortune 500 companies across the globe. The role is part of the leadership team and carries accountability for a portion of the team and its customers. The person is expected to contribute to developing the practice with relevant domain experience, nurturing the talent in the team and working with customers to grow accounts.

Responsibilities Include
Build trust with senior stakeholders through strategic insight and delivery credibility.
Translate ambiguous client business problems into BI solutions and implement them.
Oversee multi-client BI and analytics programs with competing priorities and timelines, while collaborating with Data Engineering and other functions on a common goal.
Ensure scalable, high-quality deliverables aligned with business impact.
Help recruit and onboard team members; directly manage 15-20 team members.
Own customer deliverables and ensure, along with project managers, that project schedules are in line with the expectations set with customers.

Experience and Qualifications
15+ years of overall experience with a minimum of 10 years in data analytics execution.
Strong organizational and multitasking skills with the ability to balance multiple priorities.
Highly analytical, with the ability to collate, analyze and present data and drive clear insights that lead to decisions which improve KPIs.
Ability to effectively communicate and manage relationships with senior management, other departments and partners.
Mastery of BI tools (Power BI, Tableau, Qlik), backend systems (SQL, ETL frameworks) and data modeling.
Experience with cloud-native platforms (Snowflake, Databricks, Azure, AWS) and data lakes.
Expertise in managing compliance, access controls, and data quality frameworks is a plus.
Experience working in the CPG, Supply Chain, Manufacturing and Marketing domains is a plus.
Strong problem-solving skills and the ability to prioritize conflicting requirements.
Excellent written and verbal communication skills and the ability to succinctly summarize key findings.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: Analytics Engineer

We are seeking a talented, motivated and self-driven professional to join the HH Digital, Data & Analytics (HHDDA) organization and play an active role in the Human Health transformation journey to become the premier “Data First” commercial biopharma organization.

As an Analytics Engineer, you will be part of the HHDDA Commercial Data Solutions team, providing technical and data expertise for the development of analytical data products that enable data science and analytics use cases. In this role, you will create and maintain data assets/domains used in the commercial/marketing analytics space, developing best-in-class data pipelines and products and working closely with data product owners to translate data product requirements and user stories into development activities throughout all phases of design, planning, execution, testing, deployment and delivery.

Your specific responsibilities will include:
Hands-on development of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices.
Enabling data science and analytics teams to drive data modeling and feature engineering activities aligned with business questions, utilizing datasets in an optimal way.
Developing deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for.
Building data products based on automated data models, aligned with use case requirements, and advising data scientists, analysts and visualization developers on how to use these data models.
Developing analytical data products for reusability, governance and compliance by design.
Aligning with organization strategy and implementing a semantic layer for analytics data products.
Supporting data stewards and other engineers in maintaining data catalogs, data quality measures and governance frameworks (a small illustration follows this listing).

Education
B.Tech/B.S., M.Tech/M.S. or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field.

Required Experience
5+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets).
High proficiency in SQL, Python and AWS.
Good understanding and comprehension of the requirements provided by the Data Product Owner and Lead Analytics Engineer.
Experience creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders.
Experience with feature engineering.
Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.).
Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot and low-code tools (e.g. Dataiku).
Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders.
Experience in analytics use cases of pharmaceutical products and vaccines.
Experience in market analytics and related use cases.

Preferred Experience
Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines.
Experience with Agile ways of working, leading or working as part of scrum teams.
Certifications in AWS and/or modern data technologies.
Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors.
Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers and business stakeholders.
Experience with data visualization technologies (e.g., Power BI).

Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.

We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Relocation, VISA Sponsorship, Travel Requirements, Shift, Valid Driving License, Hazardous Material(s): not specified
Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model
Preferred Skills: (not specified)
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply no later than the day before the job posting end date.
Requisition ID: R335382
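One of the responsibilities above is maintaining data quality measures for analytics data products. As a small, hedged illustration in Python, here is a rule-based check over a pandas DataFrame; the column names and rules are invented for the example and would be driven by the actual data contracts in practice:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []

    # Completeness: key identifiers must never be null.
    for col in ("patient_id", "product_code"):  # hypothetical key columns
        nulls = int(df[col].isna().sum())
        if nulls:
            failures.append(f"{col}: {nulls} null value(s)")

    # Uniqueness: one row per (patient_id, claim_date) is assumed here.
    dupes = int(df.duplicated(subset=["patient_id", "claim_date"]).sum())
    if dupes:
        failures.append(f"{dupes} duplicate claim row(s)")

    # Validity: monetary amounts should be non-negative.
    negative = int((df["claim_amount"] < 0).sum())
    if negative:
        failures.append(f"claim_amount: {negative} negative value(s)")

    return failures

if __name__ == "__main__":
    batch = pd.DataFrame({
        "patient_id": [1, 2, 2, None],
        "product_code": ["A1", "A1", "A1", "B2"],
        "claim_date": ["2025-01-01", "2025-01-02", "2025-01-02", "2025-01-03"],
        "claim_amount": [120.0, -5.0, 80.0, 60.0],
    })
    for problem in run_quality_checks(batch):
        print("FAILED:", problem)
```

Checks like these are typically wired into the pipeline so a failing batch is quarantined before it reaches downstream data products.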

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: Specialist - Data Visualization

Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients, leveraging digital, data and analytics.

Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization to be the premier data-driven company. As a Specialist in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities
Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions. Ensure adherence to data governance, privacy, and security best practices.
Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience and Skills
5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
Experience in pharmaceutical commercial analytics, including field force effectiveness, customer engagement, and market performance assessment, as well as web, campaign, and digital engagement analytics.
Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.

We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Relocation, VISA Sponsorship, Travel Requirements, Shift, Valid Driving License, Hazardous Material(s): not specified
Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design
Preferred Skills: (not specified)
Job Posting End Date: 04/30/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply no later than the day before the job posting end date.
Requisition ID: R335129

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Skill: AWS Databricks Developer
Experience: 5-12 years
Notice Period: Immediate to 30 days
Mandatory Skills: AWS, Databricks, Python, SQL

Responsibilities:
Designing and implementing scalable data pipelines using Databricks and Apache Spark (a sketch follows this listing).
Proficiency in programming languages such as Python and SQL.
Analysing and processing large datasets to uncover actionable insights.
Integrating data flows across AWS services to ensure seamless connectivity.
Collaborating with cross-functional teams to streamline data operations and workflows.
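As a rough illustration of the first responsibility, here is a minimal, hedged PySpark pipeline on AWS. The bucket names and schema are hypothetical, and it assumes a Databricks cluster with S3 access (or a local Spark build with the S3A connector configured):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical S3 locations for the raw and curated zones.
raw = "s3a://example-raw-bucket/orders/"
curated = "s3a://example-curated-bucket/orders_by_day/"

orders = (
    spark.read.option("header", True).csv(raw)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)  # drop obviously bad rows
)

# Daily revenue per product: the kind of aggregate a downstream team might consume.
daily = (
    orders.groupBy(F.to_date("order_ts").alias("order_date"), "product_id")
          .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(curated)
```

In a production pipeline the same steps would typically be parameterized by run date and scheduled by an orchestrator rather than run ad hoc.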

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As an Ignition Application Administrator at EY, you will be a key member of the Enterprise Services Data team. Your role will involve collaborating closely with peer platform administrators, developers, Product/Project Seniors, and Customers to administer the existing analytics platforms. While focusing primarily on Ignition, you will also be cross-trained on other tools such as Qlik Sense, Tableau, Power BI, SAP BusinessObjects, and more. Your willingness to tackle complex problems and find innovative solutions will be crucial in this role.

In this position, you will have the opportunity to work in a start-up-like environment within a Fortune 50 company, driving digital transformation and leveraging insights to enhance products and services. Your responsibilities will include installing and configuring Ignition, monitoring the platform, troubleshooting issues, managing data source connections, and contributing to the overall data platform architecture and strategy. You will also be involved in integrating Ignition with other ES Data platforms and Business Unit installations.

To succeed in this role, you should have at least 3 years of experience in customer success or a customer-facing engineering capacity, along with expertise in large-scale implementations and complex solutions environments. Experience with the Linux command line, cloud operations, Kubernetes application deployment, and cloud platform architecture is essential. Strong communication skills, both interpersonal and written, are also key for this position.

Ideally, you should hold a BA/BS degree in technology, computing, or a related field, although relevant work experience may be considered in place of formal education. The position may require flexibility in working hours, including weekends, to meet deadlines and fulfill application administration obligations.

Join us at EY and contribute to building a better working world by leveraging data, technology, and your unique skills to drive innovation and growth for our clients and society.

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have at least 2 years of professional work experience implementing data pipelines using Databricks and a data lake. A minimum of 3 years of hands-on programming experience in Python within a cloud environment (preferably AWS) is necessary for this role. Two years of professional work experience with real-time streaming systems such as Event Grid and Event topics would be highly advantageous (a streaming sketch follows this listing).

You must possess expert-level knowledge of SQL to write complex, highly optimized queries for processing large volumes of data effectively. Experience developing conceptual, logical, and/or physical database designs using tools like ErWin, Visio, or Enterprise Architect is expected. A minimum of 2 years of hands-on experience working with databases like Snowflake, Redshift, Synapse, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra is required. Knowledge of or experience with architectural best practices for building data lakes is a must for this position.

Strong problem-solving and troubleshooting skills are necessary, along with the ability to make sound judgments independently. You should be capable of working independently and providing guidance to junior data engineers.

If you meet the above requirements and are ready to take on this challenging role, we look forward to your application.

Warm regards,
Rinka Bose
Talent Acquisition Executive
Nivasoft India Pvt. Ltd.
Mobile: +91-9632249758 (India) | 732-334-3491 (U.S.A.)
Email: rinka.bose@nivasoft.com | Web: https://nivasoft.com/
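The posting names Event Grid and Event topics for real-time streaming; as a neutral stand-in (not the Azure Event Grid API itself), here is a minimal, hedged Spark Structured Streaming sketch that consumes a Kafka-compatible topic and writes micro-batches to a data lake. The broker address, topic, and paths are hypothetical, and it assumes the spark-sql-kafka connector is on the classpath:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("value", DoubleType()),
])

# Hypothetical Kafka-compatible endpoint; Azure Event Hubs also exposes one.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "events-topic")
    .load()
)

# Kafka delivers bytes; decode the JSON payload into typed columns.
events = raw.select(
    F.from_json(F.col("value").cast("string"), schema).alias("e")
).select("e.*")

# Write each micro-batch to the lake, tracking progress via a checkpoint.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-lake/events/")
    .option("checkpointLocation", "s3a://example-lake/_checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once progress tracking across restarts, which is the usual reliability concern in this kind of pipeline.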

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Adobe is seeking dedicated Product Analytics Experts to join our growing team in Noida. In this role, you will play a key part in driving the success of Adobe's Document Cloud products by using your expertise to understand user behavior, identify growth opportunities, and help drive data-driven decisions.

Responsibilities:
Analyze large datasets to identify trends, patterns, and key performance indicators.
Develop and maintain SQL queries to extract, transform, and load data from various sources, including Hadoop and cloud-based platforms like Databricks.
Develop compelling data visualizations using Power BI and Tableau to communicate insights seamlessly to PMs, Engineering, and leadership.
Conduct A/B testing and campaign analysis, using statistical methods to measure and evaluate the impact of product changes (a small illustration follows this listing).
Partner with cross-functional teams (product managers, engineers, marketers) to translate data into actionable insights and drive strategic decision-making.
Independently own and manage projects from inception to completion, ensuring timely delivery and high-quality results.
Effectively communicate analytical findings to stakeholders at all levels, both verbally and in writing.

Qualifications:
8-12 years of relevant experience in solving deep analytical challenges within a product or data-driven environment.
Strong proficiency in advanced SQL, with experience working with large-scale datasets.
Expertise in data visualization tools such as Power BI and Tableau.
Hands-on experience in A/B testing, campaign analysis, and statistical methodologies.
Working knowledge of scripting languages like Python or R, with a foundational understanding of machine learning concepts.
Experience with Adobe Analytics is a significant plus.
Good communication, presentation, and interpersonal skills.
A collaborative mindset with the ability to work effectively within cross-functional teams.
Strong analytical and problem-solving skills with a passion for data-driven decision making.

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more.

Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
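The A/B testing responsibility above usually comes down to comparing conversion rates between a control and a variant. A small, hedged sketch using a two-proportion z-test follows; the counts are invented, and the test comes from the statsmodels library:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results: conversions and exposures per variant.
conversions = [1850, 2010]     # control, treatment
exposures = [50_000, 50_200]

stat, p_value = proportions_ztest(conversions, exposures)

control_rate = conversions[0] / exposures[0]
treatment_rate = conversions[1] / exposures[1]

print(f"control: {control_rate:.4f}, treatment: {treatment_rate:.4f}")
print(f"z = {stat:.3f}, p = {p_value:.4f}")
# A p-value below the pre-registered threshold (commonly 0.05) suggests the
# variant's lift is unlikely to be chance; sample size should be fixed up front
# to avoid peeking, which inflates false positives.
```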

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Zenoti provides an all-in-one, cloud-based software solution for the beauty and wellness industry. Our solution allows users to seamlessly manage every aspect of the business in a comprehensive mobile solution: online appointment bookings, POS, CRM, employee management, inventory management, built-in marketing programs and more. Zenoti helps clients streamline their systems and reduce costs, while simultaneously improving customer retention and spending. Our platform is engineered for reliability and scale, and harnesses the power of enterprise-level technology for businesses of all sizes.

Zenoti powers more than 30,000 salons, spas, medspas and fitness studios in over 50 countries. This includes a vast portfolio of global brands, such as European Wax Center, Hand & Stone, Massage Heights, Rush Hair & Beauty, Sono Bello, Profile by Sanford, Hair Cuttery, CorePower Yoga and TONI&GUY. Our recent accomplishments include surpassing a $1 billion unicorn valuation, being named Next Tech Titan by GeekWire, raising an $80 million investment from TPG, and ranking as the 316th fastest-growing company in North America on Deloitte’s 2020 Technology Fast 500™. We are also proud to be recognized as a Great Place to Work Certified™ for 2021-2022, reaffirming our commitment to empowering people to feel good and find their greatness. To learn more about Zenoti visit: https://www.zenoti.com

What will I be doing?
Design, architect, develop and maintain components of Zenoti.
Collaborate with a team of product managers, developers, and quality assurance engineers to define, design and deploy new features and functionality.
Build software that ensures the best possible usability, performance, quality and responsiveness of features.
Work in a team following agile development practices (SCRUM).
Learn to scale your features to handle 2x to 4x growth every year and manage code that has to deal with millions of records and terabytes of data.
Release new features into production every month and get real feedback from thousands of customers to refine your designs.
Be proud of what you work on, and obsess about the quality of your work. Join our team to do the best work of your career.

What skills do I need?
6+ years’ experience developing ETL solutions and data pipelines, with expertise in processing trillions of records efficiently.
6+ years’ experience with SQL Server, T-SQL and stored procedures, and a deep understanding of SQL performance tuning for large-scale data processing.
Strong understanding of ETL concepts, data modeling, and data warehousing principles, with hands-on experience building data pipelines using Python.
Extensive experience with Big Data platforms including Azure Fabric, Azure Databricks, Azure Data Factory (ADF), Amazon Redshift, Apache Spark, and Delta Lake (a Delta upsert sketch follows this listing).
Expert-level SQL skills for complex data transformations, aggregations, and query optimization to handle trillions of records with optimal performance.
Hands-on experience creating data lakehouse architectures and implementing data governance and security best practices across Big Data platforms.
Strong logical, analytical, and problem-solving skills, with the ability to design and optimize distributed computing clusters for maximum throughput.
Excellent communication skills for cross-functional collaboration and the ability to work in a fast-paced environment with changing priorities.
Experience with cloud-native data solutions including Azure Data Lake, Azure Synapse, and containerization technologies (Docker, Kubernetes).
Proven track record of implementing CI/CD pipelines for data engineering workflows, automating data pipeline deployment, and monitoring performance at scale.

Benefits
Attractive compensation.
Comprehensive medical coverage for yourself and your immediate family.
An environment where wellbeing is a high priority: access to regular yoga, meditation, breathwork, nutrition counseling, stress management, and inclusion of family in most benefit-awareness sessions.
Opportunities to be part of a community and give back: social activities are part of our culture; you can look forward to regular engagement, social work, and community give-back initiatives.

Zenoti provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
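Several of the skills above revolve around Delta Lake and incremental loads at scale. Here is a compact, hedged sketch of a Delta upsert (MERGE) in PySpark; the table path and keys are hypothetical, and it assumes an existing Delta table plus either a Databricks runtime or the open-source delta-spark package:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("appointments-upsert").getOrCreate()

target_path = "abfss://curated@examplelake.dfs.core.windows.net/appointments/"  # hypothetical

# Incremental batch of changed rows arriving from the ETL pipeline.
updates = spark.createDataFrame(
    [("a1", "booked", 45.0), ("a2", "cancelled", 0.0)],
    ["appointment_id", "status", "price"],
)

target = DeltaTable.forPath(spark, target_path)

# MERGE applies updates and inserts atomically, without rewriting the whole table.
(target.alias("t")
       .merge(updates.alias("s"), "t.appointment_id = s.appointment_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```

This is the standard pattern for keeping a lakehouse table consistent under late-arriving or corrected records, with Delta's transaction log providing the ACID guarantees the posting mentions.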

Posted 3 days ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

Job Description:
As an integral part of the data engineering team, you will be responsible for onboarding various data sources by creating ETL pipelines. You will provide resolutions and/or workarounds to data pipeline related queries and issues as appropriate. Ensuring that the ingestion pipelines powering the lakehouse and data mesh are up and running will be a key part of your role. You will also enable end users of the data ecosystem with query debugging and optimization.

Collaboration with different teams to understand and resolve data availability and consistency issues is essential. Your efforts will ensure that teams consuming data can do so without spending the majority of their time on acquiring, cleaning, and transforming it. Additionally, you will help other teams become more independent with data analysis and data quality by coaching them on tools and practices. Continuous improvement in technical knowledge and problem-resolution skills will be expected, with a commitment to strive for excellence.

You should apply if you:
Have 1-3 years of experience in ETL and data engineering.
Can read and write complex SQL.
Have prior experience in Python and Spark.
Are familiar with data modeling, data warehousing, and the lakehouse (utilizing Databricks).
Have experience working on cloud services, preferably AWS.
Are dedicated to continuous learning and self-improvement.
Can collaborate effectively as a team player, with strong analytical, communication, and troubleshooting skills.

Key Skills: Databricks, ETL, AWS
Preferred Skills: MySQL, Python

Posted 3 days ago

Apply

7.0 - 11.0 years

0 Lacs

Gautam Buddha Nagar, Uttar Pradesh

On-site

As a member of the Optum Global Analytics Team at UnitedHealth Group, you will be involved in developing advanced analytics solutions across various verticals to enhance health outcomes and drive business growth. Your work will directly impact health equity on a global scale by connecting individuals with the care, pharmacy benefits, data, and resources necessary for improved well-being.

Your primary responsibilities will include designing and deploying models, ensuring responsible use of AI, detecting and mitigating bias, conducting business analysis, creating analytical findings, engaging in client communications, mentoring junior data scientists, leading knowledge-sharing sessions, overseeing technical deliveries, performing quality checks, collaborating with senior team members to develop new capabilities, and continuously building and applying technical and functional skills.

To excel in this role, you must possess a bachelor's degree and have at least 7 years of experience in analytics and machine learning projects within the Business Intelligence or Business Analysis space. Proficiency in analytics tools such as R, Python, SQL, Databricks, and cloud technologies is essential. Additionally, you should have a strong understanding of predictive and prescriptive analytics techniques, problem-solving skills, statistics, mathematics, data technologies, data structures, Natural Language Processing, Gen-AI and Large Language Models, and excellent communication abilities.

Preferred qualifications include certification in Azure Cloud (AZ-900), familiarity with Business Intelligence tools like Tableau and Power BI, and exceptional time management, communication, decision-making, and presentation skills.

If you are passionate about driving impactful solutions, bring a problem-solving approach, and are committed to continuous learning and growth, we welcome you to apply for this position at our Noida, Uttar Pradesh location. Join us in making a difference in the lives of millions as we strive to advance healthcare and well-being globally through innovative analytics solutions.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

Genpact (NYSE: G) is a global professional services and solutions firm committed to delivering outcomes that shape the future. With over 125,000 employees spread across more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and the aspiration to create lasting value for our clients. Driven by our purpose, the relentless pursuit of a world that works better for people, we cater to and transform leading enterprises, including the Fortune Global 500, leveraging our profound business and industry expertise, digital operations services, and proficiency in data, technology, and AI.

We are currently seeking applications for the position of Lead Consultant - Databricks Senior Engineer!

As a Lead Consultant - Databricks Senior Engineer, your responsibilities will include:
Working closely with software designers to ensure adherence to best practices, and suggesting improvements to code proficiency and maintainability.
Occasional customer interaction to analyze user needs and determine technical requirements.
Designing, building, and maintaining scalable and reliable data pipelines using Databricks.
Developing high-quality code with a focus on performance, scalability, and security.
Collaborating with cross-functional teams to understand data requirements and deliver solutions that align with business needs.
Implementing data transformations and intricate algorithms within the Databricks environment.
Optimizing data processing and refining data architecture to enhance system efficiency and data quality.
Mentoring junior engineers and contributing to the establishment of best practices within the team.
Staying updated with emerging trends and technologies in data engineering and cloud computing.

Minimum Qualifications:
Experience in data engineering or a related field.
Strong hands-on experience with Databricks, encompassing the development of code, pipelines, and data transformations.
Proficiency in at least one programming language (e.g., Python, Scala, Java).
In-depth knowledge of Apache Spark and its integration within Databricks.
Experience with cloud services (AWS, Azure, or GCP) and their data-related products.
Familiarity with CI/CD practices, version control (Git), and automated testing.
Exceptional problem-solving abilities, with the capacity to work both independently and as part of a team.
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related technical field.

If you are enthusiastic about leveraging your skills and expertise as a Lead Consultant - Databricks Senior Engineer, join us at Genpact and be a part of shaping a better future for all.

Location: India - Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 30, 2024, 5:05:42 AM
Unposting Date: Jan 25, 2025, 11:35:42 PM

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Site Reliability Engineer III at JPMorgan Chase within Corporate Technology - Capital Management, you play a crucial role in shaping the future of a globally recognized organization. Your impact is direct and significant in a sphere tailored for high achievers in site reliability. You will tackle complex and wide-ranging business challenges with simple and effective solutions through code and cloud infrastructure. Your responsibilities include configuring, maintaining, monitoring, and optimizing applications and associated infrastructure. You will independently break down and enhance existing solutions iteratively, making you a key contributor to your team.

Your primary responsibilities involve driving continuous enhancement of reliability, monitoring, and alerting for mission-critical microservices. You will automate tasks to reduce manual effort, creating reliable infrastructure and tools to expedite feature development. By developing and implementing metrics for microservices, defining user journeys, SLOs, and error budgets (a small worked example follows this listing), and configuring dashboards and alerts, you ensure blameless post-mortems for permanent incident closure.

Collaboration with development teams throughout the software lifecycle is essential to enhance reliability and scale, design self-healing patterns, and implement infrastructure, configuration, and networking as code. You will work closely with software engineers to design and implement deployment approaches using automated CI/CD pipelines and promote site reliability engineering best practices. Your role involves demonstrating and advocating for a site reliability culture and practices, and leading initiatives to improve application and platform reliability and stability through data-driven analytics. Collaborating with team members to identify service level indicators, establish reasonable service level objectives, and proactively resolve issues before customer impact are critical aspects of your work. Additionally, you will act as the main point of contact during major incidents, utilizing technical expertise to swiftly identify and resolve issues while sharing knowledge within the organization.

To excel in this role, you are required to have formal training or certification in site reliability concepts along with at least 5 years of applied experience in public cloud platforms like AWS, Azure, or GCP. Proficiency in a programming language such as Python, Go, or Java/Spring Boot is necessary, with expertise in software design, coding, testing, and delivery. Experience with Kubernetes, cloud computing, and relational databases like Oracle or MySQL is preferred. You should possess excellent debugging and troubleshooting skills and familiarity with common SRE toolchains like Grafana, Prometheus, the ELK Stack, Kibana, and Jaeger. Proficiency in continuous integration and continuous delivery tools such as Jenkins, GitLab, or Terraform, and observability tools like Dynatrace, Datadog, New Relic, CloudWatch, or Splunk is also important.

Moreover, your skills should include familiarity with ETL tools like Databricks, experience with container and container-orchestration technologies such as ECS, Kubernetes, and Docker, and deep proficiency in reliability, scalability, performance, security, enterprise system architecture, and toil reduction. You should be able to identify and solve problems related to complex data structures and algorithms, troubleshoot common networking technologies and issues, and be driven to self-educate and evaluate new technologies. Teaching new programming languages to team members, contributing to large and collaborative teams, proactively recognizing roadblocks, and showing interest in technology that drives innovation are further expectations of this role.
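The SLO and error-budget responsibility above rests on simple arithmetic: the budget is the fraction of requests the SLO allows to fail. A tiny, hedged Python sketch of a monthly availability report follows; the SLO target and request counts are illustrative:

```python
def error_budget_report(slo: float, total_requests: int, failed_requests: int) -> None:
    """Print how much of the period's error budget has been consumed."""
    allowed_failures = total_requests * (1.0 - slo)  # the error budget in requests
    consumed = failed_requests / allowed_failures if allowed_failures else float("inf")

    print(f"SLO target:       {slo:.3%}")
    print(f"Observed success: {1 - failed_requests / total_requests:.3%}")
    print(f"Budget consumed:  {consumed:.1%}")
    if consumed >= 1.0:
        print("Error budget exhausted: freeze risky releases, focus on reliability.")

# Illustrative month: 99.9% SLO, 120M requests, 95k failures.
# Budget = 120,000 allowed failures, so 95,000 failures consume 79.2% of it.
error_budget_report(slo=0.999, total_requests=120_000_000, failed_requests=95_000)
```

In practice these numbers come from SLI queries against a metrics store such as Prometheus, and the "consumed" figure drives release decisions for the service.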

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Do you have in-depth experience in Nat Cat models and tools? Do you enjoy being part of a distributed team of Cat Model specialists with diverse backgrounds, educations, and skills? Are you passionate about researching, debugging issues, and developing tools from scratch?

We are seeking a curious individual to join our NatCat infrastructure development team. As a Cat Model Specialist, you will collaborate with the Cat Perils Cat & Geo Modelling team to maintain models, tools, and applications used in the NatCat costing process. Your responsibilities will include supporting model developers in validating their models, building concepts and tools for exposure reporting, and assisting in model maintenance and validation.

You will be part of the Cat & Geo Modelling team based in Zurich and Bangalore, which specializes in natural science, engineering, and statistics. The team is responsible for Swiss Re's global natural catastrophe risk assessment and focuses on advancing innovative probabilistic and proprietary modelling technology for earthquake, windstorm, and flood hazards.

Main Tasks/Activities/Responsibilities:
Conceptualize and build NatCat applications using sophisticated analytical technologies.
Collaborate with model developers to implement and test models in the internal framework.
Develop and implement concepts to enhance the internal modelling framework.
Coordinate with various teams for successful model and tool releases.
Provide user support on model- and tool-related issues.
Install and maintain the Oasis setup and contribute to the development of new functionality.
Coordinate platform setup and maintenance with third-party vendors.

About You:
Graduate or post-graduate degree in mathematics, engineering, computer science, or equivalent quantitative training.
Minimum 5 years of experience in the Cat Modelling domain.
Reliable, committed and hands-on, with experience in Nat Cat modelling.
Previous experience with catastrophe models or exposure reporting tools is a plus.
Strong programming skills in MATLAB, MS SQL, Python, PySpark, and R.
Experience consuming WCF/RESTful services.
Knowledge of Business Intelligence, reporting, and data analysis solutions.
Experience in an agile development environment is beneficial.
Familiarity with Azure services like Storage, Data Factory, Synapse, and Databricks.
Good interpersonal skills, self-driven, and able to work in a global team.
Strong analytical and problem-solving skills.

About Swiss Re:
Swiss Re is a leading provider of reinsurance, insurance, and insurance-based risk transfer solutions. With over 14,000 employees worldwide, we anticipate and manage various risks to make the world more resilient. We cover a wide range of risks, from natural catastrophes to cybercrime, offering solutions in both the Property & Casualty and Life & Health sectors. If you are an experienced professional returning to the workforce after a career break, we welcome you to apply for positions that match your skills and experience.

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

Remote

Where you’ll work: India (Remote) Engineering at GoTo We’re the trailblazers of remote work technology. We build powerful, flexible work software that empowers everyone to live their best life, at work and beyond. And blaze even more trails along the way. There’s ample room for growth – so you can blaze your own trail here too. When you join a GoTo product team, you’ll take on a key role in this process and see your work used by millions of users worldwide. Your Day to Day As a Senior Data Engineer, you will: Design and Develop Pipelines: Build robust, scalable, and efficient ETL/ELT data pipelines to process structured data from diverse sources. Big Data Processing: Develop and optimize large-scale data workflows using Apache Spark, with strong hands-on experience in building ETL pipelines. Cloud-Native Data Solutions: Architect and implement data solutions using AWS services such as S3, EMR, Lambda, and EKS. Data Governance: Manage and govern data using catalogs like Hive or Unity Catalog; ensure strong data lineage, access controls, and metadata management. Workflow Orchestration: Schedule, monitor, and orchestrate workflows using Apache Airflow or similar tools. Data Quality & Monitoring: Implement quality checks, logging, monitoring, and alerting to ensure pipeline reliability and visibility. Cross-Functional Collaboration: Partner with analysts, data scientists, and business stakeholders to deliver high-quality data for applications and enable self-service BI. Compliance & Security: Uphold best practices in data governance, security, and compliance across the data ecosystem. Mentorship & Standards: Mentor junior engineers and help evolve engineering practices including CI/CD, testing, and documentation. What We’re Looking For As a Senior Data Engineer, your background will look like: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering or software development, with a proven record of maintaining production-grade pipelines. Proficient in Python and SQL for data transformation and analytics. Strong expertise in Apache Spark, including data lake management, ACID transactions, schema enforcement/evolution, and time travel. In-depth knowledge of AWS services—especially S3, EMR, Lambda, and EKS—with a solid grasp of cloud architecture and security best practices. Solid data modeling skills (dimensional, normalized) and an understanding of data warehousing and lakehouse paradigms. Experience with BI tools like Tableau or Power BI. Familiar with setting up data quality, monitoring, and observability frameworks. Excellent communication and collaboration skills, with the ability to thrive in an agile and multicultural team environment. Nice to Have Experience working on the Databricks Platform Knowledge of Delta or Apache Iceberg file formats Passion for Machine Learning and AI; enthusiasm to explore and apply intelligent systems. What We Offer At GoTo, we believe in supporting our employees with a comprehensive range of benefits designed to fit your life—at work and beyond.
Here are just some of the benefits and perks you can expect when you join our team: Comprehensive health benefits, life and disability insurance, and fertility and family-forming support program Generous paid time off, paid holidays, volunteer time off, and quarterly self-care days and no meeting days Tuition and reading reimbursement programs to support your continuous learning and professional growth Thrive Global Wellness Program, confidential Employee Assistance Program (EAP), as well as One to One Wellness Coaching Employee programs—including Employee Resource Groups (ERGs), GoTo Gives, and our charitable matching program—to amplify your connection and impact Registered Retirement Savings Plan (RRSP) to help you plan for your future GoTo performance bonus program to celebrate your impact and contributions Monthly remote work stipend to support your home office expenses At GoTo, you’ll find the flexibility, resources, and support you need to thrive—at work, at home, and everywhere in between. You’ll work towards a shared goal with an open-minded, cohesive team that’s greater than the sum of its parts. We’re committed to creating an inclusive space for everyone, because we know unique perspectives make us a stronger company and community. Join us and be part of a company that invests in your future, where together we’ll Be Real, Think Big, Move Fast, Keep Growing, and stay Customer Obsessed. Learn more.
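To make the pipeline and data-quality language in this posting concrete, a minimal PySpark sketch of an ETL step with a simple quality gate. Paths, column names, and the 5% threshold are hypothetical, not GoTo's actual stack.

    # Hypothetical ETL step: read raw data, transform, gate the write on quality.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    raw = spark.read.json("s3://example-bucket/raw/orders/")  # illustrative path
    clean = (
        raw.filter(F.col("order_id").isNotNull())
           .withColumn("order_date", F.to_date("created_at"))
    )

    # Fail fast if the transformation dropped too many rows.
    raw_count, clean_count = raw.count(), clean.count()
    if raw_count > 0 and (raw_count - clean_count) / raw_count > 0.05:
        raise ValueError(f"Quality gate failed: {raw_count - clean_count} rows dropped")

    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/curated/orders/"
    )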

Posted 3 days ago

Apply

2.0 - 9.0 years

0 Lacs

karnataka

On-site

We are seeking a Data Architect / Sr. Data Architect / Pr. Data Architect to join our team. In this role, you will be involved in a combination of hands-on contribution, customer engagement, and technical team management. As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data Technologies. You will be managing the full life-cycle of Data Lake / Big Data solutions, starting from requirement gathering and analysis to platform selection, architecture design, and deployment. It will be your responsibility to implement scalable solutions on the Cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and mentor a team of Data Engineers. The ideal candidate should possess strong hands-on experience in implementing Data Lake with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elastic Search, Impala, Sqoop, etc., is required. Proficiency in programming and debugging skills in Python and Scala/Java is essential, with experience in building REST services considered beneficial. Candidates should also have experience in supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of using CI/CD with Git and Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience/exposure to NoSQL databases, and data modelling in Hive are all highly valued. Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).
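As one illustration of the streaming-ingestion patterns this role covers, a minimal Structured Streaming sketch reading from Kafka and landing Delta files on ADLS; broker addresses, topic names, and storage paths are placeholders, not a specific client setup.

    # Hypothetical Kafka -> Delta ingestion with Spark Structured Streaming.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events_ingest").getOrCreate()

    stream = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
             .option("subscribe", "device-events")              # placeholder topic
             .load()
    )

    parsed = stream.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )

    # Checkpointing makes the stream restartable with exactly-once sink semantics.
    query = (
        parsed.writeStream.format("delta")
              .option("checkpointLocation", "abfss://lake@acct.dfs.core.windows.net/_chk/events")
              .start("abfss://lake@acct.dfs.core.windows.net/bronze/events")
    )
    query.awaitTermination()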

Posted 3 days ago

Apply

0.0 years

0 Lacs

Varthur, Bengaluru, Karnataka

On-site

Job Description: Application Developer Bangalore, Karnataka, India AXA XL offers risk transfer and risk management solutions to clients globally. We offer worldwide capacity, flexible underwriting solutions, a wide variety of client-focused loss prevention services and a team-based account management approach. AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable – enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained advantage. What you’ll be DOING What will your essential responsibilities include? We are seeking an experienced ETL Developer to support and evolve our enterprise data integration workflows. The ideal candidate will have deep expertise in Informatica PowerCenter, strong hands-on experience with Azure Data Factory and Databricks, and a passion for building scalable, reliable ETL pipelines. This role is critical for both day-to-day operational reliability and long-term modernization of our data engineering stack in the Azure cloud. Key Responsibilities: Maintain, monitor, and troubleshoot existing Informatica PowerCenter ETL workflows to ensure operational reliability and data accuracy. Enhance and extend ETL processes to support new data sources, updated business logic, and scalability improvements. Develop and orchestrate PySpark notebooks in Azure Databricks for data transformation, cleansing, and enrichment. Configure and manage Databricks clusters for performance optimization and cost efficiency. Implement Delta Lake solutions that support ACID compliance, versioning, and time travel for reliable data lake operations. Automate data workflows using Databricks Jobs and Azure Data Factory (ADF) pipelines. Design and manage scalable ADF pipelines, including parameterized workflows and reusable integration patterns. Integrate with Azure Blob Storage and ADLS Gen2 using Spark APIs for high-performance data ingestion and output. Ensure data quality, consistency, and governance across legacy and cloud-based pipelines. Collaborate with data analysts, engineers, and business teams to deliver clean, validated data for reporting and analytics. Participate in the full Software Development Life Cycle (SDLC) from design through deployment, with an emphasis on maintainability and audit readiness. Develop maintainable and efficient ETL logic and scripts following best practices in security and performance. Troubleshoot pipeline issues across data infrastructure layers, identifying and resolving root causes to maintain reliability. Create and maintain clear documentation of technical designs, workflows, and data processing logic for long-term maintainability and knowledge sharing. Stay informed on emerging cloud and data engineering technologies to recommend improvements and drive innovation. Follow internal controls, audit protocols, and secure data handling procedures to support compliance and operational standards. Provide accurate time and effort estimates for assigned development tasks, accounting for complexity and risk.
What you will BRING We’re looking for someone who has these abilities and skills: Advanced experience with Informatica PowerCenter, including mappings, workflows, session tuning, and parameterization Expertise in Azure Databricks + PySpark, including: Notebook development Cluster configuration and tuning Delta Lake (ACID, versioning, time travel) Job orchestration via Databricks Jobs or ADF Integration with Azure Blob Storage and ADLS Gen2 using Spark APIs Strong hands-on experience with Azure Data Factory: Building and managing pipelines Parameterization and dynamic datasets Notebook integration and pipeline monitoring Proficiency in SQL, PL/SQL, and scripting languages such as Python, Bash, or PowerShell Strong understanding of data warehousing, dimensional modeling, and data profiling Familiarity with Git, CI/CD pipelines, and modern DevOps practices Working knowledge of data governance, audit trails, metadata management, and compliance standards such as HIPAA and GDPR Effective problem-solving and troubleshooting skills with the ability to resolve performance bottlenecks and job failures Awareness of Azure Functions, App Services, API Management, and Application Insights Understanding of Azure Key Vault for secrets and credential management Familiarity with Spark-based big data ecosystems (e.g., Hive, Kafka) is a plus Who WE are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements Enhanced family-friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars: Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems – the foundation of a sustainable planet and society – are essential to our future. We’re committed to protecting and restoring nature – from mangrove forests to the bees in our backyard – by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day – the Global Day of Giving. For more information, please see axaxl.com/sustainability.
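For orientation, a minimal PySpark sketch of the Delta Lake capabilities this posting names (ACID upserts, versioning, time travel), assuming a Databricks-style environment; the table path, staging source, and column names are invented.

    # Illustrative Delta Lake usage: ACID upsert (MERGE) plus time travel.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("delta_demo").getOrCreate()
    path = "abfss://data@acct.dfs.core.windows.net/silver/policies"  # placeholder

    updates = spark.read.parquet("/tmp/policy_updates")  # hypothetical staging data

    # ACID upsert: update matching policies, insert new ones, in one transaction.
    (DeltaTable.forPath(spark, path).alias("t")
        .merge(updates.alias("u"), "t.policy_id = u.policy_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

    # Time travel: re-read the table as it stood at an earlier version.
    previous = spark.read.format("delta").option("versionAsOf", 0).load(path)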

Posted 3 days ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh

Remote

Principal Software Engineering Manager- Data Engineering Noida, Uttar Pradesh, India Date posted Jul 30, 2025 Job number 1851293 Work site Up to 50% work from home Travel 0-25 % Role type People Manager Profession Software Engineering Discipline Software Engineering Employment type Full-Time Overview Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers to levels they cannot achieve anywhere else. This is a world of more possibilities, more innovation, more openness in a cloud-enabled world. The Business & Industry Copilots group is a rapidly growing organization that is responsible for the Microsoft Dynamics 365 suite of products, Power Apps, Power Automate, Dataverse, AI Builder, Microsoft Industry Solution and more. Microsoft is considered one of the leaders in Software as a Service in the world of business applications, and this organization is at the heart of how business applications are designed and delivered. This is an exciting time to join our group Customer Experience (CXP) and work on something highly strategic to Microsoft. The goal of CXP Engineering is to build the next generation of our applications running on Dynamics 365, AI, Copilot, and several other Microsoft cloud services to drive AI transformation across Marketing, Sales, Services and Support organizations within Microsoft. We innovate quickly and collaborate closely with our partners and customers in an agile, high-energy environment. Leveraging the scalability and value from Azure & Power Platform, we ensure our solutions are robust and efficient. Our organization’s implementation acts as reference architecture for large companies and helps drive product capabilities. If the opportunity to collaborate with a diverse engineering team on enabling end-to-end business scenarios using cutting-edge technologies, and to solve challenging problems for large-scale 24x7 business SaaS applications, excites you, please come and talk to us! We are hiring a passionate Principal SW Engineering Manager to lead a team of highly motivated and talented software developers building highly scalable data platforms and delivering services and experiences for empowering Microsoft’s customer, seller and partner ecosystem to be successful. This is a unique opportunity to use your leadership skills and experience in building core technologies that will directly affect the future of Microsoft on the cloud. In this position, you will be part of a fun-loving, diverse team that seeks challenges, loves learning and values teamwork. You will collaborate with team members and partners to build high-quality and innovative data platforms with full stack data solutions using the latest technologies in a dynamic and agile environment, and have opportunities to anticipate future technical needs of the team and provide technical leadership to keep raising the bar for our competition. We use industry-standard technology: C#, JavaScript/TypeScript, HTML5, ETL/ELT, Data warehousing, and/or Business Intelligence Development. Qualifications Basic Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. 12+ years of experience building high-scale enterprise Business Intelligence and data engineering solutions. 3+ years of management experience leading a high-performance engineering team. Proficient in designing and developing distributed systems on cloud platforms.
Must be able to plan work and work to a plan, adapting as necessary in a rapidly evolving environment. Experience using a variety of data stores, including ETL/ELT, data warehouses, RDBMS, in-memory caches, and document databases. Experience using ML, anomaly detection, predictive analysis, and exploratory data analysis. A strong understanding of the value of data, data exploration, and the benefits of a data-driven organizational culture. Strong communication skills and proficiency with executive communications. Demonstrated ability to effectively lead and operate in a cross-functional global organization. Preferred Qualifications: Prior experience as an engineering site leader is a strong plus. Proven success in recruiting and scaling engineering organizations effectively. Demonstrated ability to provide technical leadership to teams, with experience managing large-scale data engineering projects. Hands-on experience working with large data sets using tools such as SQL, Databricks, PySparkSQL, Synapse, Azure Data Factory, or similar technologies. Expertise in one or more of the following areas: AI and Machine Learning. Experience with Business Intelligence or data visualization tools, particularly Power BI, is highly beneficial. #BICJobs Responsibilities As a leader of the engineering team, you will be responsible for the following: Build and lead a world-class data engineering team. Be passionate about technology and obsessed with customer needs. Champion data-driven decisions for feature identification, prioritization, and delivery. Manage multiple projects, including timelines, customer interaction, feature tradeoffs, etc. Deliver on an ambitious product and services roadmap, including building new services on top of the vast amounts of data collected by our batch and near-real-time data engines. Design and architect internet-scale and reliable services. Leverage machine learning (ML) model knowledge to select appropriate solutions for business objectives. Communicate effectively and build relationships with our partner teams and stakeholders. Help shape our long-term architecture and technology choices across the full client and services stack. Understand the talent needs of the team and help recruit new talent. Mentor and grow other engineers to bring in efficiency and better productivity. Experiment with and recommend new technologies that simplify or improve the tech stack. Work to help build an inclusive working environment. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
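Since the qualifications mention ML-based anomaly detection over batch and near-real-time data, here is a deliberately simple, hypothetical sketch of the core idea (a z-score over a trailing window); production detectors are far more sophisticated.

    # Toy metric anomaly detector: flag values far from the trailing mean.
    from statistics import mean, stdev

    def is_anomaly(history: list[float], value: float, threshold: float = 3.0) -> bool:
        """True if `value` sits more than `threshold` standard deviations out."""
        if len(history) < 2:
            return False
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return value != mu
        return abs(value - mu) / sigma > threshold

    print(is_anomaly([100, 102, 98, 101, 99], 150))  # True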

Posted 3 days ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At Mr. Cooper Group, You Make the Dream Possible. Our purpose is simple: Keeping the dream of homeownership alive. As a Mr. Cooper Group team member, you play a big role in making that dream possible. Around here, we know our roles and work together, volunteer to make a difference, and challenge the status quo when needed. Everything we do is in the care and service of our teammates and our customers. Join us and make the dream of homeownership possible! Key Responsibilities: Develop and implement enterprise-level data quality and governance strategies aligned with business objectives. Establish data quality KPIs, policies, and frameworks across pipelines in Azure, Databricks, SQL, Mongo, and BI tools like Power BI, Tableau, and SSRS. Build and maintain robust data quality monitoring processes and automated validation rules. Collaborate with data engineers, data scientists, and architects to embed quality checks in ETL/ELT pipelines and data lakehouse structures. Lead initiatives to identify data issues, perform root cause analysis, and recommend corrective actions. Define metadata standards, data profiling methods, and data lineage documentation practices. Partner with business units to create and enforce data ownership, stewardship, and usage standards. Establish clear data SLAs and ensure ongoing compliance across reporting and analytics systems. Evaluate and implement tools and technologies to support scalable and automated data quality solutions. Support regulatory and audit needs by ensuring data traceability and compliance readiness. Preferred Qualifications: 8+ years of experience in data quality, data governance, or data management roles. Strong hands-on experience with Azure Data Services (Data Factory, Synapse, Data Lake, etc.) and Databricks. Proficient in SQL and experience working with large-scale relational and analytical data stores. Familiarity with Power BI, Tableau, and SSRS for reporting and analytics. Experience defining and measuring data quality metrics and KPIs. Proven ability to design and enforce data governance frameworks. Strong stakeholder management and communication skills with both technical and business audiences. Prior leadership or mentoring experience in building and scaling data quality functions is a plus. Nice to Have: Exposure to data cataloging tools like Purview, Collibra, or Alation. Experience working in regulated or compliance-heavy industries. Familiarity with ML-based data anomaly detection or AI-driven governance tools. Mr. Cooper Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or status as a protected veteran. EOE/M/F/D/V Job Requisition ID: 024115 Job Category: Information Technology Primary Location City: Chennai Primary Location Region: Tamil Nadu Primary Location Postal Code: 600089 Primary Location Country: India Additional Posting Location(s):
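As a concrete flavor of the automated validation rules this posting describes, a small hypothetical PySpark sketch; rule names, columns, and thresholds are invented for illustration.

    # Hypothetical data-quality rules over a loans DataFrame.
    from pyspark.sql import DataFrame, functions as F

    def null_rate(df: DataFrame, column: str) -> float:
        total = df.count()
        return 0.0 if total == 0 else df.filter(F.col(column).isNull()).count() / total

    def run_checks(df: DataFrame) -> list[str]:
        """Return a list of rule violations; empty list means the batch passes."""
        failures = []
        if null_rate(df, "loan_id") > 0.0:
            failures.append("loan_id must never be null")
        if df.count() != df.select("loan_id").distinct().count():
            failures.append("loan_id must be unique")
        if null_rate(df, "origination_date") > 0.01:
            failures.append("origination_date null rate above 1%")
        return failures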

Posted 3 days ago

Apply


15.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Sr. Cloud Solution Architect - Data & Analytics Bangalore, Karnataka, India + 2 more locations Date posted Jul 30, 2025 Job number 1854270 Work site Up to 50% work from home Travel 25-50 % Role type Individual Contributor Profession Customer Success Discipline Cloud Solution Architecture Employment type Full-Time Overview With more than 45,000 employees and partners worldwide, the Customer Experience and Success (CE&S) organization is on a mission to empower customers to accelerate business value through differentiated customer experiences that leverage Microsoft’s products and services, ignited by our people and culture. We drive cross-company alignment and execution, ensuring that we consistently exceed customers’ expectations in every interaction, whether in-product, digital, or human-centered. CE&S is responsible for all up services across the company, including consulting, customer success, and support across Microsoft’s portfolio of solutions and products. Join CE&S and help us accelerate AI transformation for our customers and the world. We’re looking for a Cloud Solution Architect (CSA) who specializes in data platforms and analytics to help customers build secure, scalable, and AI-ready solutions on Microsoft Azure. In this customer-facing role, you’ll deliver engagements that span architecture design, proof of concepts, and production deployments, ensuring performance, resiliency, and security across mission-critical workloads. As part of the Cloud + AI Data team, you’ll leverage existing Repeatable IP and execution engines with accountability to drive delivery excellence, accelerate adoption, and ensure a successful deployment for the customer for services like Microsoft Fabric, Azure Databricks, Cosmos DB, and Purview. You’ll collaborate with engineering teams, share best practices, and stay current with evolving trends to help customers unlock the full value of their data and AI investments. This role is flexible in that you can work up to 50% from home. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications Bachelor's Degree in Computer Science, Information Technology, Engineering, Business, or related field AND 15+ years experience in cloud/infrastructure technologies, information technology (IT) consulting/support, systems administration, network operations, software development/support, technology solutions, practice development, architecture, and/or consulting OR Master's Degree in Computer Science, Information Technology, Engineering, Business, or related field AND 15+ years experience in cloud/infrastructure technologies, technology solutions, practice development, architecture, and/or consulting OR equivalent experience 15+ years experience working in a customer-facing role (e.g., internal and/or external) and on technical projects Technical Certification in Cloud (e.g., Azure, Amazon Web Services, Google, security certifications) or Industry Certifications such as TOGAF etc. Responsibilities Engage with customer IT and business leaders to understand their data estate, priorities, and success measures, and design secure, scalable Data & AI solutions that deliver measurable business value. 
Lead architecture design sessions, develop Data & Analytics roadmaps, and drive Proof of Concepts (POCs) and Minimum Viable Products (MVPs) to accelerate adoption and ensure long-term technical viability. Own the end-to-end technical delivery results, ensuring completeness and accuracy of consumption and customer success plans in collaboration with the CSAM. Deliver repeatable intellectual property (IP) to achieve targeted outcomes, accelerate Azure Consumed Revenue (ACR), and contribute to centralized IP development initiatives. Provide delivery oversight and escalation support for key Factory engagements across Data & Analytics workloads. Drive technical excellence by leading the health, resiliency, security, and optimization of mission-critical data workloads, ensuring readiness for production-scale AI use cases. Identify and resolve technical blockers, share customer feedback with engineering teams, and influence product improvements through Voice of the Customer insights. Maintain deep technical expertise and stay current with market trends, competitive insights, and Microsoft’s evolving data and AI capabilities. Be accredited and certified to deliver with advanced and expert-level proficiency in priority workloads including Microsoft Fabric, Azure Databricks, Microsoft Purview, Azure SQL, PostgreSQL, MySQL, and Cosmos DB. Demonstrate a growth mindset by continuously aligning your skills to team and customer needs, contributing to technical communities, and mentoring others to accelerate customer outcomes. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

The role is seeking a dynamic individual to join the M&R Sales Tech team, bringing expertise in developing ETL and ELT jobs for the data warehouse software development team. This position plays a crucial role in defining the Design and Architecture during the migration from legacy SSIS technology to cutting-edge cloud technologies such as Azure, Databricks, and Snowflake. The ideal candidate will possess a robust background in Software Architecture, data engineering, and cloud technologies. Key Responsibilities: Architectural Design: Design and implement ETL data architectures, including creating algorithms, developing data models and schemas, and setting up data pipelines. Technical Leadership: Provide technical leadership to the software development team to ensure alignment of data solutions with business objectives and overall IT strategy. Data Strategy and Management: Define data strategy and oversee data management within the organization, focusing on data governance, quality, privacy, and security using Databricks and Snowflake technologies. Implementation of Machine Learning Models: Utilize Databricks for implementing machine learning models, conducting data analysis, and deriving insights. Data Migration and Integration: Transfer data from on-premises or other cloud platforms to Snowflake, integrating Snowflake and Databricks with other systems for seamless data flow. Performance Tuning: Optimize database performance by fine-tuning queries, enhancing processing speed, and improving data storage and retrieval mechanisms. Troubleshooting and Problem Solving: Identify and resolve issues related to databases, data migration, data pipelines, and other ETL processes, addressing concerns like data quality, system performance, and data security. Stakeholder Communication: Effectively communicate with stakeholders to grasp requirements and deliver solutions that meet business needs. Required Qualifications: Education: Bachelor's degree in Computer Science, Engineering, or related field, or equivalent experience. Experience: Minimum of 8 years of experience in software development and Architecture roles. Technical Skills: Proficiency in ETL/ELT processes and tools, particularly SSIS; 5+ years of experience with large data warehousing applications; solid experience with reporting tools like Power BI and Tableau; familiarity with creating batch and real-time jobs with Databricks and Snowflake, and working with streaming and orchestration platforms like Kafka and Airflow. Soft Skills: Strong leadership and team management skills, problem-solving abilities, and effective communication and interpersonal skills. Preferred Qualifications: Experience with Agile development methodologies. Certification in relevant cloud technologies (e.g., Azure, Databricks, Snowflake). Primary Skills: Azure, Snowflake, Databricks Secondary Skills: SSIS, Power BI, Tableau Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership, enabling delivery teams to provide exceptional client engagement and satisfaction. Key Roles and Responsibilities: Develop architectural solutions for new deals/major change requests, ensuring scalability, reliability, and manageability of systems. Provide solutioning of RFPs from clients, ensuring overall design assurance. Manage the portfolio of to-be-solutions to align with business outcomes, analyzing technology environment, client requirements, and enterprise specifics.
Offer technical leadership in designing, developing, and implementing custom solutions using modern technology. Define current and target state solutions, articulate architectural targets and recommendations, and propose investment roadmaps. Evaluate and recommend solutions for integration with the technology ecosystem. Collaborate with IT groups to ensure task transition, performance, and issue resolution. Enable Delivery Teams by providing optimal delivery solutions, building relationships with stakeholders, and developing relevant metrics to drive results. Manage multiple projects, identify risks, ensure quality assurance, and recommend tools for reuse and automation. Support pre-sales teams in presenting solution designs to clients, negotiate requirements, and demonstrate thought leadership. Competency Building and Branding: Develop PoCs, case studies, and white papers, attain market recognition, and mentor team members for career development. Team Management: Resourcing, Talent Management, Performance Management, Employee Satisfaction and Engagement. Join us at Wipro, a business driven by purpose and reinvention, where your ambitions can be realized through constant evolution and empowerment. Applications from individuals with disabilities are encouraged.
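To ground the SSIS-to-cloud migration theme, a hedged sketch of one common target pattern: a Spark job on Databricks writing curated data to Snowflake through the Spark-Snowflake connector. All connection values are placeholders, and this assumes the connector library is available on the cluster.

    # Hedged sketch: Databricks job writing a curated Delta table to Snowflake.
    # Assumes the Spark-Snowflake connector is installed; values are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("ssis_migration_poc").getOrCreate()

    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "etl_user",
        "sfPassword": "***",          # use a secret scope in practice, never a literal
        "sfDatabase": "SALES",
        "sfSchema": "CURATED",
        "sfWarehouse": "ETL_WH",
    }

    df = spark.read.format("delta").load("/mnt/curated/sales_orders")  # placeholder path

    (df.write.format("snowflake")
       .options(**sf_options)
       .option("dbtable", "SALES_ORDERS")
       .mode("overwrite")
       .save())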

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

vijayawada, andhra pradesh

On-site

You have a great opportunity as a Power BI Developer in Vijayawada. With over 3 years of experience, you will be responsible for developing advanced dashboards using Power BI. Your expertise in data modeling, design standards, tools, and best practices will be crucial for creating enterprise data models. You should have excellent knowledge of Power BI Desktop, charting widgets, and connecting to various data sources. Your role will involve building Power BI reports by leveraging DAX functions. Knowledge of writing SQL statements using MS SQL Server is essential, and experience with ETL, SSAS, and SSIS will be a plus. Familiarity with Power BI Mobile is desired. Experience with SQL Server or Postgres is a must for this position. Azure experience will be beneficial, and familiarity with Power BI connectors for Synapse, Snowflake, Azure Data Lake, and Databricks is an added advantage. This role offers you the opportunity to work with cutting-edge technologies and make a significant impact in the field of data analytics.

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies