
4894 Data Processing Jobs - Page 17


7.0 - 12.0 years

3 - 6 Lacs

Hyderabad

Work from Office

About the Role: We are seeking a highly skilled and experienced Machine Learning Engineer to join our dynamic team. As a Machine Learning Engineer, you will be responsible for the design, development, deployment, and maintenance of machine learning models and systems that drive our [mention specific business area or product, e.g., recommendation engine, fraud detection system, autonomous vehicles]. You will work closely with data scientists, software engineers, and product managers to translate business needs into scalable and reliable machine learning solutions. This is a key role in shaping the future of CBRE and requires a strong technical foundation combined with a passion for innovation and problem-solving.

Responsibilities:

Model Development & Deployment:
- Design, develop, and deploy machine learning models using various algorithms (e.g., regression, classification, clustering, deep learning) to solve complex business problems.
- Select appropriate datasets and features for model training, ensuring data quality and integrity.
- Implement and optimize model training pipelines, including data preprocessing, feature engineering, model selection, and hyperparameter tuning.
- Deploy models to production environments using containerization technologies (e.g., Docker, Kubernetes) and cloud platforms (e.g., AWS, GCP, Azure).
- Monitor model performance in production, identify and troubleshoot issues, and implement model retraining and updates as needed.

Infrastructure & Engineering:
- Develop and maintain APIs for model serving and integration with other systems.
- Write clean, well-documented, and testable code.
- Collaborate with software engineers to integrate models into existing products and services.

Research & Innovation:
- Stay up to date with the latest advancements in machine learning and related technologies.
- Research and evaluate new algorithms, tools, and techniques to improve model performance and efficiency.
- Contribute to the development of new machine learning solutions and features.
- Proactively identify opportunities to leverage machine learning to solve business challenges.

Collaboration & Communication:
- Collaborate effectively with data scientists, software engineers, product managers, and other stakeholders.
- Communicate technical concepts and findings clearly and concisely to both technical and non-technical audiences.
- Participate in code reviews and contribute to the team's knowledge sharing.

Qualifications:

Experience: 7+ years of experience in machine learning engineering or a related field.

Technical Skills:
- Programming Languages: Proficient in Python; experience with other languages (e.g., Java, Scala, R) is a plus.
- Machine Learning Libraries: Strong experience with machine learning libraries and frameworks such as scikit-learn, TensorFlow, PyTorch, and Keras.
- Data Processing: Experience with data manipulation and processing using libraries like Pandas, NumPy, and Spark.
- Model Deployment: Experience with model deployment frameworks and platforms (e.g., TensorFlow Serving, TorchServe, Seldon, AWS SageMaker, Google AI Platform, Azure Machine Learning).
- Databases: Experience with relational and NoSQL databases (e.g., SQL, MongoDB, Cassandra).
- Version Control: Experience with Git and other version control systems.
- DevOps: Familiarity with DevOps practices and tools.
- Strong understanding of machine learning concepts and algorithms: regression, classification, clustering, deep learning, etc.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Posted 1 week ago

Apply

2.0 - 7.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Azure Data Engineer

Position Overview: We are seeking a talented Azure Data Engineer to join our dynamic team. This role involves designing, building, and managing data solutions using the Azure ecosystem. The ideal candidate will work on data integration, transformation, and visualization while ensuring high-quality, secure, and scalable data pipelines.

Responsibilities:
- Design, develop, and maintain data pipelines using Azure Data Factory.
- Manage and optimize data storage using Azure Data Lake Gen 2.
- Build and process large-scale data solutions with Azure Databricks and Apache Spark.
- Create interactive reports and dashboards in Power BI for business insights.
- Collaborate with cross-functional IT and business teams to translate requirements into data solutions.
- Ensure data quality, security, and compliance in line with organizational standards.

Core Skills Required:
- Azure Data Factory: Experience in building and managing data pipelines.
- Azure Data Lake Gen 2: Proficient in data storage and management within Azure's data lake environment.
- Azure Databricks / Apache Spark: Hands-on skill with distributed data processing, transformations, and analytics.
- Power BI: Expertise in data visualization and reporting.

Nice-to-Have Skills:
- Basic SQL Performance Tuning: Ability to write and optimize SQL queries.
- Data Governance & Unity Catalog: Understanding of data governance principles and experience with Unity Catalog for data management.
- Certification: Microsoft DP-203 (Azure Data Engineer Associate).
- CI/CD Pipelines: Experience implementing CI/CD pipelines for Azure Data Factory or Databricks projects.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- 2+ years of professional experience with Azure data services.
- Strong analytical and problem-solving skills.
- Effective communication and team collaboration abilities.

Preferred Traits:
- Continuous learner, staying updated with Azure and data engineering advancements.
- Experience working in Agile environments.
- Passion for optimizing data workflows and enabling data-driven decisions.

If you are enthusiastic about working with advanced Azure data services and enabling impactful analytics, we encourage you to apply!

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

BTB (Be The Bank) is looking for a Full Stack Engineer to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Meeting with the software development team to define the scope and scale of software projects
- Designing software system architecture
- Implementing data structures and design patterns
- Designing and implementing scalable web services, applications, and APIs
- Developing and maintaining internal software tools
- Writing low-level and high-level code
- Troubleshooting and bug fixing
- Identifying bottlenecks and improving software efficiency
- Collaborating with the design team on developing microservices
- Writing technical documents

Skills:
- Good communication skills
- High-level project management skills

Posted 1 week ago

Apply

2.0 - 4.0 years

13 - 14 Lacs

Hyderabad

Work from Office

FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Your Team's Impact: Our department focuses on building and supporting solutions that enhance productivity and collaboration across the organization. We primarily work on ticketing systems, helping teams efficiently track, manage, and resolve issues, and on project management tools, enabling streamlined planning, execution, and monitoring of initiatives. By providing reliable tools and processes, we ensure that teams can collaborate effectively, stay organized, and deliver results with greater transparency and efficiency.

What You'll Do:
- Deliver technical effort estimates to the analytics team and other business stakeholders when planning new features and updating existing implementations.
- Partner with other internal teams to design for and improve efficiency and optimize current processes.
- Consistently engage and address client needs while serving as a primary point of contact.
- Use C#, Angular, and SQL to design and implement scalable data processing solutions.
- Design, develop, and review SQL, C#, Angular, and jQuery code.
- Demonstrate technical expertise with databases (SQL Server, PostgreSQL) and with scripting and exploration (C#, SQL).
- Work with cloud platforms (AWS) to deploy and maintain solutions.

What We're Looking For (Required Skills):
- Bachelor's degree in engineering with specialization in Computer Science, IT, or Electronics.
- 2-4 years of relevant experience in .NET technologies.
- Object-oriented languages such as C# and C++; scripting languages like Python, Perl, etc. on Windows or Linux platforms.
- Platform as a Service (PaaS) and cloud technologies like AWS and Azure.
- Relational database technologies such as SQL Server and PostgreSQL.
- UI technologies such as AngularJS; JavaScript libraries and supersets such as jQuery, Bootstrap, ES6, TypeScript, etc.
- Continuous integration tools like GitHub Actions; source code repositories like Git and TFS.

What's In It for You: At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 7 Lacs

Mumbai

Work from Office

About Exponentia.ai: Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations. We are proud partners of global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory. With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done by combining human intelligence with AI agents to deliver exponential outcomes. Learn more: www.exponentia.ai

Awards & Recognitions:
- Innovation Partner of the Year, Databricks 2024
- Digital Impact Award, UK 2024 (TMT Sector)
- Rising Star APJ, Databricks Partner Awards 2023
- Qlik's Most Enabled Partner, APAC

About the Role: We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Ideal Candidate Profile:
- 3+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience in handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Posted 1 week ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Pune

Work from Office

Job Details: The Lead Data Engineer (DE Lead) plays a pivotal role in managing and fulfilling structured data requests in alignment with organizational policies and regulatory standards. As the primary technical authority, the DE Lead is responsible for extracting, organizing, and securely delivering accurate data, while collaborating closely with Business Analysts to ensure timely and compliant data processing. Core responsibilities include data extraction and reporting using tools such as SQL, SAP BusinessObjects, and Power BI; developing interactive dashboards; maintaining data integrity; addressing database vulnerabilities; and ensuring strict adherence to privacy standards concerning Personally Identifiable Information (PII) and Protected Health Information (PHI).

Work Experience:
- 8 years of overall professional experience, with 6+ years in data management; preferably with direct experience in healthcare data analytics or data engineering
- Strong expertise in SQL; proficient in Power BI; skilled in Python; advanced capabilities in Microsoft Excel; experience with SAP BusinessObjects
- Data Extraction & Reporting: using SQL, SAP BusinessObjects, and Power BI
- Dashboard Development: creating interactive and insightful dashboards
- Data Integrity & Security: maintaining data accuracy and resolving database vulnerabilities

Education: Bachelor's degree in Computer Science, Information Technology, or another related discipline, or equivalent related experience.

Preferred Certifications:
- Advanced Data Analytics certifications
- AI and ML certifications
- SAS Statistical Business Analyst Professional Certification

Skills & Knowledge:
- Behavioral Skills: Conflict Resolution, Creativity & Innovation, Decision Making, Planning, Presentation Skills, Risk-taking
- Technical Skills: Advanced Data Visualization Techniques, Advanced Statistical Analysis, Big Data Analysis Tools and Techniques, Data Governance, Data Management, Data Modelling, Data Quality Assurance, Machine Learning and AI Fundamentals, programming languages like SQL, R, and Python
- Tools Knowledge: Business intelligence software like Tableau, Power BI, Alteryx, and QlikSense; data visualization tools; Microsoft Office Suite; statistical analytics tools (SAS, SPSS)

Posted 1 week ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Responsibilities:
- Lead end-to-end machine learning projects, from data exploration and modeling to deployment, ensuring alignment with business objectives.
- Utilize traditional AI/data science methods (e.g., regression, classification, clustering) and advanced AI methods (e.g., neural networks, NLP) to address business problems and optimize processes.
- Implement and experiment with Generative AI models based on business needs using Prompt Engineering, Retrieval Augmented Generation (RAG), or fine-tuning, working with LLMs, LVMs, TTS, etc.
- Collaborate with teams across Digital & Innovation, business stakeholders, software engineers, and product teams to rapidly prototype and iterate on new models and solutions.
- Mentor and coach junior data scientists and analysts, fostering an environment of continuous learning and collaboration.
- Adapt quickly to new AI advancements and technologies, continuously learning and applying emerging methodologies to solve complex problems.
- Work closely with other teams (e.g., Cybersecurity, Cloud Engineering) to ensure the successful integration of models into production systems.
- Ensure models meet rigorous performance, accuracy, and efficiency standards, performing cross-validation, tuning, and statistical checks.
- Communicate results and insights effectively to both technical and non-technical stakeholders, delivering clear recommendations for business impact.
- Ensure adherence to data privacy, security policies, and governance standards across all data science initiatives.

Your Profile:
- Bachelor's degree in Data Science, Machine Learning, Computer Science, Statistics, or a related field. A Master's degree or Ph.D. is a plus.
- 7+ years of experience in data science, machine learning, or AI, with demonstrated success in building models that drive business outcomes.
- Proficient in Python, R, and SQL for data analysis, modeling, and data pipeline development.
- Experience with DevSecOps practices and tools such as GitHub, Azure DevOps, Terraform, Bicep, AquaSec, etc.
- Experience with cloud platforms (Azure, AWS, Google Cloud) and large-scale data processing tools (e.g., Hadoop, Spark).
- Strong understanding of both supervised and unsupervised learning models and techniques.
- Experience with frameworks like TensorFlow and PyTorch, and working knowledge of Generative AI models like GPT and GANs.
- Hands-on experience with Generative AI techniques, with a balanced approach to leveraging them where they can add value.
- Proven experience in rapid prototyping and the ability to iterate quickly to meet business needs in a dynamic environment.

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 12 Lacs

Pune

Work from Office

A seasoned, experienced professional with a full understanding of the area of specialization; resolves a wide range of issues in creative ways. This job is the fully qualified, career-oriented, journey-level position.

Technical:
- Should understand industry terms and the competitive landscape
- Previous experience in competitive opportunities with positive outcomes
- Dedicated to advanced trainings, updates, and providing mentorship (SE Summit, Spark, SKO, other)
- Should have industry certifications (Cloud, DB, or adjacent technologies)
- Can act autonomously in an architectural conversation where Cohesity products are the center

Business/Sales Acumen:
- Ability to explain technical concepts to a non-technical audience
- Partners with the larger sales team and acts as its voice to suggest improvements whilst executing initiatives
- Demonstrated adaptability based on previous experience
- Ability to deliver persuasive technical presentations to convince an audience
- Ability to do territory and account planning as part of a sales team
- Demonstrated relationships in the regional channel community

Other soft skills:
- Adaptable, demonstrating openness to new organization structures, procedures, strategies, and ideas, and willing to try new ways of working
- Good public speaking, able to present succinct messages
- Demonstrated personal soft-skill development
- Team player who responds well to requests for assistance, securing additional resources as needed
- Customer-centric and able to pre-empt some issues before they occur, flagging them to senior leadership
- May be working towards certification(s)

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 10 Lacs

Bengaluru

Work from Office

About KnowBe4: KnowBe4, the provider of the world's largest security awareness training and simulated phishing platform, is used by tens of thousands of organizations around the globe. KnowBe4 enables organizations to manage the ongoing problem of social engineering by helping them train employees to make smarter security decisions, every day. Fortune has ranked us as a best place to work for women, for millennials, and in technology for four years in a row! We have been certified as a "Great Place To Work" in 8 countries, plus we've earned numerous other prestigious awards, including Glassdoor's Best Places To Work. Our team values radical transparency, extreme ownership, and continuous professional development in a welcoming workplace that encourages all employees to be themselves. Whether working remotely or in person, we strive to make every day fun and engaging; from team lunches to trivia competitions to local outings, there is always something exciting happening at KnowBe4. Please submit your resume in English.

The individual in this role is responsible for developing new and exciting products for KnowBe4's customers, alongside other engineers in a fast-paced, agile development environment.

Responsibilities:
- Develops software using the KnowBe4 Software Development Lifecycle and Agile methodologies
- Recommends solutions to engineering problems
- Assists other team members by providing technical direction
- Defines approaches and solutions to complex technical problems
- Helps to translate KnowBe4's strategic goals into operational plans
- Provides coordination across functional boundaries
- May act as team lead for sub-projects

Requirements:
- BS or equivalent plus 8 years' experience; MS or equivalent plus 3 years' experience; or Ph.D. or equivalent plus 2 years' experience
- Training in secure coding practices (preferred)
- Extensive experience with building and integrating REST-based APIs, following best practices of authentication and authorization in enterprise-grade production environments
- Experience with building apps and microservices on the AWS platform using Python
- Expert knowledge in at least one web framework technology, such as Python Django/Flask or Rails
- Understanding and experience in building software systems following software design principles
- Demonstrable knowledge of fundamental cloud concepts around multi-tenancy, scaling out, and serverless
- Working experience in writing clean, unit-tested, and secure code
- Working knowledge of relational databases such as MySQL/PostgreSQL and expertise in SQL; knowledge of NoSQL databases such as Mongo and Elasticsearch is preferred
- Experience with continuous delivery and integration pipelines: Docker/GitLab/Terraform and other automated deployment and testing tools
- Openness to learning new technologies and programming languages as and when needed
- Experience in working with APIs in the cybersecurity industry, and understanding the basics of the current security landscape (attack frameworks, security log processing, basic knowledge of AV/EDR/DLP/CASB, etc.), is a plus
- Experience building scalable data processing pipelines is a plus

Our Fantastic Benefits: We offer company-wide bonuses based on monthly sales targets, employee referral bonuses, adoption assistance, tuition reimbursement, certification reimbursement, certification completion bonuses, and a relaxed dress code - all in a modern, high-tech, and fun work environment. For more details about our benefits in each office location, please visit www.knowbe4.com/careers/benefits.

Note: An applicant assessment and background check may be part of your hiring procedure. Individuals seeking employment at KnowBe4 are considered without prejudice to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other characteristic protected under applicable federal, state, or local law. If you require reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please visit www.knowbe4.com/careers/request-accommodation. No recruitment agencies, please.

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 13 Lacs

Bengaluru

Work from Office

About KnowBe4: KnowBe4, the provider of the world's largest security awareness training and simulated phishing platform, is used by tens of thousands of organizations around the globe. KnowBe4 enables organizations to manage the ongoing problem of social engineering by helping them train employees to make smarter security decisions, every day. Fortune has ranked us as a best place to work for women, for millennials, and in technology for four years in a row! We have been certified as a "Great Place To Work" in 8 countries, plus we've earned numerous other prestigious awards, including Glassdoor's Best Places To Work. Our team values radical transparency, extreme ownership, and continuous professional development in a welcoming workplace that encourages all employees to be themselves. Whether working remotely or in person, we strive to make every day fun and engaging; from team lunches to trivia competitions to local outings, there is always something exciting happening at KnowBe4. Please submit your resume in English.

The individual in this role is responsible for leading software development teams to develop new and exciting products for KnowBe4's customers, alongside other engineers in a fast-paced, agile development environment.

Responsibilities:
- Leads a software team that develops software using the KnowBe4 Software Development Lifecycle and Agile methodologies
- Recommends solutions to engineering problems
- Provides genuine recommendations as to the hiring, firing, promotion, and discipline of subordinate employees, to which the Company gives significant weight

Requirements:
- BS or equivalent plus 8 years' experience; or MS or equivalent plus 3 years' experience
- Ability to build, manage, and deliver high-quality software products and features
- Ability to manage a team of highly talented software engineers
- Extensive experience with building and integrating REST-based APIs, following best practices of authentication and authorization in enterprise-grade production environments
- Experience with building apps and microservices on the AWS platform using Python
- Expert knowledge in at least one web framework technology, such as Python Django/Flask, Rails, or Express
- Understanding and experience in building software systems following software design principles
- Demonstrable knowledge of fundamental cloud concepts around multi-tenancy, scaling out, and serverless
- Working experience in writing clean, unit-tested, and secure code
- Working knowledge of relational databases such as MySQL/PostgreSQL and expertise in SQL; knowledge of NoSQL databases such as Mongo and Elasticsearch is preferred
- Experience with continuous delivery and integration pipelines: Docker/GitLab/Terraform and other automated deployment and testing tools
- Openness to learning new technologies and programming languages as and when needed
- Experience in working with APIs in the cybersecurity industry, and understanding the basics of the current security landscape (attack frameworks, security log processing, basic knowledge of AV/EDR/DLP/CASB, etc.), is a huge plus
- Experience building scalable data processing pipelines is a plus

Our Fantastic Benefits: We offer company-wide bonuses based on monthly sales targets, employee referral bonuses, adoption assistance, tuition reimbursement, certification reimbursement, certification completion bonuses, and a relaxed dress code - all in a modern, high-tech, and fun work environment. For more details about our benefits in each office location, please visit www.knowbe4.com/careers/benefits.

Note: An applicant assessment and background check may be part of your hiring procedure. Individuals seeking employment at KnowBe4 are considered without prejudice to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, sexual orientation, or any other characteristic protected under applicable federal, state, or local law. If you require reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please visit www.knowbe4.com/careers/request-accommodation. No recruitment agencies, please.

Posted 1 week ago

Apply

7.0 - 8.0 years

22 - 27 Lacs

Hyderabad

Work from Office

Overview Execute Business Insights & Analytics responsibilities (for PepsiCo Europe Beverages Sector team) as part of the broader Global Business Services function in Hyderabad, India. This role will help to enable accelerated growth for PepsiCo by contributing to the Europe Beverages Sector team while also working alongside the consumer marketing team to provide an integrated holistic overview to the business. Primary responsibilities include creating/updating existing dashboards, Excel/Power BI reports, delivering periodic and on-demand brand reporting, and addressing ad-hoc requests based on internal and external data sources. The role will have short-term responsibilities for knowledge transfer from the business and flawless delivery of recurring reports. Once established, the role will execute optimization of the data-based Insights & Analytics processes, including ad hoc questions and overall automation of delivery where applicable.. Responsibilities Build Strong Business Insights & Analytics Execute market, portfolio, brand & promotion campaign performance reporting (utilizing dashboards, templated decks, and reporting tools) Analyze & Report category, brand & promotion performance drivers, and optimization opportunities Bring impactful insights for the BU by integrating & leveraging multiple data sources such as Internal Sales, Agency (RMS, HHP etc) Translate complex data findings into actionable insights and strategic recommendations for decision-making. Assist the team in analysing marketing expenses & budgets for better utilization of marketing investments Manage Ad-hoc & follow up deep-dives into the Data to address tactical performance issues & challenges Collaborate with stakehokders to develop analysis and reports offering strategic plans. 
Build strong Data Processing & Automation Integrate & Optimize Data sets & Reporting system to manage heavy data processing for routine reporting Explore Automation opportunities with Higher focus on developing significant Insights for the Marketing Teams Speed up the Business Intelligence & Insights for timely & impactful decision making Help on implementing and automating Pan Europe Quarterly Business Reviews Implement innovative solutions to enhance data analysis capabilities and efficiency. Qualifications 7-8 years of experience in Analytics with exposure to Global Fortune 500 FMCG companies Ability to work and think independently Good analytics and insights experience - end-to-end understanding of the best research approach Can synthesize multiple, disparate data sources into compelling growth strategies. Formulates a strong POV and can articulate future scenarios and is an exceptional story-teller. Strong collaborator; Interested and motivated by working with others. Actively creates and participates in opportunities to co-create solutions across markets or brands; will be willing and able to embrace Responsive Ways of Working Proven analytics, data research experience, consumer insights experience or commercial experience in combination with strong analytical skills Good degree of familiarity with CPG and Food & Beverage industry data sources, including Nielsen (POS and HH panel), Kantar Worldpanel Deep understanding of FMCG industry business performance outputs and causal measures, their relationships, and how to bring business performance insights to life visually Proficient with PowerPoint and Advanced Excel; including ability to write complex formulas Ability to create macros and dashboards in Excel Good to have Experience: PowerBI and statistical analysis tool(s) Operational experience from business servicing sector and/or consulting experience would be a plus Fluent English communication skills Excellent communication skills, confident and credible 
with senior stakeholders Strong story-telling and presentation skills to turn data into impactful insight and brand strategy that can drive the business forward

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

noida

Work from Office

Architect and manage Databricks workspaces, clusters, and jobs for scalable data processing and ML workloads. Implement secure data lakehouse architectures with Delta Lake, Unity Catalog, and workspace isolation. Optimize performance and cost-efficiency of Databricks deployments across cloud providers. Implement centralized logging, monitoring, and alerting for Databricks and cloud resources. Ensure auditability and traceability of data access and transformations. Support governance frameworks with metadata management and data cataloging. Mandatory Competencies Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight Data Science and Machine Learning - Data Science and Machine Learning - Databricks Cloud - GCP - Cloud Functions Cloud - Azure - Azure Devops, Azure Pipelines, Azure CLI Beh - Communication and collaboration Cloud - AWS - Amazon IAM, AWS Secrets Manager, AWS KMS, AWS Cognito
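As an illustrative aside (not part of the posting), the cost-efficiency and governance duties described here often reduce to policy checks on cluster specifications. The sketch below uses plain Python; the field names follow the general shape of Databricks cluster specs, but the thresholds and tag names are hypothetical assumptions:

```python
# Illustrative governance check for Databricks-style cluster specs.
# Field names mirror the general clusters API shape; the policy
# thresholds and required tags here are hypothetical, not real defaults.

def validate_cluster_spec(spec: dict) -> list:
    """Return a list of policy violations for a cluster spec dict."""
    violations = []
    if spec.get("autotermination_minutes", 0) == 0:
        violations.append("auto-termination disabled")
    autoscale = spec.get("autoscale", {})
    if autoscale.get("max_workers", 0) > 20:
        violations.append("max_workers exceeds cost policy (20)")
    if not spec.get("custom_tags", {}).get("cost_center"):
        violations.append("missing cost_center tag")
    return violations

spec = {
    "spark_version": "14.3.x-scala2.12",
    "autoscale": {"min_workers": 2, "max_workers": 50},
    "autotermination_minutes": 60,
    "custom_tags": {"team": "data-eng"},
}
issues = validate_cluster_spec(spec)
```

In practice such checks would run against specs fetched from the workspace API or be enforced via cluster policies; the stand-alone function simply makes the policy logic explicit.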

Posted 1 week ago

Apply

8.0 - 12.0 years

25 - 35 Lacs

noida, hyderabad

Work from Office

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and other international locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, our employees, and the communities we operate in. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience. Our approach to building software is disciplined and quality-focused with an emphasis on creativity, craftsmanship and commitment. We are looking for smart, quality-minded individuals who want to be a part of a high-functioning, dynamic global team. Position Summary A PAM Engineer plays a critical role in managing and configuring R1's Privileged Access Management (PAM) platform and workflows. As a subject matter expert on PAM systems and workflows, a PAM Engineer ensures continued function and interoperability between PAM systems and identity providers within the environment. Additionally, the PAM Engineer plans, coordinates, and executes new PAM implementations throughout the organization with both organic and non-organic growth. Essential Responsibilities Interface with the organization to manage intake projects for new PAM implementations, including adding new environments or domains, adding controls to existing resources, and expanding the scope of users within PAM controls.
Document all new processes, including developing SOPs for UAA and product documentation for the IAM Platform team, continually ensuring accurate and up-to-date information is available for all target audiences. Manage the configuration of PAM systems and monitor their usage and effectiveness. Coordinate with vendors as needed to expedite solutions, issues, or projects. Provide Tier 1/Tier 2 troubleshooting for issues with PAM systems and workflows. Continually analyze data from multiple sources to identify efficiencies or issues with data, workflows, and processes. Interface with the UAA team to transfer knowledge and transition new workflows. Interface with the IAM Platform team to communicate/translate business needs into actions for the Platform Team to manage. Skills Excellent communication skills, both oral and written, with the ability to communicate effectively to customers, peers, and organizational leaders, especially in communicating technical concepts. Demonstrated logical thought processes; must have the ability to quickly learn new technologies, systems, concepts and procedures, and the ability to utilize reports and data to improve operational results. Experience in merging normalized and non-normalized data from multiple sources for analytical and targeted review. Experience in configuring and installing vault technologies such as Delinea Secret Server (Thycotic). Experience with Privileged Access Management concepts, controls, and best practices. Knowledge of data processing systems, concepts, and methodologies. Highly motivated and able to work autonomously as well as a part of a team. Other Qualifications Proficient computer skills (including, but not limited to, spreadsheets, Internet, and email) are required. Associate's or Bachelor's degree or equivalent professional experience preferred. Strong knowledge of Windows/Linux administration.
Understanding of incident and change management processes Demonstrated understanding of scripting and automation with PowerShell/Python a plus

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

noida

Work from Office

Data Services Architect to lead the design and implementation of data architecture solutions for a logistics enterprise. This role requires a good understanding of canonical architecture patterns, medallion (bronze-silver-gold) data architecture, and end-to-end data processing pipelines, enabling analytics-ready data from raw files stored in AWS S3, using Databricks as the core processing platform. The ideal candidate will collaborate closely with business stakeholders to build domain knowledge. Key Responsibilities: Canonical Architecture Design Medallion Architecture Implementation (Databricks) Raw Data Mapping and Transformation: Build ingestion pipelines to process raw files from AWS S3 into the Bronze layer Map and transform raw data into structured canonical formats aligned with logistics business rules Implement scalable DevOps pipelines using PySpark, Delta Lake, and Databricks Workflows. Business Engagement: Work closely with business SMEs, operations teams, and product managers to understand logistics processes, business entities, and KPIs. Translate business requirements into data models and semantic layers. Collaboration & Leadership: Guide data engineers in the development and maintenance of data pipelines. Provide architectural oversight and best practices for data processing and integration. Mandatory Competencies Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight Data Science and Machine Learning - Data Science and Machine Learning - Databricks Big Data - Big Data - Pyspark DevOps/Configuration Mgmt - Cloud Platforms - AWS Beh - Communication and collaboration
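For illustration only, the bronze-silver-gold flow described above can be sketched in plain Python rather than PySpark/Delta Lake; the record fields, validation rules, and KPI here are hypothetical, not the enterprise's actual schema:

```python
# Minimal stand-in for a bronze -> silver -> gold medallion flow.
# Plain Python in place of PySpark/Delta Lake; fields are hypothetical.

raw_bronze = [  # as-landed records from S3, untyped strings
    {"shipment_id": "S1", "weight_kg": "12.5", "status": "DELIVERED"},
    {"shipment_id": "S2", "weight_kg": "bad", "status": "IN_TRANSIT"},
    {"shipment_id": "S1", "weight_kg": "12.5", "status": "DELIVERED"},  # duplicate
]

def to_silver(rows):
    """Deduplicate, type-cast, and drop rows failing validation."""
    seen, silver = set(), []
    for r in rows:
        if r["shipment_id"] in seen:
            continue
        try:
            weight = float(r["weight_kg"])
        except ValueError:
            continue  # a real pipeline would quarantine, not drop
        seen.add(r["shipment_id"])
        silver.append({"shipment_id": r["shipment_id"],
                       "weight_kg": weight, "status": r["status"]})
    return silver

def to_gold(rows):
    """Aggregate a business-level KPI: total delivered weight."""
    return sum(r["weight_kg"] for r in rows if r["status"] == "DELIVERED")

silver = to_silver(raw_bronze)
gold_delivered_kg = to_gold(silver)
```

In the real Databricks implementation each stage would be a Delta table and the functions would be DataFrame transformations, but the layering logic (land raw, then cleanse and conform, then aggregate) is the same.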

Posted 1 week ago

Apply

2.0 - 4.0 years

7 - 8 Lacs

pune

Work from Office

Position Staff - Data Engineer Staff Designing and developing software components using various tools, viz. PySpark, Sqoop, Flume, Azure Databricks, etc. Perform detailed analysis and effectively interact with the onshore/offshore team members. Ensure all deliverables conform to the highest quality standards and are executed in a timely manner. Work independently with minimum supervision. The role is deadline oriented and may require working under a US time schedule. Identify areas of improvement and bring in change to streamline the work environment. Conducting performance tests. Consulting with the design team. Ensuring high performance of applications and providing support. Team player who works well with development/product engineering teams Problem solver, good at troubleshooting complex problems to find the root cause and provide solutions. Passionate developer who is committed to delivering high-quality solutions/products. Position Requirements - Staff 2-4 years of experience in the BCM or WAM industry; exposure to a US-based asset management or fund administration firm will be an add-on. Should have an understanding of data in the BCM/WAM space, well versed with KDEs (Funds, Positions, Transactions, Trial Balance, Securities, Investors, etc.) and their granularities. Should be strong in programming languages, namely Python. Should have hands-on experience with Big Data tools, namely PySpark, Sqoop, Hive and Hadoop Cluster. Should have hands-on experience with Cloud technologies, preferably Azure, and experience in working with Azure Databricks. Should be an expert working on databases, namely Oracle and SQL Server; exposure to Big Data is a plus. Knowledge of Data Visualization tools is a plus. Should be able to write programs to perform file/data validations, EDA and data cleansing. Should be highly data driven and able to write complex data transformation programs using PySpark, Python. Experience in data integration and data processing using Spark and Python.
Hands-on experience in creating real-time data streaming solutions using Spark Streaming, Flume. Experience in handling large data sets (in terabytes) and writing Spark jobs and Hive queries to perform data analysis. Experience working in an agile environment
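The file/data-validation programs this posting mentions can be illustrated with a small stdlib sketch; the column names (echoing the KDEs above) and the validation rules are hypothetical:

```python
# Illustrative file/data validation pass over a CSV extract.
# Column names and rules are hypothetical examples, not a real schema.
import csv
import io

REQUIRED = ["fund_id", "position_qty", "trade_date"]

def validate_rows(csv_text):
    """Yield (line_number, error) pairs for rows failing basic checks."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
    if missing:
        yield (0, "missing columns: %s" % missing)
        return
    for i, row in enumerate(reader, start=2):  # header is line 1
        if not row["fund_id"]:
            yield (i, "empty fund_id")
        try:
            float(row["position_qty"])
        except ValueError:
            yield (i, "non-numeric position_qty")

sample = "fund_id,position_qty,trade_date\nF001,100,2024-01-02\n,abc,2024-01-03\n"
errors = list(validate_rows(sample))
```

At terabyte scale the same checks would be expressed as PySpark column predicates rather than a row loop, but the rule set is identical.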

Posted 1 week ago

Apply

2.0 - 5.0 years

16 - 17 Lacs

mumbai

Work from Office

Understand business requirements by engaging with business teams Data extraction from valuable data sources & automating the data collection process Data processing, cleaning and validating the integrity of data to be used for analysis Exploratory data analysis to identify trends and patterns in large amounts of data Build machine learning based models using algorithms and statistical techniques like Regression, Decision Trees, Boosting, etc. Present insights using data visualization techniques Propose solutions and strategies to various complex business challenges Build GenAI models using RAG frameworks for chatbots, summarisation, etc. Develop model deployment pipelines using Lambda, ECS, etc. Skills & Attributes Knowledge of statistical programming languages like R, Python and database query languages like SQL and statistical tests like distributions, regression, etc. Experience in data visualization tools like Tableau, QlikSense, etc. Ability to write comprehensive reports, with an analytical mind and an inclination for problem-solving Exposure to advanced techniques like GenAI, neural networks, NLP, image and speech processing Ability to engage with stakeholders to understand business requirements and convert the same into technical problems for solution development and deployment
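Of the techniques listed, simple linear regression is the easiest to sketch end to end. The following stdlib-only example (on synthetic data) fits ordinary least squares for one predictor; in practice a library such as scikit-learn or statsmodels would be used instead:

```python
# Stdlib-only ordinary least squares for y = intercept + slope * x.
# The data below is synthetic, chosen so the fit is exact.

def fit_simple_ols(xs, ys):
    """Return (intercept, slope) minimising the sum of squared errors."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return mean_y - slope * mean_x, slope

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # exactly y = 1 + 2x
intercept, slope = fit_simple_ols(xs, ys)
```

The closed-form slope is the covariance of x and y divided by the variance of x, which is what the `sxy / sxx` line computes.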

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

pune

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : PySpark Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : An Engineering Graduate, preferably a computer science graduate, 15 years of full-time education Summary : As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will also perform maintenance, enhancements, and/or development work. This role requires a strong understanding of software development principles and the ability to work independently and as part of a team. You will have the opportunity to contribute to the success of our clients by delivering high-quality software solutions. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work-related problems.- Collaborate with cross-functional teams to analyze, design, and develop software solutions.- Write clean, efficient, and maintainable code that meets the project requirements.- Perform unit testing and debugging to ensure the quality and stability of the software.- Participate in code reviews to provide feedback and ensure adherence to coding standards.- Identify and resolve technical issues and bugs in a timely manner.- Stay up-to-date with the latest industry trends and technologies to continuously improve skills and knowledge.
Professional & Technical Skills: - Must Have Skills: Proficiency in PySpark.- Good To Have Skills: Experience with data processing frameworks like Apache Spark.- Strong understanding of software development principles and best practices.- Experience with distributed computing and parallel processing.- Knowledge of SQL and relational databases.- Familiarity with version control systems like Git.- Excellent problem-solving and analytical skills.- Ability to work in a fast-paced and dynamic environment. Additional Information:- The candidate should have a minimum of 3 years of experience in PySpark.- This position is based at our Pune office.- An Engineering Graduate, preferably a computer science graduate, with 15 years of full-time education is required. Qualification An Engineering Graduate, preferably a computer science graduate, 15 years of full-time education

Posted 1 week ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

coimbatore

Work from Office

Project Role : AI / ML Engineer Project Role Description : Develops applications and systems that utilize AI tools, Cloud AI services, with a proper cloud or on-prem application pipeline with production-ready quality. Be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, image processing. Must have skills : Google Cloud Machine Learning Services Good to have skills : GCP Dataflow, Google Pub/Sub, Google Dataproc Minimum 2 year(s) of experience is required Educational Qualification : 15 years full-time education Summary : We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases. Roles & Responsibilities: Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage. Optimize and monitor data workflows for performance, scalability, and reliability. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions. Implement data security and governance measures, ensuring compliance with industry standards. Automate data workflows and processes for operational efficiency. Troubleshoot and resolve technical issues related to data pipelines and platforms. Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing. Professional & Technical Skills: a) Must Have: Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
Expertise in SQL and experience with data modeling and query optimization. Solid programming skills in Python for data processing and ETL development. Experience with CI/CD pipelines and version control systems (e.g., Git). Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming. Strong understanding of data security, encryption, and IAM policies on GCP. b) Good to Have: Experience with Dialogflow or CCAI tools Knowledge of machine learning pipelines and integration with AI/ML services on GCP. Certifications such as Google Professional Data Engineer or Google Cloud Architect. Additional Information: - The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services and overall experience of 3-5 years - The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions. Qualifications 15 years full-time education
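A single transform step of the kind a Dataflow/Beam pipeline would apply can be illustrated as a plain function with no GCP dependencies; the event schema and normalisation rules below are hypothetical:

```python
# Illustrative per-record transform, written as a plain function of the
# kind a Dataflow/Beam step would wrap. No GCP dependencies; the event
# fields ("user_id", "amount") are hypothetical.
import json

def transform_event(raw):
    """Parse a raw JSON event, drop malformed input, normalise fields."""
    try:
        event = json.loads(raw)
    except json.JSONDecodeError:
        return None  # in a real pipeline, route to a dead-letter sink
    if "user_id" not in event:
        return None
    return {
        "user_id": str(event["user_id"]),
        "amount": round(float(event.get("amount", 0)), 2),
    }

records = ['{"user_id": 7, "amount": 12.5}', "not json"]
clean = [r for r in (transform_event(x) for x in records) if r]
```

Keeping the transform a pure function makes it unit-testable locally before it is wrapped in a streaming pipeline reading from Pub/Sub and writing to BigQuery.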

Posted 1 week ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

gurugram

Work from Office

We are seeking a highly skilled Technical Project Manager (TPM) to lead process optimization and automation efforts within our GIS team. The ideal candidate will be responsible for designing and implementing efficient workflows. Required Candidate profile: 5-7+ years of experience in technical project management, process optimization, or automation within data-intensive environments. Strong understanding of large-scale data workflows, data ingestion

Posted 1 week ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

gurugram

Work from Office

Overview: Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth. Data & AI Engineers at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next generation technologies and solutions. Our engineers design and build large scale data storage, computation and distribution systems. They partner with data and AI experts to deliver high quality AI solutions and derived data to our consumers. We are looking for Data & AI Engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Engineers looking to work in the areas of orchestration, data modelling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates. Responsibilities As a Data Engineer, you will be joining a Data & AI team transforming our global financial network and improving the quality of the products and services we provide to our customers, and you will be responsible for designing, implementing, and maintaining data pipelines and systems to support the organization's data needs. Your role will involve collaborating with data scientists, analysts, and other stakeholders to ensure data accuracy, reliability, and accessibility.
Key Responsibilities: Data Pipeline Development: Design, build, and maintain scalable and efficient data pipelines to collect, process, and store structured and unstructured data from various sources. Data Integration: Integrate data from multiple sources such as databases, APIs, flat files, and streaming platforms into centralized data repositories. Data Modeling: Develop and optimize data models and schemas to support analytical and operational requirements. Implement data transformation and aggregation processes as needed. Data Quality Assurance: Implement data validation and quality assurance processes to ensure the accuracy, completeness, and consistency of data throughout its lifecycle. Performance Optimization: Monitor and optimize data processing and storage systems for performance, reliability, and cost-effectiveness. Identify and resolve bottlenecks and inefficiencies in data pipelines and leverage Automation and AI to improve overall Operations. Infrastructure Management: Manage and configure cloud-based or on-premises infrastructure components such as databases, data warehouses, compute clusters, and data processing frameworks. Collaboration: Collaborate with cross-functional teams including data scientists, analysts, software engineers, and business stakeholders to understand data requirements and deliver solutions that meet business objectives. Documentation and Best Practices: Document data pipelines, systems architecture, and best practices for data engineering. Share knowledge and provide guidance to colleagues on data engineering principles and techniques. Continuous Improvement: Stay updated with the latest technologies, tools, and trends in data engineering and recommend improvements to existing processes and systems. Qualifications and Skills: Bachelor's degree or higher in Computer Science, Engineering, or a related field.
Proven experience in data engineering or related roles, with a strong understanding of data processing concepts and technologies. Mastery of programming languages such as Python, Java, or Scala. Knowledge of database systems such as SQL, NoSQL, and data warehousing solutions. Knowledge of stream processing technologies such as Kafka or Apache Beam. Experience with distributed computing frameworks such as Apache Spark, Hadoop, or Apache Flink. Experience deploying pipelines in cloud platforms such as AWS, Azure, or Google Cloud Platform. Experience in implementing enterprise systems in production settings for AI and natural language processing. Exposure to self-supervised learning, transfer learning, and reinforcement learning is a plus. Have full stack experience to build the best fit solutions leveraging Large Language Models (LLMs) and Generative AI solutions with a focus on privacy, security, and fairness. Have good engineering skills to design the output from the AI with nodes and nested nodes in JSON or array, HTML formats for as-is consumption and display on dashboards/portals. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Experience with containerization and orchestration tools such as Docker and Kubernetes. Familiarity with data visualization tools such as Tableau or Power BI.
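The "nodes and nested nodes in JSON" output shaping mentioned above can be sketched as a small grouping function; the identifiers and metric names here are hypothetical examples, not NCR Atleos schemas:

```python
# Illustrative shaping of flat model outputs into nested JSON "nodes"
# for dashboard consumption. Identifiers and metric names are hypothetical.
import json

def to_dashboard_nodes(predictions):
    """Group flat (entity_id, metric, value) rows into nested nodes."""
    nodes = {}
    for entity_id, metric, value in predictions:
        node = nodes.setdefault(entity_id, {"id": entity_id, "metrics": {}})
        node["metrics"][metric] = value
    return {"nodes": list(nodes.values())}

rows = [
    ("ATM-1", "cash_out_risk", 0.82),
    ("ATM-1", "uptime", 0.99),
    ("ATM-2", "uptime", 0.97),
]
payload = to_dashboard_nodes(rows)
doc = json.dumps(payload)  # serialised for the dashboard/portal API
```

The nested form lets a dashboard render one card per node without re-joining flat rows client-side.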

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

chennai

Work from Office

Career Area: Technology, Digital and Data Job Description: Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. Role Definition The AI Engineer will perform analytical tasks and initiatives on large datasets to support data-driven business decisions and development. This role involves leveraging AI technologies like GenAI and Deep Learning to enhance business processes and outcomes. Responsibilities Developing and implementing AI models, including deep learning and neural networks, using AI libraries such as Keras, TensorFlow, and PyTorch. Exploring and applying Generative AI (Gen AI) and Large Language Models (LLMs) to business problems. Designing and implementing data engineering pipelines to support machine learning workflows. Directing the data gathering, data mining, and data processing processes in large volumes; creating appropriate data models. Conducting research on data model optimisation and algorithms to improve effectiveness and accuracy in data analyses. Deploying machine learning models into production environments and ensuring their scalability and reliability. Performing hands-on coding and development to build and optimise AI solutions. Staying updated with the latest advancements in AI technologies and incorporating them into existing projects. Collaborating with cross-functional teams to identify opportunities for AI-driven innovation and improvement.
Skill Descriptors Machine Learning : Knowledge of principles, technologies, and algorithms of machine learning; ability to develop, implement, and deliver related systems, products, and services. Completes specific tasks and initiatives utilising machine learning technologies, such as search engine optimisation. Utilises specific tools and techniques to process descriptive and inferential statistics. Applies specific computing languages and tools in machine learning, such as R and Python. Explores the use of machine learning in one's own areas to make business improvements. Conducts data mining and cleaning initiatives. Analytical Thinking : Knowledge of techniques and tools that promote effective analysis; ability to determine the root cause of organisational problems and create alternative solutions that resolve these problems. Approaches a situation or problem by defining the problem or issue and determining its significance. Makes a systematic comparison of two or more alternative solutions. Uses flow charts, Pareto charts, fish diagrams, etc. to disclose meaningful data patterns. Identifies the major forces, events, and people impacting and impacted by the situation at hand. Uses logic and intuition to make inferences about the meaning of the data and arrive at conclusions. Programming Languages : Knowledge of basic concepts and capabilities of programming; ability to use tools, techniques, and platforms in order to write and modify programming languages. Participates in the implementation and support of specialised programming languages. Conducts basic reviews on writing a specific programming language within a specific platform. Assists with the design and development of specialised programming languages. Follows an organisation's standards, policies, and guidelines for structured programming specifications. Diagnoses and reports minor or routine programming language problems.
AI and Deep Learning : AI experience with deep learning, neural networks, AI libraries like Keras, TensorFlow, PyTorch, and exposure to Generative AI (Gen AI) and Large Language Models (LLMs). Develops and implements deep learning models and neural networks. Utilises AI libraries such as Keras, TensorFlow, and PyTorch for model development. Explores and applies Generative AI and Large Language Models to solve business problems. Conducts research on advanced machine learning techniques to enhance model performance and accuracy. Preferred Languages and Tools Programming Languages: Python, SQL AI Libraries: Keras, TensorFlow, PyTorch Data Processing Tools: Apache Spark Cloud Platforms: AWS Data Visualisation Tools: Power BI Version Control: Git/GitHub Relocation is available for this position.
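The frameworks named here (Keras, TensorFlow, PyTorch) are too heavy for an inline snippet, so the following stdlib sketch shows only the underlying idea: the forward pass of a single dense neuron with a sigmoid activation. The weights are arbitrary illustrative values:

```python
# Stdlib sketch of a neural-network building block: one dense neuron
# with a sigmoid activation. Weights and inputs are arbitrary examples.
import math

def dense_sigmoid(x, weights, bias):
    """Compute sigmoid(w . x + b) for a single neuron."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With w.x + b = 0.5*1.0 + (-0.25)*2.0 + 0.0 = 0, sigmoid gives 0.5.
y = dense_sigmoid([1.0, 2.0], weights=[0.5, -0.25], bias=0.0)
```

A framework layer such as Keras's `Dense` applies the same weighted sum and activation to whole batches at once, with the weights learned by backpropagation rather than fixed.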

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

bengaluru

Work from Office

GDS (Group Data Services) leads Swiss Re's ambition to be a truly data-driven risk knowledge company. GDS brings expertise and experience covering all aspects of data and analytics to enable Swiss Re in its vision to make the world more resilient. The Data Platform Engineering team deals with data & analytics platforms for enabling the creation of innovative solutions to data-driven business needs. It also enables Swiss Re Group to efficiently utilize the platforms and ensures their availability and proper functioning. The Opportunity Are you excited about the prospect of joining Swiss Re's mission to become a truly data-driven risk company? We are the GDS Machine Learning Operations team, and we are looking for a highly skilled and motivated Machine Learning Engineer to join us. In this role, you will be instrumental in building, optimizing, and maintaining our machine learning models. You will collaborate closely with our business groups to anticipate emerging client needs, respond to user inquiries, and resolve their issues. Your expertise will be crucial in empowering our users to adopt MLOps and LLMOps best practices, all while ensuring our systems remain both secure and cost-effective. As our new colleague, you will thrive in an agile environment, collaborating closely with peers, internal experts, and business clients to support, organize, and manage various activities within the team. Your contributions will be key to driving our data-driven initiatives forward and enhancing our risk management capabilities. What you will work on during your first year at Swiss Re: Key Responsibilities: Model Development and Maintenance: Design, develop, test, deploy, and retrain machine learning models and algorithms to solve complex business problems in collaboration with data scientists and other engineers. Data Processing: Implement big data processing workflows and pipelines to handle large-scale datasets efficiently.
MLOps & LLMOps: Promote and implement MLOps and LLMOps best practices to streamline model deployment, monitoring, and maintenance. Platform Management: Maintain and enhance our data science and machine learning platforms to ensure high performance and reliability. Collaboration: Work closely with business stakeholders to understand their needs, provide insights, and deliver tailored ML solutions. Security & Cost Management: Ensure that all systems and solutions are secure and cost-effective. About You: Essentials The following list represents our ideal candidate's profile. We know it is unlikely you meet 100% of our criteria. The more boxes you can check the better; however, ultimately willingness to keep learning is key. Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field preferred. Experience: 1+ years of proven experience in machine learning model development, deployment, and maintenance. Some experience with large language models is a plus. Technical Skills: Proficiency in Python, R, or similar programming languages. Experience with ML frameworks like TensorFlow, PyTorch, or Scikit-learn. Big Data Technologies: Familiarity with big data processing tools, esp. Spark or similar. MLOps & LLMOps: Knowledge of MLOps and LLMOps practices and tools such as Docker, Kubernetes, MLflow, etc. Palantir Foundry: Willingness to accustom yourself to Swiss Re's strategic data management platform Palantir Foundry. Prior experience with the platform is a plus. Analytical Skills: Strong analytical and problem-solving skills with the ability to work with complex datasets. Communication: Excellent communication skills to interact effectively with both technical and non-technical stakeholders. Ideally you have some prior experience with the (re)insurance or financial services industry Behavioural Competences We are working as one team operating from three different countries: India, Slovakia and Switzerland.
Our company's internal customer base is spread across all continents. Given that you will interact with users every day, strong customer orientation, enjoyment of communicating with users, and a commitment to quality and timeliness are important! Furthermore, the entire field of technologies we work with, such as cloud technologies, Generative AI, and Natural Language Processing, is evolving rapidly. You should have the curiosity to explore modern technologies, and the persistence to make them accessible to a larger community even while they are still evolving. Last, but not least, enjoying our work is just as important as achieving results.
About Swiss Re
If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Keywords:
Reference Code: 135224
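The MLOps practices this role lists (model versioning, deployment, and integrity monitoring) can be illustrated in miniature with the Python standard library alone. The registry layout, function names, and metadata fields below are illustrative assumptions for the sketch, not the API of any specific tool such as MLflow:

```python
import hashlib
import json
import pickle
import tempfile
from pathlib import Path

def register_model(model, name, version, registry_dir):
    """Persist a model artifact alongside JSON metadata, including a
    content hash so deployments can verify artifact integrity."""
    registry = Path(registry_dir) / name / version
    registry.mkdir(parents=True, exist_ok=True)
    artifact = registry / "model.pkl"
    artifact.write_bytes(pickle.dumps(model))
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
    meta = {"name": name, "version": version, "sha256": digest}
    (registry / "meta.json").write_text(json.dumps(meta))
    return meta

def load_model(name, version, registry_dir):
    """Reload a registered model, refusing artifacts whose hash changed."""
    registry = Path(registry_dir) / name / version
    meta = json.loads((registry / "meta.json").read_text())
    blob = (registry / "model.pkl").read_bytes()
    assert hashlib.sha256(blob).hexdigest() == meta["sha256"], "artifact corrupted"
    return pickle.loads(blob)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        # A stand-in "model": coefficients of a hypothetical linear scorer.
        model = {"weights": [0.4, 0.6], "bias": -0.1}
        register_model(model, "churn-scorer", "v1", d)
        print(load_model("churn-scorer", "v1", d) == model)  # True
```

Production registries add access control, lineage, and stage transitions, but the core contract is the same: an immutable, versioned artifact plus machine-readable metadata.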

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Job Purpose and Impact
The Director, Data Engineering leads a data engineering team responsible for the execution of the tactical and strategic plans related to the design, development and maintenance of robust data pipelines and solutions. This job provides guidance to the team that ensures the efficient processing and availability of data for analysis and reporting.
Key Accountabilities
Establishes and maintains robust data systems that support large and complex data products, ensuring reliability and accessibility for partners.
Leads the development of technical products and solutions using big data and cloud-based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
Oversees and guides the design and development of data pipelines that facilitate the movement of data from various sources to internal databases.
Handles the construction and optimization of data infrastructure, determining appropriate data formats to ensure data readiness for analysis.
Examines and settles on appropriate data formats to optimize data usability and accessibility across the organization.
Liaises with partners to understand data needs and ensure alignment with organizational objectives.
Champions development standards and brings forward prototypes to test new data framework concepts and architecture patterns, supporting efficient data processing and analysis and promoting standard methodologies in data management.
Leads the creation and maintenance of automated reporting systems that provide timely insights and facilitate data-driven decision making.
Oversees data modeling to ensure the preparation of data in databases for use in various analytics tools, and configures and develops data pipelines to move and improve data assets.
Manages team members to achieve the organization's goals by ensuring productivity, communicating performance expectations, creating goal alignment, giving and seeking feedback, providing coaching, measuring progress and holding people accountable, supporting employee development, recognizing achievement and lessons learned, and developing enabling conditions for talent to thrive in an inclusive team culture.
Qualifications
Minimum requirement of 6 years of relevant work experience. Typically reflects 10 years or more of relevant experience.
Preferred Work Experience
Prior experience as a data/software engineer performing data modeling and data pipeline engineering leveraging advanced cloud technologies and diverse coding languages.
Leading geographically distributed engineering teams across a large global organization.
Developing and managing strategic partnerships across both digital and business-facing stakeholders.
Track record of leading architecture strategies and execution across a diverse digital and data technology landscape.
Experience developing and leading transformation strategies regarding people, process, and technology.
Thorough understanding of industry trends and best practices related to data engineering of robust, performant, and cost-effective solutions.
Proven record of helping drive the adoption of new technologies and methods within the functional data and analytics team, serving as a role model and mentor for data engineers.
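The pipeline accountabilities above (moving data from varied sources into internal databases while settling on consistent formats) reduce, at their smallest, to an extract-transform-load loop. A minimal sketch, assuming a hypothetical CSV shipment feed and an in-memory SQLite target; the column names and cleaning rules are illustrative only:

```python
import csv
import io
import sqlite3

# Hypothetical source: CSV shipment records from an upstream system.
RAW = """shipment_id,weight_kg,origin
S1,120.5,PUNE
S2,98.0,HYD
S3,not_a_number,BLR
"""

def extract(text):
    """Parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Settle on a consistent format: coerce weights to float, drop
    unparseable rows, and normalise origin codes for analysis."""
    clean = []
    for r in rows:
        try:
            clean.append((r["shipment_id"], float(r["weight_kg"]), r["origin"].upper()))
        except ValueError:
            continue  # a real pipeline would quarantine bad records
    return clean

def load(rows, conn):
    """Land the cleaned rows in the internal database."""
    conn.execute("CREATE TABLE IF NOT EXISTS shipments (id TEXT, weight_kg REAL, origin TEXT)")
    conn.executemany("INSERT INTO shipments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM shipments").fetchone()[0])  # 2
```

At scale the same three stages are distributed across Spark jobs or cloud services, but the design questions the listing names (format resolution, data readiness, quarantining bad data) appear already at this size.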

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Description
Xperi invents, develops and delivers technologies that create extraordinary experiences at home and on the go for millions of people around the world. Powering billions of consumer electronics, connected cars and digital content titles, we make entertainment more immersive and every interaction more intelligent and seamlessly personalized through our renowned consumer brands: DTS, HD Radio and TiVo. Xperi (NYSE: XPER) is a publicly traded technology company headquartered in San Jose, CA with over 2,000 employees across North America, Europe and Asia. Come join a thriving team where you can play an integral role in shaping the future of entertainment technology.
About the role:
Join a team that is dedicated to creating world-class development and deployment environments that are highly scalable and resilient. As Senior Software Engineer, TiVo, you'll play a truly rewarding part in delivering our Personalized Content Discovery (PCD) platform, an industry-leading SaaS offering. Enjoy the opportunity to draw on your passion for problem-solving and simplifying tasks, as well as your technical skill set, to make important contributions to the platform. You'll collaborate daily in an agile development environment with an extended team of experienced engineers. The PCD platform is central to the expanding TiVo Stream 4K product and powers several other video search and recommendations experiences for our partners around the world. The Senior Software Engineer is a key role in TiVo's growing and dynamic Discovery organization. The PCD team handles the challenges of creating scalable search frameworks and machine-learning models for our customers. This technical role is focused on developing and deploying cloud-based offerings using a wide range of tools and frameworks, automating operational tasks, improving personalization modeling and success, and working with development and operational teams to solve complex problems.
Basic understanding of AI concepts or interest in AI tools and technologies. We're looking for individuals who are curious about AI, eager to learn, and aim to integrate it into their work in alignment with organizational policies.
What you will get to do:
Drive technical and architectural excellence across PCD offerings. Imagine, design and develop new features for our search and recommendations platform. Utilize and promote sound development practices (requirements gathering, design reviews, code reviews, retrospective meetings, etc.). Adhere to core design and testing principles set by team and group leadership. Identify and automate repetitive operational tasks at all stages of the software lifecycle. Build tools and systems to increase operational transparency and monitoring of SaaS products across Xperi.
Who we are looking for:
Must have: Strong expertise in Java and OO design. Experience designing and developing large software systems. Experience with JSON and REST.
Nice to have: Stream processing (Kafka Streams). NoSQL databases, key-value stores and other data-structure solutions (e.g., Dynamo, Cassandra, MongoDB). Continuous Integration platforms (Jenkins). Offline data processing (PySpark, Airflow). Virtualization and container orchestration (Docker, Kubernetes). Monitoring and logging tools (Prometheus, ELK). Virtual application and web servers (Apache, NGINX). Cloud infrastructure (AWS).
What will make you successful:
The ability to propose, design and develop solutions that scale. Keen troubleshooting skills and practiced agile development methodology. Excellent written and oral communication skills. Expert problem-solving skills.
Life @ Xperi:
At Xperi, we value People, Customers, Performance and Innovation. We are dedicated to creating a workplace where all employees have a voice and sense of belonging, feel safe and valued, and are acknowledged for how their unique differences contribute to organizational culture and business outcomes.
Our employees and their families are important to us, and our comprehensive pay, stock and benefits programs reflect that. Xperi supports personal well-being, builds financial security and enables employees to share in our collective success. Rewards include:

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

About us
Maersk is a global leader in integrated logistics and has been an industry pioneer for over a century. Through innovation and transformation, we are redefining the boundaries of possibility, continuously setting new standards for efficiency, sustainability, and excellence. With over 100,000 employees across 130 countries, we work together to shape the future of global trade and logistics. Join us as we harness cutting-edge technologies and unlock opportunities on a global scale. Together, let's sail towards a brighter, more sustainable future with Maersk.
Are you passionate about driving innovation and upskilling teams in the world of data analytics and reporting? Join our dynamic team as a Consultant, Reporting and Technology Enablement and play a pivotal role in enhancing our reporting capabilities while adopting cutting-edge technologies like Databricks. This is a unique opportunity to contribute to the development and success of finance reporting solutions for both headquarter and frontline teams.
About the Role
Success in this role will be defined by the ability to deliver impactful results, including increasing the number of automated reports, driving the adoption of innovative technologies, and reducing the time and cost spent on reporting processes. As a consultant, you will focus on strengthening the technical capabilities of our reporting teams, leading impactful projects, and introducing innovative tools and methodologies. You will collaborate closely with the report development teams to deliver high-quality solutions while automating processes and ensuring efficiency across our financial reporting landscape.
Key Responsibilities
Team Upskilling and Mentorship: Deliver targeted training sessions to enhance the skills of the reporting team in tools such as Power BI, Excel, Power Query, SQL, and Python. Mentor team members and share best practices to ensure the team's success in supporting the finance organization.
End-to-End Project Ownership: Lead the design, development, and delivery of reporting and analytics projects tailored to the needs of HQ and frontline finance teams. Manage all phases of project development, including gathering requirements, data modeling, visualization design, testing, and deployment. Engage with stakeholders on a project basis to ensure successful outcomes.
Technology Adoption and Innovation: Drive the adoption and integration of new technologies, such as Databricks, into reporting workflows to enhance data processing and analytics capabilities. Evaluate and recommend tools and solutions to improve reporting efficiency and enable advanced financial analytics. Serve as a subject matter expert for Power BI, Databricks, SQL, Python, and emerging technologies.
Automation and Maintenance Support: Collaborate with the maintenance/run teams to automate and streamline the refresh and maintenance of reports, leveraging SQL and Python for optimized processes. Develop scalable solutions to improve the sustainability of reporting infrastructure. Troubleshoot and resolve technical issues, ensuring minimal disruption to operations.
What We're Looking For
Expertise in Power BI, Excel and Power Query, with a strong focus on financial reporting and Business Intelligence (BI). Experience writing scripts in SQL, Python, Scala, R, DAX and MDX. Proficiency in using Databricks, Dremio and other data technology platforms for advanced analytics and reporting. Experience in report automation and data pipeline optimization. Strong communication, problem-solving, and project management skills. A proactive and collaborative mindset, with the ability to work independently and in teams.
Qualifications
Master's degree in Finance, Engineering, Technology, or a related field. Background in finance, data analytics, or business intelligence. Prior experience in training or upskilling teams. Familiarity with Agile or similar project management methodologies.
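The automated report refresh this role describes, combining SQL and Python, can be sketched as a rerunnable function that recomputes a summary table from source data. The table names, schema, and figures below are illustrative assumptions, using an in-memory SQLite database in place of a real reporting warehouse:

```python
import sqlite3

def refresh_report(conn):
    """Recompute the summary table from the source data so the report
    always reflects the latest transactions; safe to rerun on a schedule."""
    conn.executescript("""
        DROP TABLE IF EXISTS monthly_summary;
        CREATE TABLE monthly_summary AS
        SELECT month, SUM(amount) AS total, COUNT(*) AS n
        FROM transactions
        GROUP BY month;
    """)
    return conn.execute(
        "SELECT month, total, n FROM monthly_summary ORDER BY month"
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("2024-01", 100.0), ("2024-01", 50.0), ("2024-02", 75.0)],
)
print(refresh_report(conn))  # [('2024-01', 150.0, 2), ('2024-02', 75.0, 1)]
```

Making the refresh idempotent (drop and rebuild, or merge) is what lets a scheduler or orchestration tool run it unattended, which is the heart of moving from manual to automated reporting.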
What We Offer An opportunity to work with a forward-thinking team driving innovation in reporting. A supportive environment for professional growth and development. A chance to work with advanced technologies and make a tangible impact on financial reporting and performance management processes.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies