
6633 Databricks Jobs - Page 7

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

5.0 - 8.0 years

0 Lacs

Delhi, India

On-site

Job Summary
We are looking for a skilled Data Modeler / Architect with 5-8 years of experience in designing, implementing, and optimizing robust data architectures in the financial payments industry. The ideal candidate will have deep expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms such as Databricks or Snowflake. You will play a key role in designing scalable data models, orchestrating reliable data workflows, and ensuring the integrity and performance of mission-critical financial datasets. This is a highly collaborative role interfacing with engineering, analytics, product, and compliance teams.

Key Responsibilities
- Design, implement, and maintain logical and physical data models to support transactional, analytical, and reporting systems.
- Develop and manage scalable ETL/ELT pipelines for processing large volumes of financial transaction data.
- Tune and optimize SQL queries, stored procedures, and data transformations for maximum performance.
- Build and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
- Architect data lakes and warehouses using platforms like Databricks, Snowflake, BigQuery, or Redshift.
- Enforce and uphold data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
- Collaborate closely with data engineers, analysts, and business stakeholders to understand data needs and deliver solutions.
- Conduct data profiling, validation, and quality assurance to ensure clean and consistent data.
- Maintain clear and comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications
- 5-8 years of experience as a Data Modeler, Data Architect, or Senior Data Engineer in the financial/payments domain.
- Advanced SQL expertise, including query tuning, indexing, and performance optimization.
- Proficiency in developing ETL/ELT workflows using tools such as Spark, dbt, Talend, or Informatica.
- Experience with data orchestration frameworks: Airflow, Dagster, Luigi, etc.
- Strong hands-on experience with cloud-based data platforms like Databricks, Snowflake, or equivalents.
- Deep understanding of data warehousing principles: star/snowflake schema, slowly changing dimensions, etc. (see the sketch after this listing).
- Familiarity with financial data structures, such as payment transactions, reconciliation, fraud patterns, and audit trails.
- Working knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
- Strong analytical thinking and problem-solving capabilities in high-scale environments.

Preferred Qualifications
- Experience with real-time data pipelines (e.g., Kafka, Spark Streaming).
- Exposure to data mesh or data fabric architecture paradigms.
- Certifications in Snowflake, Databricks, or relevant cloud platforms.
- Knowledge of Python or Scala for data engineering tasks.
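Since the listing leans on warehousing patterns like slowly changing dimensions, a minimal Type 2 SCD sketch on Databricks may help ground the terminology. It is written in PySpark with Delta Lake; the table names (`dim_customer`, `stg_customer`) and the tracked `address` column are illustrative assumptions, not details from the posting.

```python
# Minimal SCD Type 2 sketch using Delta Lake MERGE on Databricks.
# Assumes illustrative tables: dim_customer (current dimension rows with
# is_current/effective_from/effective_to) and stg_customer (today's snapshot).
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

staged = spark.table("stg_customer").withColumn("effective_from", F.current_date())
dim = DeltaTable.forName(spark, "dim_customer")

# Step 1: close out current rows whose tracked attribute changed.
(dim.alias("d")
    .merge(staged.alias("s"), "d.customer_id = s.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> s.address",
        set={"is_current": "false", "effective_to": "s.effective_from"})
    .execute())

# Step 2: append new versions (and brand-new keys) as current rows.
new_rows = (staged.alias("s")
    .join(spark.table("dim_customer").alias("d"),
          (F.col("s.customer_id") == F.col("d.customer_id")) & F.col("d.is_current"),
          "left_anti")
    .withColumn("is_current", F.lit(True))
    .withColumn("effective_to", F.lit(None).cast("date")))
new_rows.write.format("delta").mode("append").saveAsTable("dim_customer")
```

Step 1 closes out changed current rows; step 2 appends the new versions, so history is preserved rather than overwritten.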

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Bridgnext, you will be responsible for working on internal and customer-based projects. Your primary focus will be on ensuring the quality of the code and providing optimal solutions to meet client requirements while anticipating their future needs based on market understanding. Your experience with Hadoop projects, including data processing and representation using various AWS services, will be valuable in this role.

You should have at least 4 years of experience in data engineering, with a specialization in big data technologies such as Spark and Kafka. A minimum of 2 years of hands-on experience with Databricks is essential for this position. A strong understanding of data architecture, ETL processes, and data warehousing is necessary, along with proficiency in programming languages like Python or Java. Experience with cloud platforms such as AWS, Azure, and GCP, as well as familiarity with big data tools, will be beneficial.

Excellent communication, interpersonal, and leadership skills are required to effectively collaborate with team members and clients. You should be able to work in a fast-paced environment, managing multiple priorities efficiently. In addition to technical skills, you should possess solid written, verbal, and presentation communication abilities. Being a strong team player while also being capable of working independently is crucial. Maintaining composure in various situations, a collaborative nature, high standards of professionalism, and consistently delivering high-quality results are expected from you. Your self-sufficiency and openness to creative solutions will be key in addressing any challenges that may arise in the role.
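The Spark-plus-Kafka requirement above translates into a short structured-streaming sketch. The broker address, topic name, and output paths below are placeholders rather than anything specified by the employer.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> Delta on Databricks.
# Broker, topic, and paths are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(F.col("key").cast("string"),
            F.col("value").cast("string").alias("payload"),
            "timestamp"))

query = (events.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder
    .outputMode("append")
    .start("/tmp/delta/events"))                              # placeholder
query.awaitTermination()
```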

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an Azure Data Engineer with expertise in Microsoft Fabric and modern data platform components, you will be responsible for designing, developing, and managing end-to-end data pipelines on Azure Cloud. Your primary focus will be on ensuring performance, scalability, and delivering business value through efficient data solutions. You will collaborate with various teams to define data requirements and implement data ingestion, transformation, and modeling pipelines supporting structured and unstructured data. Additionally, you will work with Azure Synapse, Data Lake, Data Factory, Databricks, and Power BI for seamless data integration and reporting.

Your role will involve optimizing data performance and cost through efficient architecture and coding practices, and ensuring data security, privacy, and compliance with organizational policies. Monitoring, troubleshooting, and improving data workflows for reliability and performance will also be part of your responsibilities.

To excel in this role, you should have 5 to 7 years of experience as a Data Engineer, with at least 2 years working on the Azure Data Stack. Hands-on experience with Microsoft Fabric, Azure Synapse Analytics, Data Factory, Data Lake, SQL Server, and Power BI integration is crucial. Strong skills in data modeling, ETL/ELT design, and performance tuning are required, along with proficiency in SQL and Python/PySpark scripting. Experience with CI/CD pipelines and DevOps practices for data solutions, an understanding of data governance, security, and compliance frameworks, and excellent communication, problem-solving, and stakeholder management skills are essential for success in this role. A Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field is preferred.

Microsoft Azure Data Engineer Certification (DP-203), experience in real-time streaming (e.g., Azure Stream Analytics or Event Hubs), and exposure to Power BI semantic models and Direct Lake mode in Microsoft Fabric would be advantageous.

Join us to work with the latest in Microsoft's modern data stack, Microsoft Fabric; collaborate with a team of passionate data professionals; work on enterprise-grade, large-scale data projects; experience a fast-paced, learning-focused work environment; and have immediate visibility and impact on key business decisions.

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Technical Lead in Data Engineering specializing in Python and Databricks, you will play a crucial role in leading the design, development, and support of data pipelines and analytics solutions. Your responsibilities will include architecting scalable and secure cloud-based data platforms, collaborating with cross-functional teams to understand requirements, and mentoring a team of engineers to foster technical excellence and continuous improvement.

You will be expected to ensure best practices in coding, testing, and deployment, manage project timelines and stakeholder expectations, and integrate and optimize data workflows with Databricks and other platforms. Additionally, you will conduct code reviews, provide technical leadership in sprint planning and retrospectives, and maintain a strong focus on delivering high-quality solutions.

To excel in this role, you should possess at least 7 years of experience in software/data engineering, with strong proficiency in Python for data processing and automation. Hands-on experience with Databricks, including Spark, Delta Lake, and notebooks, is essential. A deep understanding of AWS services such as S3, Lambda, Glue, EC2, and IAM is also required. Experience with CI/CD pipelines, version control (Git), and infrastructure as code is crucial, along with a background of working in an Agile/Scrum environment. Proven experience in leading technical teams, excellent communication skills, and stakeholder management are key aspects of this role.

While not mandatory, it would be beneficial to have an AWS certification (e.g., Solutions Architect, Data Analytics), familiarity with Snowflake for data warehousing and analytics, and knowledge of data governance, security, and compliance. Your expertise in Python, AWS, Databricks, and CI/CD pipelines will be instrumental in driving the success of data engineering projects.
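To make the AWS services named above (S3, Lambda, SQS-style queuing) concrete, here is a hedged sketch of a Lambda handler that reacts to new S3 objects and queues them for downstream processing. The queue URL and event wiring are invented for illustration, not taken from the posting.

```python
# Hedged sketch of an AWS Lambda handler for S3 "object created" events.
# The queue URL is an invented placeholder.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/ingest"  # placeholder

def handler(event, context):
    # S3 notification events carry one or more Records describing new objects.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        head = s3.head_object(Bucket=bucket, Key=key)  # verify the object exists
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key,
                                    "bytes": head["ContentLength"]}))
    return {"processed": len(event.get("Records", []))}
```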

Posted 2 days ago

Apply

3.0 years

0 Lacs

India

Remote

Remote Role

Job Summary: We are looking for a Business Intelligence (BI) Developer with a strong background in building insightful, scalable, and user-friendly dashboards and reports. While ThoughtSpot experience is preferred, we welcome candidates with solid expertise in Tableau or Power BI who are fast learners and have a proven ability to create complex, comprehensive reporting solutions.

Key Responsibilities:
- Design, develop, and maintain interactive dashboards and reports using ThoughtSpot, Tableau, or Power BI.
- Collaborate with business stakeholders to understand reporting needs and translate them into effective data visualizations.
- Optimize data models and queries to ensure performance and scalability.
- Work closely with data engineering teams to ensure data integrity and availability.
- Support self-service analytics by enabling business users through training and documentation.
- Continuously improve reporting standards, templates, and best practices.

Required Qualifications:
- 3+ years of experience in BI/reporting tools such as ThoughtSpot, Tableau, or Power BI.
- Proficiency in SQL and understanding of data modeling concepts.
- Demonstrated ability to build complex dashboards and reports that drive business insights.
- Strong analytical and problem-solving skills.
- Excellent communication skills and ability to work cross-functionally.

Preferred Qualifications:
- Hands-on experience with ThoughtSpot (e.g., Liveboards, SpotIQ, Worksheets).
- Familiarity with cloud data platforms (e.g., Databricks, Snowflake).
- Experience working in Agile or fast-paced environments.
- Exposure to scripting or automation tools (e.g., Python, dbt) is a plus.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a leading financial services and healthcare technology company based on revenue, SS&C is headquartered in Windsor, Connecticut, and has 27,000+ employees in 35 countries. Some 20,000 financial services and healthcare organizations, from the world's largest companies to small and mid-market firms, rely on SS&C for expertise, scale, and technology.

We are hiring for the position of Quant Developer at Associate Manager/Manager level for SS&C GlobeOp Financial Services, with office locations in Mumbai, Hyderabad, Pune, and Gurgaon. The ideal candidate should have experience in agile/scrum project management along with strong proficiency in Python, SQL, and KDB. Additional experience in Databricks, Fabric, PySpark/TensorFlow, C/C++, and other data management tools such as Arctic, MongoDB, and dashboarding will be considered a plus. Hedge fund experience is also desirable for this role.

Interested candidates are encouraged to apply directly to SS&C Technologies, Inc. or its affiliated companies. Please note that unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services will not be accepted unless explicitly requested or approached by the company.

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in BI and Analytics (Alteryx, SQL, Tableau), and you have found the right team. As a Business Intelligence Developer Associate within our Asset and Wealth Management Finance Transformation and Analytics team, you will be tasked with defining, refining, and achieving set objectives for our firm on a daily basis.

You will be responsible for designing the technical and information architecture for the MIS (DataMarts) and Reporting Environments. Additionally, you will support the MIS team in query optimization and deployment of BI technologies, including but not limited to Alteryx, Tableau, MS SQL Server (T-SQL programming), SSIS, and SSRS. You will scope, prioritize, and coordinate activities with the product owners, design and develop complex queries for data inputs, and work on agile improvements by sharing experiences and knowledge with the team. Furthermore, you will advocate and steer the team to implement a CI/CD (DevOps) workflow, and design and develop complex dashboards from large and/or disparate data sets. The ideal candidate for this position will be highly skilled in reporting methodologies and data manipulation and analytics tools, and will have expertise in the visualization and presentation of enterprise data.

Required qualifications, capabilities, and skills include a Bachelor's Degree in MIS, Computer Science, or Engineering; a different field of study with significant professional experience in BI development is also acceptable. Strong DW-BI skills are required, with a minimum of 7 years of experience in data warehousing and visualization. You should have strong work experience in data wrangling tools like Alteryx and working proficiency in data visualization tools, including but not limited to Alteryx, Tableau, and MS SQL Server (SSIS, SSRS). Working knowledge of querying data from databases such as MS SQL Server, Snowflake, Databricks, etc., is essential. You must have strong knowledge of designing database architecture and building scalable visualization solutions, and the ability to write complicated yet efficient SQL queries and stored procedures. Experience in building end-to-end ETL processes, working with multiple data sources, handling large volumes of data, and converting data into information is required, as is experience in the end-to-end implementation of Business Intelligence (BI) reports and dashboards, along with good communication and analytical skills.

Preferred qualifications, capabilities, and skills include exposure to data science and allied technologies like Python, R, etc.; exposure to automation tools like UiPath, Blue Prism, Power Automate, etc.; working knowledge of CI/CD workflows and automated deployment; and experience with scheduling tools like Control-M.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Business Intelligence Specialist at Adobe, you will have the opportunity to work closely with business analysts to understand design specifications and translate requirements into technical models, dashboards, reports, and applications. Your role will involve collaborating with business users to cater to their ad-hoc requests and deliver scalable solutions on MSBI platforms. You will be responsible for system integration of data sources, creating technical documents, and ensuring data and code quality through standard methodologies and processes.

To succeed in this role, you should have at least 3 years of experience in SSIS, SSAS, data warehousing, data analysis, and business intelligence. You should also possess advanced proficiency in data warehousing tools and technologies, including databases, SSIS, and SSAS, along with an in-depth understanding of data warehousing principles and dimensional modeling techniques. Hands-on experience in ETL processes, database optimization, and query tuning is essential. Familiarity with cloud platforms such as Azure and AWS, as well as Python or PySpark and Databricks, would be beneficial. Experience in creating interactive dashboards using Power BI is an added advantage. In addition to technical skills, strong problem-solving and analytical abilities, quick learning capabilities, and excellent communication and presentation skills are important for this role. A Bachelor's degree in Computer Science, Information Technology, or an equivalent technical discipline is required.

At Adobe, we value a free and open marketplace for all employees and provide internal growth opportunities for your career development. We encourage creativity, curiosity, and continuous learning as part of your career journey. To prepare for internal opportunities, update your Resume/CV and Workday profile, explore the Internal Mobility page on Inside Adobe, and check out tips to help you prepare for interviews. The Talent Team will reach out to you within 2 weeks of applying for a role via Workday, and if you move forward in the interview process, inform your manager so they can support your career growth.

Join Adobe to work in an exceptional environment with colleagues committed to helping each other grow through ongoing feedback. If you are looking to make an impact and grow your career, Adobe is the place for you. Discover more about employee experiences on the Adobe Life blog and explore the meaningful benefits we offer. For any accommodation needs during the application process, please contact accommodations@adobe.com.

Posted 2 days ago

Apply

6.0 - 10.0 years

0 Lacs

Delhi

On-site

You will be responsible for leading and mentoring a team of data engineers to ensure high-quality delivery across various projects. Your role will involve designing, building, and optimizing large-scale data pipelines and integration workflows using Azure Data Factory (ADF) and Synapse Analytics. Additionally, you will be tasked with architecting and implementing scalable data solutions on the Azure cloud, leveraging tools such as Databricks and Microsoft Fabric. Writing efficient and maintainable code using PySpark and SQL for data transformations will be a key part of your responsibilities.

Collaboration with data architects, analysts, and business stakeholders to define data strategies and requirements is crucial. You will also be expected to implement and promote Data Mesh principles within the organization, provide architectural guidance, and offer solutions for new and existing data projects on Azure. Ensuring data quality, governance, and security best practices are followed, and staying updated with evolving Azure services and data technologies, are essential aspects of the role.

In terms of required skills and experience, you should possess at least 6 years of professional experience in data engineering and solution architecture. Expertise in Azure Data Factory (ADF) and Azure Synapse Analytics is necessary. Strong hands-on experience with Databricks, PySpark, and advanced SQL is also expected. A good understanding of Microsoft Fabric and its use cases, along with deep knowledge of Azure cloud services related to data storage, processing, and integration, will be beneficial. Familiarity with Data Mesh architecture and distributed data product ownership is desirable. Strong problem-solving and debugging skills, as well as excellent communication and stakeholder management abilities, are essential for this role.

It would be advantageous to have experience with CI/CD pipelines for data solutions, knowledge of data security and compliance practices on Azure, and a certification in Azure Data Engineering or Solution Architecture.

Posted 2 days ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud, and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond, collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Portfolio Data Analyst at Addepar, you will play a crucial role in integrating client portfolio data into our leading portfolio management products. Your responsibilities will include analyzing and onboarding portfolio data from various sources, partnering with internal teams to deliver high-quality data solutions, and working with third-party data providers to support data integration into our platform. You will collaborate with engineering, product management, and data operations to ensure timely and reliable data solutions that meet client needs.

Your role will involve developing data aggregations and functionality based on user workflow needs, as well as working on initiatives to improve overall data management and integration. You will also contribute to the evolution of Addepar's financial concordance solutions to better serve clients in wealth management and beyond.

To excel in this role, you should have a minimum of 3 years of experience working with financial data and concepts relevant to Addepar's clients and products, particularly in wealth management and portfolio management. Technical skills in tools like Excel, SQL, Python, PySpark, Databricks, or other financial services systems are preferred. Strong communication and interpersonal skills are essential for working effectively with vendors, clients, and internal partners.

This position requires you to work from Addepar's Pune office three days a week as part of a hybrid work model. Addepar values a diverse and inclusive workplace, where individuals from different backgrounds and identities come together to drive innovative solutions. As an equal opportunity employer, Addepar is committed to promoting a welcoming environment where inclusion and belonging are shared responsibilities. If you are passionate about finance and technology, enjoy solving complex problems in investment management, and have experience in data analysis workflows and tools, this role offers you the opportunity to build on your existing expertise in the investment domain.

Posted 2 days ago

Apply

4.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

At PwC, the focus in risk and compliance is on maintaining regulatory compliance and managing risks for clients by providing advice and solutions. The goal is to help organizations navigate complex regulatory landscapes and enhance internal controls to mitigate risks effectively. As part of the enterprise risk management team at PwC, you will be responsible for identifying and mitigating potential risks that could impact an organization's operations and objectives. Your role will involve developing business strategies to effectively manage and navigate risks in a rapidly changing business environment.

Your primary focus will be on building meaningful client connections, learning how to manage and inspire others, and growing your personal brand. You will navigate complex situations, deepen your technical expertise, and become more aware of your strengths. Anticipating the needs of your teams and clients, delivering quality, and embracing ambiguity are key aspects of this role. You should be comfortable when the path forward isn't clear, ask questions, and view such moments as opportunities for growth.

To lead and deliver value effectively at this level, you should possess a range of skills, knowledge, and experiences, including but not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others.
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
- Employing critical thinking to break down complex concepts.
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy.
- Developing a deeper understanding of the business context and its evolving nature.
- Using reflection to enhance self-awareness, strengths, and development areas.
- Interpreting data to derive insights and recommendations.
- Upholding professional and technical standards, the Firm's code of conduct, and independence requirements.

As a Senior Associate at PwC Acceleration Centers (ACs), you will play a pivotal role in supporting various services, from Advisory to Assurance, Tax, and Business Services. Engaging in challenging projects and providing distinctive services to support client engagements will be part of your responsibilities. You will also participate in dynamic and digitally enabled training to enhance your technical and professional skills.

In the OFRO - QA team, you will be responsible for maintaining the quality and accuracy of dashboards and data workflows through meticulous testing and validation. Leveraging your knowledge in data analysis and automation testing, you will mentor others, navigate complex testing environments, and uphold quality standards throughout the software development lifecycle. This role offers an exciting opportunity to work with advanced BI tools and contribute to continuous improvement initiatives in a dynamic team setting.

Key Responsibilities:

ETL Development & Data Engineering
- Design, build, and maintain scalable ETL pipelines using Azure Data Factory, Databricks, and custom Python scripts.
- Integrate and ingest data from on-prem, cloud, and third-party APIs into modern data platforms.
- Perform data cleansing, validation, and transformation to ensure data quality and consistency (a reusable cleansing sketch follows this listing).
- Machine learning experience is desirable.

Programming and Scripting
- Write robust and reusable Python scripts for data processing, automation, and orchestration.
- Develop complex SQL queries for data extraction, transformation, and reporting.
- Optimize code for performance, scalability, and maintainability.

Cloud & Platform Integration
- Work within Azure ecosystems, including Blob Storage, SQL Database, ADF, Synapse, and Key Vault.
- Utilize Databricks (PySpark/Delta Lake) for advanced transformations and big data processing.
- Power BI hands-on experience is a plus.

Collaboration and Communication
- Collaborate closely with cross-functional teams to ensure quality throughout the software development lifecycle.
- Provide regular status updates and test results to stakeholders.
- Participate in daily stand-ups, sprint planning, and Agile ceremonies.

Shift time: 2 pm to 11 pm IST
Total experience required: 4-9 years
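As a concrete companion to the cleansing bullet above, the following is a minimal, reusable PySpark cleansing step. The column names and rules are assumptions for illustration, not part of the posting.

```python
# A small, reusable cleansing step of the kind the listing describes.
# Column names, casts, and the input path are illustrative assumptions.
from pyspark.sql import DataFrame, SparkSession, functions as F

def cleanse_transactions(df: DataFrame) -> DataFrame:
    """Trim identifiers, normalise types, drop incomplete rows and duplicates."""
    return (df
        .withColumn("customer_id", F.trim("customer_id"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .na.drop(subset=["customer_id", "amount"])
        .dropDuplicates())

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    raw = spark.read.option("header", True).csv("/mnt/raw/transactions.csv")  # placeholder path
    cleanse_transactions(raw).write.mode("overwrite").saveAsTable("silver.transactions")
```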

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Content and Data Analytics team is an integral part of Global Operations at Elsevier, within the DataOps division. The team primarily provides data analysis services using Databricks, catering to product owners and data scientists of Elsevier's Research Data Platform. Your work in this team will directly contribute to the development of cutting-edge data analytics products for the scientific research sector, including renowned products like Scopus and SciVal.

As a Data Analyst II, you are expected to possess a foundational understanding of best practices and project execution, with supervision from senior team members. Your responsibilities will include generating basic insights and recommendations within your area of expertise, supporting analytics team members, and gradually taking the lead on low-complexity analytics projects. Your role will be situated within DataOps, supporting data scientists working within the Domains of the Research Data Platform. The Domains are functional units responsible for delivering various data products through data science algorithms, presenting you with a diverse range of analytical activities. Tasks may involve delving into extensive datasets to address queries, conducting large-scale data preparation, evaluating data science algorithm metrics, and more.

To excel in this role, you must possess a sharp eye for detail, strong analytical skills, and proficiency in at least one data analysis system. Curiosity, dedication to quality work, and an interest in the scientific research realm and Elsevier's products are essential. Effective communication with stakeholders worldwide is crucial, hence a high level of English proficiency is required.

Requirements for this position include a minimum of 3 years of work experience, coding proficiency in a programming language (preferably Python) and SQL, familiarity with string manipulation functions like regex, prior exposure to data analysis tools such as Pandas or Apache Spark/Databricks, knowledge of basic statistics relevant to data science, and familiarity with visualization tools like Tableau/Power BI. Furthermore, experience with Agile tools like JIRA is advantageous. Stakeholder management skills are crucial, involving building strong relationships with Data Scientists and Product Managers, aligning activities with their goals, and presenting achievements and project updates effectively. In addition to technical competencies, soft skills like effective collaboration, proactive problem-solving, and a drive for results are highly valued. Key results for this role include understanding task requirements, data gathering and refinement, interpretation of large datasets, reporting findings through effective storytelling, formulating recommendations, and identifying new opportunities.

Elsevier promotes a healthy work-life balance with various well-being initiatives, shared parental leave, study assistance, and sabbaticals. The company offers comprehensive health insurance, flexible working arrangements, employee assistance programs, and modern family benefits to support employees' holistic well-being. As a global leader in information and analytics, Elsevier plays a pivotal role in advancing science and healthcare outcomes. Your work with the company contributes to addressing global challenges and fostering a sustainable future through innovative technologies and impactful partnerships.

Elsevier is committed to a fair and accessible hiring process. If you require accommodations or adjustments due to a disability or other needs, please notify the company. Furthermore, be cautious of potential scams during your job search and familiarize yourself with the Candidate Privacy Policy for a secure application process. For US job seekers, it's important to know your rights regarding Equal Employment Opportunity laws.

Posted 2 days ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
· Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
· Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
· Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
· Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
· Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
· Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
· Work with other members of the project team to support delivery of additional project components (API interfaces).
· Evaluate the performance and applicability of multiple tools against customer requirements.
· Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
· Integrate Databricks with other technologies (ingestion tools, visualization tools); a brief API-ingestion sketch follows this listing.

Requirements:
· Proven experience working as a data engineer.
· Highly proficient in using the Spark framework (Python and/or Scala).
· Extensive knowledge of data warehousing concepts, strategies, and methodologies.
· Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
· Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
· Experience in designing and hands-on development of cloud-based analytics solutions.
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
· Designing and building of data pipelines using API ingestion and streaming ingestion methods.
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
· Thorough understanding of Azure Cloud Infrastructure offerings.
· Strong experience in common data warehouse modelling principles, including Kimball.
· Working knowledge of Python is desirable.
· Experience developing security models.
· Databricks & Azure Big Data Architecture Certification would be a plus.
· Must be team oriented with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 7-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
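The "API ingestion" pattern named in the responsibilities can be sketched briefly. The endpoint, pagination scheme, and target table below are invented placeholders, not details from the posting.

```python
# Hedged sketch of API ingestion into Delta on Databricks.
# The endpoint and pagination scheme are invented for illustration.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
BASE = "https://api.example.com/v1/orders"   # hypothetical endpoint

records, page = [], 1
while True:
    resp = requests.get(BASE, params={"page": page}, timeout=30)
    resp.raise_for_status()
    batch = resp.json()                      # assume a JSON list per page
    if not batch:
        break                                # empty page signals the end
    records.extend(batch)
    page += 1

# Land the raw payload in a bronze table for downstream transformation.
spark.createDataFrame(records).write.format("delta") \
    .mode("append").saveAsTable("bronze.orders")
```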

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Data Modeler specializing in hybrid data environments, you will play a crucial role in designing, developing, and optimizing data models that facilitate enterprise-level analytics, insights generation, and operational reporting. You will collaborate with business analysts and stakeholders to comprehend business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models.

Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity, and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and an understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role.

In summary, as a Data Modeler for hybrid data environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs, leveraging new data modeling techniques and cloud capabilities.
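Because the role centres on logical and physical models, an illustrative star-schema DDL pair (one dimension, one fact) may be useful. The names, grain, and partitioning below are assumptions, expressed as Spark SQL on Databricks.

```python
# Illustrative star-schema DDL of the kind a data modeler would own.
# Table names, columns, and grain are invented for the example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Type 2-style dimension: one row per version of an account.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_account (
        account_sk     BIGINT,
        account_id     STRING,
        account_type   STRING,
        effective_from DATE,
        effective_to   DATE,
        is_current     BOOLEAN
    ) USING DELTA
""")

# Daily-balance fact at (date, account) grain, partitioned by date key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_balance_daily (
        date_sk    INT,
        account_sk BIGINT,
        balance    DECIMAL(18,2)
    ) USING DELTA
    PARTITIONED BY (date_sk)
""")
```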

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Are you intellectually curious and passionate about promoting solutions across organizational boundaries? Join the Consumer & Community Banking (CCB) Stress Testing Transformation team for a dynamic opportunity to design and build creative solutions for the future of stress testing and annual CCAR exercises.

As a Senior Associate in the Stress Testing Transformation Solution team, you will be a strategic thinker who is passionate about designing and building creative solutions for the future of Stress Testing. You will spend your time solving complex problems, demonstrating strategic thought leadership, and designing the way our stakeholders operate. By leveraging a deep understanding of CCB Stress Testing processes and extensive Finance domain knowledge, you will build scalable solutions that optimize process efficiencies, use data assets effectively, and advance platform capabilities.

Responsibilities:
- Collaborate with cross-functional teams to lead the design and implementation of end-to-end solutions for Stress Testing, addressing business problems with various technical solutions.
- Provide expertise in process re-engineering and guidance based on the roadmap for large-scale Stress Testing transformation initiatives.
- Assess, challenge, and provide solutions for Stress Testing processes, focusing on data sources, with the ability to influence and drive the roadmap.
- Evaluate, recommend, and develop solutions and architecture, including integration with APIs, Python, AI/ML technology, and other enterprise applications.
- Leverage data and best-in-class tools to improve processes and controls, enable cross-business applications, and embrace a consistent framework.
- Simplify complex issues into manageable steps and achievements.
- Eliminate manual reporting, reengineer processes, and increase the ability to generate insights faster through an integrated data and platform approach.

Required Qualifications:
- Bachelor's degree in engineering or a related field.
- Experience with business intelligence, analytics, and data wrangling tools such as Alteryx, SAS, or Python.
- Experience with relational databases, optimizing SQL to extract and summarize large datasets, report creation, and ad-hoc analyses.
- Experience with Hive, Spark SQL, Impala, or other big-data query tools.
- Ability to understand the underlying business context beyond raw data and identify business opportunities hidden in data.
- Collaborative skills to work with global teams in a fast-paced, results-driven environment.
- Strong problem-solving and analytical skills with a transformation mindset.

Preferred Qualifications:
- Experience with Databricks, SQL, Python, or other data platforms.
- 8+ years of experience in Analytics Solutions and Data Analytics, preferably related to the financial services domain.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Are you intellectually curious and passionate about promoting solutions across organizational boundaries? Join the Consumer & Community Banking (CCB) Stress Testing Transformation team for an exciting opportunity to design and implement creative solutions for the future of stress testing and annual CCAR exercises.

As an Associate in the Stress Testing Transformation Solution team, you will be a strategic thinker dedicated to crafting innovative solutions for the future of Stress Testing. You will be engaged in solving complex problems, showcasing strategic thought leadership, and reshaping the operational processes of our stakeholders. By leveraging a profound understanding of CCB Stress Testing processes and extensive knowledge in the Finance domain, you will develop scalable solutions that enhance process efficiencies, optimize data asset utilization, and enhance platform capabilities.

Your responsibilities will include collaborating with cross-functional teams to lead the design and implementation of end-to-end solutions for Stress Testing, identifying and addressing business challenges with various technical solutions. You will contribute your expertise in process re-engineering and offer guidance for significant Stress Testing transformation initiatives. Furthermore, you will assess, challenge, and propose solutions for the end-to-end Stress Testing process, focusing on data sources and influencing the roadmap. Proactively learning new technologies to evaluate and recommend solutions and architectures, including integration with APIs, Python, and AI/ML technologies with other enterprise applications, will be a key aspect of your role.

Additionally, you will leverage your business knowledge and expertise in CCAR, Stress Testing, and forecasting to drive process transformation. By simplifying complex issues into manageable steps or achievements, you will play a pivotal role in the team's success. Ensuring strong controls, in close collaboration with internal functions and in compliance with company policies, will be essential.

**Required qualifications, capabilities, and skills:**
- Bachelor's degree in finance or a related field, and/or CA/CFA/MBA/PGDBM from a top-tier institute
- 8+ years of experience in Analytics Solutions, Data Analytics, or Planning & Analysis roles
- Profound knowledge of financial planning, forecasting, stress testing/CCAR, and the data sources used in these processes
- Experience with Databricks, SQL, Python, or other data platforms
- Proficiency in modeling data and utilizing data transformation tools on large datasets
- Ability to collaborate with global teams and excel in a fast-paced, results-driven environment
- A transformation mindset with strong strategic thinking, problem-solving, and analytical skills

**Preferred qualifications, capabilities, and skills:**
- Program management experience, including transformation, problem-solving, and analytical skills

Posted 2 days ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Forma.ai
Forma.ai is a Series B startup that's revolutionizing how sales compensation is designed, managed, and optimized. We handle billions in annual managed commissions for market leaders like Edmentum, Stryker, and Autodesk. Our growth has been fuelled by our passion for fundamentally changing and shaping how companies use sales intelligence to drive business strategy. We're welcoming equally driven individuals who are excited about creating something big!

About The Team
The Customer Operations team is at the heart of Forma.ai's mission. This team has a direct impact on the growth of Forma.ai. They are results-driven and solutions-minded. The Customer Operations team works closely with our customers, helping them to understand and take advantage of all the features Forma.ai offers and ensuring that they get the most value from the platform.

What You'll Be Doing

Reporting & Dashboarding
- Design, maintain, and enhance dashboards in BI tools (e.g., Looker Studio, Salesforce/HubSpot reports) to monitor marketing campaign performance, sales pipeline health, lead flow, and conversion metrics
- Automate recurring reports and implement self-serve analytics capabilities for GTM teams

Data Analysis & Insights
- Analyze funnel performance from top-of-funnel marketing campaigns to bottom-of-funnel sales outcomes
- Provide regular insights into key KPIs like campaign ROI, customer acquisition cost (CAC), and attribution across channels
- Support A/B testing initiatives, sales activity analysis, and segmentation strategies

Scripting & Automation
- Write Python and SQL scripts to extract, clean, and structure data from external sources (e.g., job boards, press releases, M&A feeds, web scraping APIs); a hedged enrichment sketch follows this listing
- Build automated enrichment pipelines to augment CRM and marketing data with third-party insights (e.g., firmographics, hiring activity, technology stack, funding events)

Data Quality & Tooling
- Take ownership of data cleanliness and integrity across GTM systems (e.g., Salesforce, HubSpot, Databricks), including investigating issues, proposing solutions, and manually resolving historical data problems when necessary
- Maintain and improve key GTM logic, including lead scoring models, lifecycle stage transitions, attribution frameworks, and related automation rules, ensuring they are well-defined, consistent, and actionable

What We're Looking For
- Background in Engineering, Commerce, Mathematics, and/or Statistics
- Natural curiosity about AI and emerging technologies, especially where they intersect with automation, data, and workflow orchestration across the GTM stack
- Familiarity with B2B go-to-market motions and how to measure their effectiveness
- Experience in report building, data analysis, and workflow automation within a GTM tech stack (e.g., Salesforce, HubSpot, Gong, BI tools)
- Proficiency in SQL or Python and familiarity with at least one BI tool (e.g., Power BI, Looker, Tableau)
- 3-5 years of related experience
- High achiever with a strong sense of ownership
- Ability to take ownership and run tasks in a fast-paced and evolving environment

Our Values
Work well, together. We're real. We have kids and pets. Mortgages and student loans. We're in this together, so no matter how brilliant any one of us is, we always play nice with one another, no exceptions.
Be precise. Be relentless. We believe complacency breeds failure, so we set new goals as quickly as we achieve them. We persist in the face of adversity, learn from our mistakes, and push each other to continuously improve. The status quo is kryptonite.
Love our tech. Love our customers. Our platform solves a very complex problem in a currently underserved market. While everyone at Forma isn't customer-facing, we're all customer-focused. Maybe even slightly customer-obsessed.

Our Commitment To You
We know that applying to a new role takes a lot of effort. You're encouraged to apply even if your experience doesn't precisely match the job description. There are many paths to a successful career and we're looking forward to reading yours. We thank all applicants for their interest.
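The scripting-and-automation duties above suggest a small enrichment sketch: pulling firmographics from a hypothetical REST endpoint and joining them onto CRM accounts. The URL, token, and file names are all invented for illustration.

```python
# Hedged sketch of a GTM data-enrichment step; the API endpoint, fields,
# token, and file names are hypothetical placeholders.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/firmographics"  # hypothetical endpoint

def fetch_firmographics(domains: list[str], token: str) -> pd.DataFrame:
    """Fetch one firmographic record per company domain."""
    rows = []
    for domain in domains:
        resp = requests.get(API_URL, params={"domain": domain},
                            headers={"Authorization": f"Bearer {token}"},
                            timeout=10)
        resp.raise_for_status()
        rows.append(resp.json())
    return pd.DataFrame(rows)

accounts = pd.read_csv("crm_accounts.csv")            # placeholder CRM extract
enriched = accounts.merge(
    fetch_firmographics(accounts["domain"].tolist(), "TOKEN"),  # placeholder token
    on="domain", how="left")
enriched.to_csv("crm_accounts_enriched.csv", index=False)
```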

Posted 2 days ago

Apply

1.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site

Join Amgen's Mission of Serving Patients

At Amgen, you will be part of something bigger, driven by our shared mission to serve patients living with serious illnesses. Since 1980, Amgen has been a pioneer in the world of biotech, focusing on four therapeutic areas: Oncology, Inflammation, General Medicine, and Rare Disease, reaching millions of patients annually. As a member of the Amgen team, you will contribute to researching, manufacturing, and delivering innovative medicines that help people live longer, fuller, and happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you are passionate about challenges and the opportunities they present, you will thrive as part of the Amgen team. Join us to make a lasting impact on patients' lives and transform your career.

Data Scientist (US Value & Access Insights)

In this vital role at Amgen, you will develop and deploy advanced machine learning, operational research, semantic analysis, and statistical methods to uncover structure in large data sets. Your responsibilities will include creating analytics solutions to address customer needs and opportunities.

Roles & Responsibilities:
- Ensuring that models are trained with the latest data and meet SLA expectations
- Acting as a subject matter expert in solving development and commercial questions
- Collaborating with a global cross-functional team on the AI tools roadmap
- Working in technical teams on the development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics
- Utilizing technical skills such as hypothesis testing, machine learning, and retrieval processes for statistical and data mining techniques
- Performing exploratory and targeted data analyses using descriptive statistics and other methods
- Collaborating with technical teams to translate business needs into technical specifications, focusing on AI-driven automation and insights
- Developing and integrating custom applications, intelligent dashboards, and automated workflows incorporating AI capabilities to enhance decision-making and efficiency

What We Expect Of You

Basic Qualifications:
- Master's degree in computer science, statistics, or a STEM major with at least 1 year of Information Systems experience; or
- Bachelor's degree in computer science, statistics, or a STEM major with at least 2 years of Information Systems experience; or
- Diploma in computer science, statistics, or a STEM major with 7 to 9 years of experience, including at least 2 years of Information Systems experience
- Experience with analytic software tools or languages like R and Python
- Strong foundation in machine learning algorithms and techniques
- Experience in statistical techniques, hypothesis testing, regression analysis, clustering, and classification

Preferred Qualifications:
- Experience in MLOps practices and tools (e.g., MLflow, Kubeflow, Airflow)
- Proficiency in Python and relevant ML libraries (e.g., TensorFlow, PyTorch, Scikit-learn)
- Outstanding analytical and problem-solving skills, ability to learn quickly, excellent communication and interpersonal skills
- Experience with data engineering, pipeline development, NLP techniques for text analysis, sentiment analysis, time-series data analysis, forecasting, trend analysis, AWS, Azure, Google Cloud, the Databricks platform for data analytics, and MLOps

Professional Certifications:
- Any AWS Developer certification (preferred)
- Any Python and ML certification (preferred)

Soft Skills:
- Initiative to explore alternate technology and approaches to problem-solving
- Skilled in breaking down problems, documenting problem statements, and estimating efforts
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

What You Can Expect Of Us

At Amgen, we support your professional and personal growth and well-being as we develop treatments to take care of others. We offer competitive benefits and a collaborative culture to support your journey every step of the way. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us at careers.amgen.com.

As an organization dedicated to improving the quality of life globally, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished individuals who respect each other and live by Amgen values to advance science in serving patients. We ensure individuals with disabilities are provided reasonable accommodation throughout the job application process. Contact us to request accommodation.
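For the classification and model-training skills listed, a minimal scikit-learn pipeline sketch follows. The synthetic dataset stands in for real commercial data, which the posting does not describe.

```python
# Minimal supervised-classification sketch with scikit-learn.
# The synthetic dataset is a stand-in for real data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = Pipeline([
    ("scale", StandardScaler()),                                  # normalise features
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))      # held-out metrics
```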

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

As an Angular Full-Stack Developer with 4 to 6 years of experience, based in Mumbai and working in a hybrid mode, your daily responsibilities will involve collaborating with fellow developers, business analysts, and Product Owners. You will be engaged in researching, designing, implementing, testing, and assessing both new and existing software solutions. Additionally, you will be responsible for the maintenance and enhancement of current systems, development of monitoring tools, and documenting application processes for future reference. Exploring new technologies and contributing to the development and upkeep of automated deployment infrastructure will also be part of your role. Attending internal and external training sessions to enhance your technical knowledge and skills is vital for your professional growth.

To excel in this position, you should have a solid understanding of, and at least 2 years of practical experience in, middleware development using Java and databases. Prior exposure to scripting/Python will be advantageous. Familiarity with UI technologies like Angular and analytics tools such as Power BI will be beneficial. Proficiency in generic testing frameworks, adherence to good programming practices, and a preference for clean, maintainable, well-documented, and reusable code are essential. You should also be well-versed in CI/CD practices and possess a keen interest in ensuring rapid and secure code deployment through a stable, thoroughly tested, and risk-aware approach. A curious mindset and openness to acquiring knowledge of new languages and technologies are key attributes for success in this role.

The ideal candidate will have hands-on experience in Java, Angular, and Databricks, along with familiarity with Spark for managing large-scale data. Proficiency in writing and handling database queries using SQL, using Git for version control, and writing tests with appropriate testing tools is necessary. Experience with CI/CD pipelines for efficient and safe code deployment, comfort in Agile/Scrum environments, and the ability to produce clean, maintainable code are important qualities. A passion for learning new technologies and enhancing personal skills will set you apart as a valuable team member.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Technical Manager, you will lead and manage a team of software engineers to ensure high performance and quality delivery. Your responsibilities will include designing, developing, and maintaining scalable and robust Python applications. You will architect and implement cloud solutions on AWS and Azure, adhering to best practices in security, scalability, and cost-efficiency. Collaborating with cross-functional teams, you will define, design, and ship new features while mentoring and guiding team members in their technical and professional growth.

In this role, you will implement DevOps practices to streamline CI/CD pipelines and automate deployment processes. You will develop and maintain APIs using FastAPI and GraphQL, ensuring that the team follows best practices in coding, testing, and documentation. You will also oversee database design, optimization, and maintenance, and drive productivity within the team by implementing efficient workflows and leveraging code-assist tools.

To be successful in this position, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as a Technical Manager or in a similar role leading large teams is required. Strong proficiency in Python programming, extensive experience with AWS or Azure cloud services, and a solid understanding of DevOps practices and tools are essential, as is experience with FastAPI and GraphQL, database systems (SQL and NoSQL), design patterns, and microservices architecture. Additionally, familiarity with MLOps for deploying and managing machine learning models, LLMOps for large language model operations, and Databricks for big data processing and analytics is desired, along with excellent problem-solving skills and attention to detail. Strong communication and leadership skills, and the ability to drive productivity and enhance team efficiency using code-assist tools, will be valuable assets in this role.
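To make the FastAPI-plus-GraphQL pairing concrete, here is a minimal sketch using the Strawberry library, one common way to serve GraphQL from FastAPI; the Book type and sample data are illustrative assumptions, not part of the posting.

```python
# A minimal FastAPI app exposing both a REST endpoint and a GraphQL schema.
import strawberry
from strawberry.fastapi import GraphQLRouter
from fastapi import FastAPI

@strawberry.type
class Book:                       # illustrative type, not from the posting
    title: str
    author: str

@strawberry.type
class Query:
    @strawberry.field
    def books(self) -> list[Book]:
        # Hard-coded sample data; a real service would query a database.
        return [Book(title="1984", author="George Orwell")]

app = FastAPI()
app.include_router(GraphQLRouter(strawberry.Schema(query=Query)),
                   prefix="/graphql")

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}
```

Run with `uvicorn main:app`; the GraphQL endpoint (with its interactive explorer) is then served at /graphql.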

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Company Description

NielsenIQ is a consumer intelligence company that delivers the Full View™, the world's most complete and clear understanding of consumer buying behavior, revealing new pathways to growth. Since 1923, NIQ has moved measurement forward for industries and economies across the globe. We are putting the brightest and most dedicated minds together to accelerate progress. Our diversity brings out the best in each other so we can leave a lasting legacy on the work that we do and the people that we do it with. NielsenIQ offers a range of products and services that leverage Machine Learning and Artificial Intelligence to provide insights into consumer behavior and market trends. This position opens the opportunity to apply the latest state of the art in AI/ML and data science to global and key strategic projects.

Job Description

NielsenIQ's Innovation Team is growing our AI capabilities and is now looking to hire an AI/ML Data Scientist in India (Pune) for the Core Models team, a multidisciplinary team of researchers working on different areas of AI such as recommender systems, extreme classifiers, and Large Language Models (LLMs), among others. As part of this team, you will stay up to date with the latest research in AI (with a special focus on NLP, but also on Computer Vision and other AI-related fields), apply current state-of-the-art algorithms to challenging, large-scale real-world problems, and propose novel ideas. Your main focus will be creating high-quality datasets for training and fine-tuning the company's custom models, LLMs, and recommender systems, and training those models to analyze the impact of different versions of the data on model performance. The selected candidate will be responsible for designing and implementing scalable data pipelines and strategies to support all stages of the R&D process, e.g., fine-tuning or alignment through reinforcement learning. The results of this work will be critical to ensuring the robustness, safety, and alignment of our AI models. You will also have the opportunity to produce scientific content such as patents or conference/journal papers.

Job Responsibilities: Investigate, develop, and apply data pipelines with minimal technical supervision, always ensuring a combination of simplicity, scalability, reproducibility, and maintainability within the ML solutions and source code. Train deep learning models (Transformer models) and analyze the impact of different versions of the data. Perform feasibility studies and analyze data to determine the most appropriate solution. Drive innovation and proactively contribute to our work on custom Large Language Models. Communicate results to technical and non-technical audiences. Work as a member of a team, encouraging team building and motivation and cultivating effective team relations.

Qualifications

Required Education, Skills and Experience: Master's degree in computer science or an equivalent numerate discipline. At least 5 years' demonstrable experience in a related field. Strong background in computer science, linear algebra, and probability. Solid experience in Machine Learning and Deep Learning (with a special focus on Transformers). Proven experience in Natural Language Processing and Large Language Models. Proven experience building scalable data pipelines and ETLs. Able to understand scientific papers and develop ideas into executable code. Proven track record of innovation in creating novel algorithms and publishing the results in AI conferences/journals.
Languages and technologies: Python, SQL, PySpark, Databricks, Pandas/Polars, PyArrow, PyTorch, Hugging Face, git. A proactive, constructive attitude, intellectual curiosity, and persistence in finding answers to questions. A proficient level of interpersonal and communication skills (English level B2 minimum). Keen to work as part of a diverse team of international colleagues in a global, inclusive culture.

Preferred Education, Skills and Experience: PhD in science (NLP/Data Science preferred) and specialized courses in one of the above-mentioned fields. Experience working with large real-world datasets and scalable ML solutions. Previous experience in e-commerce, retail, and/or FMCG/Consumer Insight business. Agile development methodologies (Scrum or Scaled Agile).

Additional Information

Our Benefits: Flexible working environment. Volunteer time off. LinkedIn Learning. Employee Assistance Program (EAP).

About NIQ

NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion

NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
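As a rough sketch of the dataset-to-fine-tuning loop the posting describes, here is a minimal example using the Hugging Face libraries named in the stack; the model checkpoint, toy data, and classification task are placeholders, not NIQ's actual models or datasets.

```python
# Toy fine-tuning loop: build a small labelled dataset, tokenize it, and
# train a Transformer classifier to gauge the impact of a data version.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

data = Dataset.from_dict({           # stand-in for a curated training set
    "text": ["great product", "arrived broken"],
    "label": [1, 0],
})

checkpoint = "distilbert-base-uncased"   # placeholder model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint,
                                                           num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length",
                     max_length=64)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=data.map(tokenize, batched=True),
)
trainer.train()  # comparing data versions would mean re-running with each set
```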

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Forward Deployed Engineer at Salesforce, you will play a crucial role in delivering transformative AI solutions to our strategic clients. Your responsibilities will include leading the design, development, and implementation of bespoke solutions using cutting-edge technologies like the Agentforce platform. You will be at the forefront of driving technical vision, mentoring team members, and ensuring the successful delivery of mission-critical AI applications in real-world environments.

Your impact will be significant as you lead the architectural design of scalable production systems, strategize across complex data ecosystems, drive innovation on the Agentforce platform, and operate with a proactive and strategic mindset. Building strong relationships with senior client teams, ensuring seamless deployment, and optimizing solutions for long-term reliability will be key aspects of your role. Additionally, you will act as a bridge between customer needs and product evolution, providing valuable feedback to shape future enhancements.

To excel in this role, you need a Bachelor's degree in Computer Science or a related field and 5+ years of experience delivering scalable production solutions. Proficiency in programming languages such as JavaScript, Java, and Python, along with expertise in AI technologies, is essential. Strong communication skills, a proactive attitude, and the ability to travel as needed are also important qualifications. Preferred qualifications include expert-level experience with Salesforce Data Cloud and the Agentforce platform, as well as knowledge of Salesforce CRM across various clouds. Experience developing complex conversational AI solutions and Salesforce platform certifications would be advantageous. If you are passionate about leveraging AI to drive business transformation and have a track record of impactful delivery in agile environments, this role offers a unique opportunity to make a difference.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Embark on a transformative journey as a Data Scientist AI/ML - AVP at Barclays in the Group Control Quantitative Analytics team, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

Group Control Quantitative Analytics (GCQA) is a global organization of highly specialized data scientists working on Artificial Intelligence, Machine Learning, and GenAI model development and model management, including governance and monitoring. GCQA is led by Remi Cuchillo under Lee Gregory, Chief Data and Analytics Officer (CDAO) in Group Control. GCQA is responsible for developing and managing AI/ML/GenAI models (including governance and regular model monitoring) and providing analytical support across different areas within Barclays, including Fraud, Financial Crime, Customer Due Diligence, Controls, and Security.

The Data Scientist position provides project-specific leadership in building targeting solutions that integrate effectively into existing systems and processes while delivering strong and consistent performance. Working with the GC CDAO team, the Quantitative Analytics Data Scientist role provides expertise in project design, predictive model development, validation, monitoring, tracking, and implementation.

To be successful in this role, you should possess the following skillsets: Python programming; knowledge of Artificial Intelligence and Machine Learning algorithms, including NLP; SQL; Spark/PySpark; predictive model development; model lifecycle and model management, including monitoring, governance, and implementation; DevOps tools such as Git/Bitbucket; and project management using JIRA. Some other highly valued skills include: DevOps tools such as TeamCity and Jenkins; knowledge of the financial/banking domain; working knowledge of GenAI tools; AWS; and Databricks.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in our Noida office.

Purpose of the role

To design, develop, implement, and support mathematical, statistical, and machine learning models and analytics used in business decision-making.

Accountabilities

Design analytics and modeling solutions for complex business problems using domain expertise. Collaborate with technology to specify any dependencies required for analytical solutions, such as data, development environments, and tools. Develop high-performing, comprehensively documented analytics and modeling solutions, demonstrating their efficacy to business users and independent validation teams. Implement analytics and models in accurate, stable, well-tested software and work with technology to operationalize them. Provide ongoing support for the continued effectiveness of analytics and modeling solutions to users. Demonstrate conformance to all Barclays Enterprise Risk Management Policies, particularly Model Risk Policy. Ensure all development activities are undertaken within the defined control environment.

Assistant Vice President Expectations

To advise and influence decision-making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions.
Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L, Listen and be authentic; E, Energize and inspire; A, Align across the enterprise; D, Develop others.

Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialization to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. They will consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues; identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda; and take ownership of managing risk and strengthening controls in relation to the work done.

Perform work that is closely related to that of other areas, which requires an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Collaborate with other areas of work, and with business-aligned support areas, to keep up to speed with business activity and business strategy. Engage in complex analysis of data from multiple internal and external sources, such as procedures and practices in other areas, teams, and companies, to solve problems creatively and effectively. Communicate complex information; "complex" information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They are also expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
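For a concrete, if simplified, picture of the predictive-model lifecycle steps listed above (develop, validate, monitor), here is a hedged Python sketch on synthetic data; it is not a Barclays model, and the features and metric choice are invented for the example.

```python
# Fit a binary classifier on synthetic data and report a validation metric,
# mirroring the develop-validate-monitor loop described in the role.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                      # invented features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# A held-out AUC is a typical validation/monitoring statistic; production
# monitoring would recompute it on fresh data over time.
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC: {auc:.3f}")
```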

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

Alphanext is a global talent solutions company with offices in London, Pune, and Indore. We connect top-tier technical talent with forward-thinking organizations to drive innovation and transformation through technology.

We are seeking a Senior Data Integration Engineer to take charge of designing, building, and governing scalable, high-performance data pipelines across enterprise systems. The ideal candidate will have extensive experience in data engineering and integration, particularly within manufacturing, retail, and supply chain ecosystems. This role plays a crucial part in ensuring near-real-time data flows, robust data quality, and seamless integration among ERP, WMS, commerce, and finance platforms, thereby enabling AI and analytics capabilities throughout the enterprise.

Key Responsibilities:
- Design and maintain ELT/ETL pipelines that integrate systems such as BlueCherry ERP, Manhattan WMS, and Shopify Plus.
- Develop event-driven architectures utilizing Azure Service Bus, Kafka, or Event Hubs for real-time data streaming.
- Define and publish data contracts and schemas (JSON/Avro) in the enterprise Data Catalog to ensure lineage and governance.
- Automate reconciliation processes with workflows that detect discrepancies, raise alerts, and monitor data-quality SLAs.
- Lead code reviews, establish integration playbooks, and guide onshore/offshore engineering teams.
- Collaborate with the Cybersecurity team to implement encryption, PII masking, and audit-compliant data flows.
- Facilitate AI and analytics pipelines, including feeds for feature stores and streaming ingestion to support demand forecasting and GenAI use cases.

Year-One Deliverables:
- Replace the existing nightly CSV-based exchange between BlueCherry and WMS with a near-real-time event bus integration.
- Launch a unified product master API that feeds PLM, OMS, and e-commerce within 6 months.
- Automate three-way reconciliation of PO, packing list, and warehouse receipt to support traceability audits (e.g., BCI cotton).
- Deploy a data quality dashboard with rule-based alerts and SLA-tracking metrics.

Must-Have Technical Skills:
- 5+ years of experience in data engineering or integration-focused roles.
- Proficiency with at least two of the following: Azure Data Factory, Databricks, Kafka/Event Hubs, dbt, SQL Server, Logic Apps, Python.
- Strong SQL skills and experience with a compiled or scripting language (Python, C#, or Java).
- Proven track record of integrating ERP, WMS, PLM, or similar retail/manufacturing systems.
- Expertise in data modeling, schema design (JSON/Avro), and schema versioning.
- Working knowledge of CI/CD pipelines and infrastructure-as-code using tools like GitHub Actions and Azure DevOps.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (preferred).
- Exceptional problem-solving skills, an analytical mindset, and attention to data governance.
- Strong communication and leadership abilities, with a history of mentoring and collaborating with teams.
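To illustrate the event-driven, contract-enforced pattern these responsibilities describe, here is a hedged Python sketch using the confluent-kafka client; the broker address, topic name, and event fields are assumptions for the example, and a production pipeline would more likely use Avro with a schema registry.

```python
# Publish a contract-checked "warehouse receipt" event to a Kafka topic.
import json
from confluent_kafka import Producer

# A simple JSON data contract for the payload (illustrative field names).
EVENT_SCHEMA_FIELDS = {"po_number", "sku", "qty_received", "received_at"}

def publish_receipt(producer: Producer, event: dict) -> None:
    missing = EVENT_SCHEMA_FIELDS - event.keys()
    if missing:  # enforce the contract before the event leaves this service
        raise ValueError(f"event violates contract, missing: {missing}")
    producer.produce("warehouse.receipts", value=json.dumps(event).encode())

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker
publish_receipt(producer, {
    "po_number": "PO-1001", "sku": "SKU-42",
    "qty_received": 12, "received_at": "2024-01-01T00:00:00Z",
})
producer.flush()  # block until the broker acknowledges delivery
```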

Posted 2 days ago

Apply