
541 Data Manipulation Jobs - Page 18

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 - 4.0 years

5 - 6 Lacs

Pune, Bengaluru

Work from Office

Sr. Analyst - Marketing Measurement & Optimization

Qualifications:
- Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field.
- 3-4 years of proven experience in a similar role.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.

Skills:
- Proficiency in R (tidyverse, lme4/lmerTest, plotly/ggplot2) or Python for data manipulation, modelling, and visualization, and SQL (joins, aggregation, analytic functions) for data handling.
- Ability to handle and analyse marketing data and perform statistical tests.
- Experience with data visualization tools such as Tableau, PowerPoint, and Excel.
- Strong storytelling skills and the ability to generate insights and recommendations.

Responsibilities:
- Understand business requirements and suggest appropriate marketing measurement solutions (Media Mix Modelling, Multi-Touch Attribution, etc.).
- Conduct panel data analysis using fixed effects, random effects, and mixed effects models.
- Perform econometric modelling, including model evaluation, model selection, and results interpretation.
- Understand, execute, and evaluate the data science modelling flow.
- Understand marketing, its objectives, and effectiveness measures such as ROI/ROAS.
- Familiarity with marketing channels, performance metrics, and the conversion funnel.
- Experience with media mix modelling, ad-stock and saturation effects, multi-touch attribution, rule-based attribution, and media mix optimization.
- Knowledge of Bayes' theorem, Shapley values, Markov chains, response curves, marginal ROI, halo effects, and cannibalization.
- Experience handling marketing data and performing data QA and manipulation tasks such as joins/merges, aggregation and segregation, and appends.

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
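The ad-stock effect mentioned in the responsibilities is commonly modelled as a geometric carry-over of past media spend into later periods. A minimal Python sketch (the decay rate of 0.5 and the spend figures are illustrative assumptions, not from the posting):

```python
import numpy as np

def geometric_adstock(spend, decay=0.5):
    """Carry a share of past media spend into each period:
    adstock[t] = spend[t] + decay * adstock[t-1]."""
    adstock = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for t, x in enumerate(spend):
        carry = x + decay * carry
        adstock[t] = carry
    return adstock

# A burst of spend in week 1 keeps contributing in weeks 2-3,
# then a new burst in week 4 adds on top of the remaining carry-over.
weekly_spend = np.array([100.0, 0.0, 0.0, 50.0])
print(geometric_adstock(weekly_spend, decay=0.5))  # 100, 50, 25, 62.5
```

In a media mix model, the transformed series (rather than raw spend) would enter the regression, so that lagged advertising effects are captured.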

Posted 2 months ago

Apply


4.0 - 8.0 years

3 - 8 Lacs

Thane, Navi Mumbai, Mumbai (All Areas)

Hybrid

- Responsible for worldwide data quality in our operations management system
- Create, manage, and support reports & dashboards per user requests
- Identify, reconcile, and correct data discrepancies impacting global reporting
- Support offices that use our reporting tools
- Responsible for security products supply chain management globally
- Ensure security products are correctly used and tracked by commercial offices
- Monitor stocks and ensure production supply (batches) in partnership with the supplier
- Control and validate supplier invoicing

Posted 2 months ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Kolkata

Work from Office

Skills: Azure DevOps, AWS (major AWS services such as RDS-MSQL, Elastic Beanstalk), EKS, Terraform, Python, Kubernetes, SRE

Job Summary
We are seeking an experienced Infra Technology Specialist with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in SRE, automation, database and SQL basics, Terraform, AWS, and Python. This hybrid role involves rotational shifts and does not require travel. The candidate will play a crucial role in ensuring the stability and efficiency of our infrastructure.

Responsibilities:
- Oversee the stability and performance of the company's infrastructure.
- Implement automation solutions to streamline operations and reduce manual tasks.
- Manage and optimize databases, ensuring data integrity and availability.
- Utilize SQL to perform complex queries and data analysis.
- Apply Terraform to manage infrastructure as code and automate provisioning.
- Leverage AWS services to build and maintain scalable and secure cloud environments.
- Develop and maintain scripts using Python to automate routine tasks and processes.
- Monitor system performance and troubleshoot issues to ensure high availability.
- Collaborate with cross-functional teams to design and implement infrastructure solutions.
- Provide technical support and guidance to team members on infrastructure-related matters.
- Ensure compliance with security policies and best practices in all infrastructure activities.
- Participate in on-call rotations to provide 24/7 support for critical infrastructure components.
- Continuously evaluate and adopt new technologies to improve infrastructure efficiency.

Qualifications:
- Strong background in Site Reliability Engineering (SRE) with hands-on experience.
- Proficiency in automation tools and techniques to enhance operational efficiency.
- In-depth knowledge of database management and SQL for data manipulation and analysis.
- Expertise in using Terraform for infrastructure as code and automated provisioning.
- Experience with AWS cloud services for building and maintaining cloud environments.
- Skilled in Python programming for scripting and automation purposes.
- Excellent problem-solving abilities and attention to detail.
- Strong communication skills to collaborate effectively with team members.
- Adaptability to rotational shifts and a hybrid work model.
- A proactive approach to learning and adopting new technologies.
- A high standard of security and compliance in all infrastructure activities.
- Mentorship and support for junior team members.
- A commitment to continuous improvement and innovation.
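Python automation scripts of the kind described here typically wrap flaky cloud-API calls in retries with exponential backoff. A small, library-agnostic sketch (the function names and backoff parameters are illustrative assumptions, not tied to any specific AWS SDK):

```python
import time

def retry(operation, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call `operation` until it succeeds, backing off exponentially
    between attempts; re-raise the last error if all attempts fail."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# Simulated flaky call: fails twice, then succeeds on the third try.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(retry(flaky, sleep=lambda s: None))  # prints "ok"
```

The `sleep` parameter is injected so the backoff can be disabled in tests; in production code the default `time.sleep` applies.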

Posted 2 months ago

Apply


2.0 - 3.0 years

7 - 11 Lacs

Ahmedabad

Work from Office

Skills and requirements
- Proficiency in Python and experience with libraries such as NumPy, Pandas, Matplotlib, and Seaborn for data manipulation and visualization.
- Understanding of core machine learning algorithms, including linear/logistic regression, decision trees, random forests, SVM, KNN, K-Means, and neural networks.
- Experience with ML frameworks such as Scikit-learn, TensorFlow, and PyTorch for building and training models.
- Ability to preprocess data, including handling missing values, normalization, encoding categorical variables, and feature engineering.
- Knowledge of model evaluation techniques such as accuracy, precision, recall, F1-score, confusion matrix, and cross-validation.
- Mathematical foundation in linear algebra, probability, statistics, and basic calculus to understand and fine-tune models.
- Version control skills using Git/GitHub and familiarity with Jupyter Notebooks or similar development environments.
- Basic understanding of deep learning, computer vision, or NLP concepts (optional but preferred).
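The evaluation metrics listed above (accuracy, precision, recall, F1) all derive from the four cells of a binary confusion matrix. A self-contained sketch computing them from scratch, with made-up labels for illustration:

```python
def binary_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

# 2 true positives, 2 true negatives, 1 false positive, 1 false negative
m = binary_metrics([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 1])
print(m)
```

In practice `sklearn.metrics` provides the same calculations, but seeing the ratios written out makes clear why precision penalizes false positives while recall penalizes false negatives.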

Posted 2 months ago

Apply

8.0 - 13.0 years

6 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Sponsor-dedicated: Working fully embedded within one of our pharmaceutical clients, with the support of Cytel right behind you, you'll be at the heart of our client's innovation. As a Senior Statistical Programmer you will be dedicated to one of our global pharmaceutical clients: a company that is driving the next generation of patient treatment, where individuals are empowered to work with autonomy and ownership. This is an exciting time to be a part of this new program.

Position Overview:
As a Senior Statistical Programmer, you will leverage your advanced SAS programming skills and proficiency in CDISC standards (SDTM/ADaM) to support or lead one or more Phase I-IV clinical trials. This role can be performed fully remote.

Our values: We believe in applying scientific rigor to reveal the full promise inherent in data. We nurture intellectual curiosity and encourage everyone to approach new challenges with enthusiasm and the desire for discovery. We believe in collaboration and invite a diversity of perspectives, drawing on a variety of talents to create a wealth of possibilities. We prize innovation and seek intelligent solutions using leading-edge technology. Here at Cytel we want our employees to succeed, and we enable this success through consistent training, development, and support.

To be successful in this position you will have:
- Bachelor's degree in one of the following fields: Statistics, Computer Science, Mathematics, etc.
- At least 8 years of SAS programming with clinical trial data in the pharmaceutical/biotech industry with a bachelor's degree or equivalent, or at least 6 years of related experience with a master's degree or above.
- Study lead experience, preferably juggling multiple projects simultaneously.
- Strong SAS data manipulation, analysis, and reporting skills.
- Solid experience implementing the latest CDISC SDTM/ADaM standards.
- Strong QC/validation skills.
- Good ad-hoc reporting skills.
- Proficiency in efficacy analysis.
- Familiarity with the drug development life cycle and experience with the manipulation, analysis, and reporting of clinical trials data.
- Submissions experience utilizing define.xml and other submission documents.
- Experience supporting immunology, respiratory, or oncology studies would be a plus.
- Excellent analytical troubleshooting skills.
- Ability to provide quality output and deliverables, in adherence with challenging timelines.
- Ability to work effectively in a globally dispersed team environment with cross-cultural partners.

How you will contribute:
- Performing data manipulation, analysis, and reporting of clinical trial data, both safety and efficacy (ISS/ISE), utilizing SAS programming
- Generating and validating SDTM and ADaM datasets/analysis files, and tables, listings, and figures (TLFs)
- Production and QC/validation programming
- Generating complex ad-hoc reports utilizing raw data
- Applying strong understanding/experience of efficacy analysis
- Creating and reviewing submission documents and eCRTs
- Communicating with and/or responding to internal cross-functional teams and the client regarding project specifications, status, issues, or inquiries
- Performing lead duties when called upon
- Serving as a team player, with a willingness to go the extra distance to get results and meet deadlines
- Being adaptable and flexible when priorities change

Posted 2 months ago

Apply

8.0 - 13.0 years

12 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

ACCOUNTABILITIES:
- Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures.
- Analyzes business needs and creates software solutions.
- Responsible for preparing design documentation.
- Prepares test data for unit, string, and parallel testing.
- Evaluates and recommends software and hardware solutions to meet user needs.
- Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements.
- Works with business and development teams to clarify requirements to ensure testability.
- Drafts, revises, and maintains test plans, test cases, and automated test scripts.
- Executes test procedures according to software requirements specifications.
- Logs defects and makes recommendations to address defects.
- Retests software corrections to ensure problems are resolved.
- Documents evolution of testing procedures for future replication.
- May conduct performance and scalability testing.

RESPONSIBILITIES:
- Plans, conducts, and leads assignments, generally involving moderate-to-high-budget projects or more than one project.
- Manages user expectations regarding appropriate milestones and deadlines.
- Assists in training, work assignment, and checking of less experienced developers.
- Serves as technical consultant to leaders in the IT organization and functional user groups.
- Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist.
- Performs estimation efforts on complex projects and tracks progress.
- Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors.
- Documents, evaluates, and researches test results; documents evolution of testing scripts for future replication.
- Identifies, recommends, and implements changes to enhance the effectiveness of quality assurance strategies.

Power BI Developer
Skills: Power BI, Python, PySpark, and SQL
- 8+ years of experience in Power BI.
- Proficiency in designing data models that support reports and dashboards.
- Mastery of DAX for calculations and data manipulation.
- Skills in transforming raw data into a format suitable for analysis.
- Strong knowledge of SQL for querying databases.
- Ability to create visually appealing and interactive reports and dashboards.
- Proficiency in PySpark and Spark scripting.
- Strong analytical skills to interpret data and provide insights.
- Ability to troubleshoot and resolve data-related issues.

Role/Responsibilities:
- Work on development activities along with lead activities.
- Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently.
- Design, implement, and optimize reporting solutions using Power BI and SQL.
- Utilize PySpark and Spark scripting for data processing and analysis.
- Develop and maintain Power BI reports and dashboards.
- Analyze data to identify trends, patterns, and insights.
- Ensure data accuracy and consistency across reports.

Posted 2 months ago

Apply

3.0 - 4.0 years

12 - 14 Lacs

Chennai, Gurugram

Work from Office

Would you like to contribute to a team that creates impactful software solutions for diverse users? Are you enthusiastic, motivated, and eager to learn and grow in your career?

About Our Team
LexisNexis Legal & Professional serves customers in over 150 countries with 11,800 employees worldwide and is part of RELX, a global company providing information-based analytics and decision tools for professional and business customers. Our organization prioritizes responsible AI and advanced technologies to improve productivity and transform industries, including tools tailored for the legal sector. We believe in fostering innovation while maintaining ethical standards that benefit all stakeholders.

About the Role
This entry-level position offers the chance to learn and apply software development skills within a supportive environment. You will contribute to projects that make a real difference, with the guidance of experienced professionals.

Responsibilities
- Collaborate with team members to develop components of software systems, focusing on simple and effective solutions.
- Contribute to fixing bugs and optimizing software functionality.
- Participate in development processes, code reviews, and best practices under the mentorship of senior engineers.
- Work in diverse development environments (e.g., Agile, Waterfall) while engaging with key stakeholders.
- Stay updated on emerging technologies and trends.

Requirements
We understand that talent comes from many backgrounds. If you meet the following criteria, we encourage you to apply:
- Basic understanding of software development methodologies (e.g., Agile or Waterfall).
- Familiarity with data manipulation languages and data modeling principles.
- Knowledge of programming languages, such as Python, and interest in AI/ML technologies.
- Willingness to learn and adapt to new processes and technologies.
- Strong ability to collaborate and communicate effectively.

Working in a Way that Works for You
We believe in flexible working arrangements to ensure a healthy work-life balance. Whether you're seeking professional development, support for personal responsibilities, or long-term goals, we are here to help.

Working for You
We strive to create an environment where everyone feels valued and supported. Here are some of the benefits we offer:
- Comprehensive Health Insurance: Coverage extends to your family.
- Enhanced Health Insurance Options: Competitive rates secured by the company.
- Group Life and Accident Insurance: Financial security and protection.
- Flexible Working Arrangement: Balance your work and personal life effectively.
- Employee Assistance Program: Access to personal and work-related support services.
- Medical Screening: Promoting your health and well-being.
- Family Benefits: Inclusive support for maternity, paternity, and adoption.
- Recognition Programs: Celebrate milestones and achievements.
- Paid Time Off: Various leave options to meet diverse needs.

About the Business
LexisNexis Legal & Professional provides legal, regulatory, and business information and analytics that enhance productivity, decision-making, and outcomes worldwide. As a digital pioneer, LexisNexis was the first to bring legal and business information online with its Lexis and Nexis services. LexisNexis, a division of RELX, is an equal opportunity employer; qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. We are committed to providing a fair and accessible hiring process.

If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form at https://forms.office.com/r/eVgFxjLmAK, or contact 1-855-833-5120. Please read our Candidate Privacy Policy.

Posted 2 months ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Chennai, Guindy

Work from Office

Data Scientist
Chennai - Guindy, India | Information Technology | 17006

Overview
The Data Scientist role involves collecting, cleaning, analyzing, and interpreting large datasets to uncover insights and build predictive models, and using data visualization tools to present findings, make recommendations, and communicate the work to stakeholders.

Responsibilities
- Data Collection and Preparation: Gather data from various sources, clean and preprocess it, and transform it into a usable format for analysis.
- Data Analysis: Conduct exploratory data analysis (EDA), identify patterns and trends, and use statistical methods to extract meaningful insights.
- Model Building: Develop and implement machine learning algorithms and predictive models to solve business problems.
- Data Visualization: Create clear and informative visualizations to present findings and communicate insights to stakeholders.
- Communication: Clearly communicate findings, recommendations, and the implications of data-driven decisions to both technical and non-technical audiences.
- Problem Solving: Identify business problems, develop data-driven solutions, and evaluate the effectiveness of proposed solutions.

Requirements
- Programming: Proficiency in languages like Python and R, as well as familiarity with data manipulation libraries (e.g., Pandas, NumPy).
- Statistics and Probability: A strong foundation in statistical concepts, probability distributions, and hypothesis testing.
- Machine Learning: Knowledge of various machine learning algorithms, model selection, and evaluation techniques.
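The data preparation step described above (handling missing values, then aggregating for insight) can be sketched in a few lines of pandas. The sample records and column names below are invented for illustration:

```python
import pandas as pd

# Hypothetical raw sales records with one missing revenue value
raw = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "revenue": [120.0, None, 80.0, 100.0],
})

# Clean: fill missing revenue with the column median (here, 100.0)
raw["revenue"] = raw["revenue"].fillna(raw["revenue"].median())

# Explore: total revenue by region as a first EDA cut
summary = raw.groupby("region")["revenue"].sum()
print(summary)
```

The imputation strategy (median fill) is one common choice; depending on the data, dropping rows or model-based imputation may be more appropriate.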

Posted 2 months ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Job Description: Reltio (MDM, Java, and Python)

Experience: 7+ years of experience working with Reltio MDM in a professional setting

Technical Skills:
- Strong understanding of Master Data Management principles and concepts
- Design, configure, and manage the Reltio data model, including match & merge rules, survivorship rules, and validation rules
- Manage Reference Data Management (RDM), user management, UI configuration, and the handling of lifecycle actions and workflows
- Develop and optimize data loading/exporting processes into/from Reltio
- Work with Reltio Integration Hub to ensure seamless data integration
- Strong proficiency in SQL for data manipulation and querying
- Knowledge of Java/Python or another programming/scripting language for data processing and automation
- Familiarity with data modelling concepts
- Understanding of MDM workflow configurations and role-based data governance

Soft Skills:
- Excellent analytical and problem-solving skills with a keen attention to detail
- Strong ability to communicate effectively with both technical and non-technical stakeholders
- Proven ability to work independently and collaborate in a fast-paced environment
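A survivorship rule such as "the most recently updated source record wins" can be prototyped in plain SQL before being configured in an MDM tool. A hedged sketch using SQLite from Python's standard library (the table, columns, and data are hypothetical and do not reflect Reltio's actual schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE source_records (
    entity_id TEXT, source TEXT, email TEXT, updated_at TEXT
);
-- Two source systems disagree about the same customer's email
INSERT INTO source_records VALUES
    ('C1', 'CRM', 'old@example.com', '2024-01-01'),
    ('C1', 'ERP', 'new@example.com', '2024-06-01');
""")

# Survivorship: keep the attribute from the most recently updated record
row = con.execute("""
    SELECT email FROM source_records
    WHERE entity_id = 'C1'
    ORDER BY updated_at DESC
    LIMIT 1
""").fetchone()
print(row[0])  # new@example.com
```

Real MDM configuration adds per-attribute rules (source trust, frequency, recency) rather than a single ORDER BY, but the recency rule above captures the core idea.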

Posted 2 months ago

Apply

15.0 - 20.0 years

13 - 17 Lacs

Gurugram

Work from Office

Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must-have skills: Identity and Access Management (IAM) Operations
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Security Architect, you will define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Your typical day will involve collaborating with various teams to assess security needs, documenting security controls, and transitioning to cloud security-managed operations, all while ensuring compliance with industry standards and best practices. This is a skilled professional who designs, develops, and maintains business intelligence solutions using both Power BI and Excel, responsible for transforming raw data into actionable insights through reports, dashboards, and visualizations. The work involves connecting to various data sources, building data models, and ensuring data accuracy and security.

Roles & Responsibilities:
- Developing and maintaining data models: Designing and implementing data models for Power BI, ensuring data integrity and accuracy.
- Creating reports and dashboards: Building interactive and visually appealing dashboards and reports using Power BI to present data insights.
- Data analysis and visualization: Analyzing data to identify trends, patterns, and insights, and visualizing them effectively using Power BI.
- Gathering requirements from stakeholders: Collaborating with business stakeholders to understand their needs and translate them into technical requirements for Power BI solutions.
- Ensuring data governance and compliance: Implementing data security and access control measures, and ensuring compliance with data governance policies.
- Optimizing Power BI solutions: Troubleshooting, optimizing, and maintaining Power BI solutions for performance and scalability.
- Using Excel for data manipulation and analysis: Employing Excel for data cleansing, transformation, and ad-hoc analysis, potentially as a stepping stone to more complex Power BI solutions.
- Staying updated with industry trends: Keeping abreast of the latest trends and best practices in Power BI development and business intelligence.

Professional & Technical Skills:
- Proficiency with Power BI tools: Expertise in using Power BI Desktop, Power BI Service, and related tools for data modeling, reporting, and dashboard creation.
- Strong analytical skills: Ability to analyze data, identify trends, and draw meaningful conclusions.
- DAX knowledge: Familiarity with Data Analysis Expressions (DAX), the formula language used in Power BI for calculations and data modeling.
- SQL proficiency: Knowledge of SQL for querying and manipulating data from various sources.
- Excel expertise: Strong skills in using Excel for data manipulation, analysis, and reporting, potentially as a foundation for Power BI development.
- Excellent communication skills: Ability to effectively communicate technical concepts to both technical and non-technical audiences.
- Problem-solving skills: Ability to troubleshoot issues, identify root causes, and develop effective solutions.
- Experience with data integration: Knowledge of connecting Power BI to various data sources, including databases, spreadsheets, and cloud services.

Additional Information:
- The candidate should have a minimum of 5 years of experience.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

0.0 - 1.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Python Developer
Location: Bengaluru, Karnataka, India
Experience Level: 3–5 Years
Employment Type: Full-Time

Role Overview
We are seeking a skilled Python Developer with a strong background in data manipulation and analysis using NumPy and Pandas, coupled with proficiency in SQL. The ideal candidate will have experience in building and optimizing data pipelines, ensuring efficient data processing and integration.

Key Responsibilities
- Develop and maintain robust data pipelines and ETL processes using Python, NumPy, and Pandas.
- Write efficient SQL queries for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Implement data validation and quality checks to ensure data integrity.
- Optimize existing codebases for performance and scalability.
- Document processes and maintain clear records of data workflows.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2–5 years of professional experience in Python development.
- Proficiency in NumPy and Pandas for data manipulation and analysis.
- Strong command of SQL and experience with relational databases like MySQL, PostgreSQL, or SQL Server.
- Familiarity with version control systems, particularly Git.
- Experience with data visualization tools and libraries is a plus.

Preferred Skills
- Experience with data visualization libraries such as Matplotlib or Seaborn.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Knowledge of big data tools and frameworks like Spark or Hadoop.
- Understanding of machine learning concepts and libraries.

Why Join Enterprise Minds
Enterprise Minds is a forward-thinking technology consulting firm dedicated to delivering next-generation solutions. By joining our team, you'll work on impactful projects, collaborate with industry experts, and contribute to innovative solutions that drive business transformation.
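The data validation and quality checks mentioned in the responsibilities can be expressed as assertions inside a pipeline transform step. An illustrative sketch with invented column names and rules:

```python
import numpy as np
import pandas as pd

def transform(orders: pd.DataFrame) -> pd.DataFrame:
    """Validate basic data-quality rules, then derive a feature column."""
    out = orders.copy()
    # Quality checks: every row has an ID, no negative amounts
    assert out["order_id"].notna().all(), "missing order_id"
    assert (out["amount"] >= 0).all(), "negative amount"
    # Feature: log-scaled amount for downstream analysis
    out["log_amount"] = np.log1p(out["amount"])
    return out

orders = pd.DataFrame({"order_id": [1, 2], "amount": [0.0, 99.0]})
print(transform(orders)["log_amount"].round(2).tolist())  # [0.0, 4.61]
```

Failing loudly at the transform boundary (rather than letting bad rows flow downstream) is the usual design choice; in a production pipeline, the bare asserts would typically be replaced by a validation library or quarantine step.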

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job TitleSenior Data Engineer/Developer Number of Positions2 : The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They will collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization. Responsibilities: Design, construct, install, test and maintain highly scalable data management systems & Data Pipeline. Ensure systems meet business requirements and industry practices. Build high-performance algorithms, prototypes, predictive models, and proof of concepts. Research opportunities for data acquisition and new uses for existing data. Develop data set processes for data modeling, mining and production. Integrate new data management technologies and software engineering tools into existing structures. Create custom software components and analytics applications. Install and update disaster recovery procedures. Collaborate with data architects, modelers, and IT team members on project goals. Provide senior level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects. Qualifications: Bachelor's degree in computer science, Engineering, or related field, or equivalent work experience. Proven 5-8 years of experience as a Senior Data Engineer or similar role. Experience with big data toolsHadoop, Spark, Kafka, Ansible, chef, Terraform, Airflow, and Protobuf RPC etc. Expert level SQL skills for data manipulation (DML) and validation (DB2). Experience with data pipeline and workflow management tools. Experience with object-oriented/object function scripting languagesPython, Java, Go lang etc. Strong problem solving and analytical skills. Excellent verbal communication skills. 
Good interpersonal skills. Ability to provide technical leadership for the team.
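The SQL expectations above (DML for manipulation, aggregate queries for validation) can be illustrated with a small sketch; the `orders` table and its columns are hypothetical, and Python's built-in sqlite3 stands in for a DB2 warehouse:

```python
import sqlite3

# In-memory database stands in for a real warehouse (the posting names DB2);
# the orders schema below is purely illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

# DML: insert and update rows.
cur.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)],
)
cur.execute("UPDATE orders SET amount = amount * 1.1 WHERE customer = 'acme'")

# Validation: aggregate checks of the kind used in data QA.
row_count = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
null_count = cur.execute("SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]
total = cur.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
```

The same row-count, null-count, and control-total pattern carries over directly to DB2 or any other SQL engine.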

Posted 2 months ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Pune

Work from Office

Job Title: Big Data Engineer
About Us: Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities around the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO: You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.
MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.
The Big Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They will collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
Responsibilities: Design, construct, install, test, and maintain highly scalable data management systems and data pipelines. Ensure systems meet business requirements and industry practices. Build high-performance algorithms, prototypes, predictive models, and proofs of concept. Research opportunities for data acquisition and new uses for existing data. Develop data set processes for data modeling, mining, and production. Integrate new data management technologies and software engineering tools into existing structures. Create custom software components and analytics applications. Install and update disaster recovery procedures. Collaborate with data architects, modelers, and IT team members on project goals. Provide senior-level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects.
Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience. Proven 5-8 years of experience as a Senior Data Engineer or in a similar role. Experience with big data tools: PySpark, Hadoop, Spark, Kafka, Ansible, Chef, Terraform, Airflow, Protobuf RPC, etc. Expert-level SQL skills for data manipulation (DML) and validation (DB2). Experience with data pipeline and workflow management tools. Experience with object-oriented/object function scripting languages: Python, Java, Go, etc. Strong problem-solving and analytical skills. Excellent verbal communication skills. Good interpersonal skills. Ability to provide technical leadership for the team.
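The workflow-management skills listed above (Airflow and similar tools) come down to running tasks in dependency order. Here is a minimal sketch using a hypothetical extract/transform/load pipeline; a real deployment would use Airflow's DAG API rather than this hand-rolled executor:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract feeds both a transform and a QA check,
# and the load step waits for both, as in an Airflow DAG definition.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

def run_pipeline(dag):
    """Execute tasks in an order that respects every dependency edge."""
    executed = []
    for task in TopologicalSorter(dag).static_order():
        executed.append(task)  # a real pipeline would invoke the task here
    return executed

order = run_pipeline(dag)
```

A scheduler like Airflow adds retries, backfills, and parallelism on top of exactly this topological ordering.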

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job Title: Senior Data Engineer/Developer
Number of Positions: 2
The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They will collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
Responsibilities: Design, construct, install, test, and maintain highly scalable data management systems and data pipelines. Ensure systems meet business requirements and industry practices. Build high-performance algorithms, prototypes, predictive models, and proofs of concept. Research opportunities for data acquisition and new uses for existing data. Develop data set processes for data modeling, mining, and production. Integrate new data management technologies and software engineering tools into existing structures. Create custom software components and analytics applications. Install and update disaster recovery procedures. Collaborate with data architects, modelers, and IT team members on project goals. Provide senior-level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects.
Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience. Proven 5-8 years of experience as a Senior Data Engineer or in a similar role. Experience with big data tools: Hadoop, Spark, Kafka, Ansible, Chef, Terraform, Airflow, Protobuf RPC, etc. Expert-level SQL skills for data manipulation (DML) and validation (DB2). Experience with data pipeline and workflow management tools. Experience with object-oriented/object function scripting languages: Python, Java, Go, etc. Strong problem-solving and analytical skills. Excellent verbal communication skills.
Good interpersonal skills. Ability to provide technical leadership for the team.

Posted 2 months ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About Us: BCE Global Tech is a dynamic and innovative company dedicated to pushing the boundaries of technology. We are on a mission to modernize global connectivity, one connection at a time. Our goal is to build the highway to the future of communications, media, and entertainment, emerging as a powerhouse within the technology landscape in India. We bring ambitions to life through design thinking that bridges the gaps between people, devices, and beyond, fostering unprecedented customer satisfaction through technology. At BCE Global Tech, we are guided by our core values of innovation, customer-centricity, and a commitment to progress. We harness cutting-edge technology to provide business outcomes with positive societal impact. Our team of thought leaders is pioneering advancements in 5G, MEC, IoT, and cloud-native architecture. We offer continuous learning opportunities, innovative projects, and a collaborative work environment that empowers our employees to grow and succeed.
Responsibilities: Lead the migration of data pipelines from Hadoop to Google Cloud Platform (GCP). Design, develop, and maintain data workflows using Airflow and custom flow solutions. Implement infrastructure as code using Terraform. Develop and optimize data processing applications using Java Spark or Python Spark. Utilize Cloud Run and Cloud Functions for serverless computing. Manage containerized applications using Docker. Understand and enhance existing Hadoop pipelines. Write and execute unit tests to ensure code quality. Deploy data engineering solutions in production environments. Craft and optimize SQL queries for data manipulation and analysis.
Requirements: 7-8 years of experience in data engineering or related fields. Proven experience migrating Hadoop pipelines to GCP. Proficiency in Airflow and custom flow solutions. Strong knowledge of Terraform for infrastructure management. Expertise in Java Spark or Python Spark. Experience with Cloud Run and Cloud Functions. Experience with Dataflow, Dataproc, and Cloud Monitoring tools in GCP. Familiarity with Docker for container management. Solid understanding of Hadoop pipelines. Ability to write and execute unit tests. Experience with deployments in production environments. Strong SQL query skills.
Skills: Excellent teamwork and collaboration abilities. Quick learner with a proactive attitude. Strong problem-solving skills and attention to detail. Ability to work independently and as part of a team. Effective communication skills.
Why Join Us: Opportunity to work with cutting-edge technologies. Collaborative and supportive work environment. Competitive salary and benefits. Career growth and development opportunities.
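Among the responsibilities above is writing and executing unit tests for data code. Here is a hedged sketch with Python's built-in unittest; the `normalize_record` transform and its rules are hypothetical, illustrating the pattern rather than any actual BCE Global Tech code:

```python
import unittest

def normalize_record(record):
    """Clean one raw record: lower-case keys, strip strings, coerce amount to float."""
    cleaned = {k.lower(): (v.strip() if isinstance(v, str) else v)
               for k, v in record.items()}
    cleaned["amount"] = float(cleaned.get("amount", 0.0))
    return cleaned

class NormalizeRecordTest(unittest.TestCase):
    def test_strips_and_coerces(self):
        out = normalize_record({"NAME": "  alice ", "AMOUNT": "12.5"})
        self.assertEqual(out["name"], "alice")
        self.assertEqual(out["amount"], 12.5)

    def test_missing_amount_defaults_to_zero(self):
        self.assertEqual(normalize_record({"name": "bob"})["amount"], 0.0)

# Run the suite programmatically, as a CI step would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeRecordTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Testing each transform in isolation like this is what makes larger Spark or Airflow pipelines safe to refactor.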

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Are you ready to play a key role in transforming Thomson Reuters into a truly data-driven company? Join our Data & Analytics (D&A) function and be part of the strategic ambition to build, embed, and mature a data-driven culture across the entire organization. The Data Architecture organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.
About the Role: In this opportunity as a Data Architect, you will:
Architect and Lead Data Platform Evolution: Spearhead the conceptual, logical, and physical architecture design for our enterprise Data Platform (encompassing areas like our data lake, data warehouse, streaming services, and master data management systems). You will define and enforce data modeling standards, data flow patterns, and integration strategies to serve a diverse audience from data engineers to AI/ML practitioners and BI analysts.
Technical Standards and Best Practices: Research and recommend technical standards, ensuring the architecture aligns with overall technology and product strategy. Be hands-on in implementing core components reusable across applications.
Hands-on Prototyping and Framework Development: While a strategic role, maintain a hands-on approach by designing and implementing proofs of concept and core reusable components/frameworks for the data platform. This includes developing best practices and templates for data pipelines, particularly leveraging dbt for transformations, and ensuring efficient data processing and quality.
Champion Data Ingestion Strategies: Design and oversee the implementation of robust, scalable, and automated cloud data ingestion pipelines from a variety of sources (e.g., APIs, databases, streaming feeds) into our AWS-based data platform, utilizing services such as AWS Glue, Kinesis, Lambda, S3, and potentially third-party ETL/ELT tools. Design and optimize solutions utilizing our core cloud data stack, including deep expertise in Snowflake (e.g., architecture, performance tuning, security, data sharing, Snowpipe, Streams, Tasks) and a broad range of AWS data services (e.g., S3, Glue, EMR, Kinesis, Lambda, Redshift, DynamoDB, Athena, Step Functions, MWAA/Managed Airflow) to build and automate end-to-end analytics and data science workflows.
Data-Driven Decision-Making: Make quick and effective data-driven decisions, demonstrating strong problem-solving and analytical skills. Align strategies with company goals.
Stakeholder Collaboration: Collaborate closely with external and internal stakeholders, including business teams and product managers. Define roadmaps, understand functional requirements, and lead the team through the end-to-end development process.
Team Collaboration: Work in a collaborative, team-oriented environment, sharing information and diverse ideas, and partnering with cross-functional and remote teams.
Quality and Continuous Improvement: Focus on quality, continuous improvement, and technical standards. Keep the service focused on reliability, performance, and scalability while adhering to industry best practices.
Technology Advancement: Continuously update yourself with next-generation technology and development tools. Contribute to process development practices.
About You: You're a fit for the role of Data Architect, Data Platform if your background includes:
Educational Background: Bachelor's degree in Information Technology.
Experience: 10+ years of IT experience with at least 5 years in a lead design or architectural capacity.
Technical Expertise: Broad knowledge and experience with cloud-native software design, microservices architecture, and data warehousing, and proficiency in Snowflake.
Cloud and Data Skills: Experience building and automating end-to-end analytics pipelines on AWS; familiarity with NoSQL databases.
Data Pipeline and Ingestion Mastery: Extensive experience in designing, building, and automating robust and scalable cloud data ingestion frameworks and end-to-end data pipelines on AWS, including various ingestion patterns (batch, streaming, CDC) and tools.
Advanced Data Modeling: Demonstrable expertise in designing and implementing various data models (e.g., relational, dimensional, Data Vault, NoSQL schemas) for transactional, analytical, and operational workloads, with a strong understanding of the data development lifecycle, from requirements gathering to deployment and maintenance.
Leadership: Proven ability to lead architectural discussions, influence technical direction, and mentor data engineers, effectively balancing complex technical decisions with user needs and overarching business constraints.
Programming Skills: Strong programming skills in languages such as Python or Java for data manipulation, automation, and API development.
Data Governance and Security Acumen: Deep understanding and practical experience in designing and implementing solutions compliant with robust data governance principles, data security best practices (e.g., encryption, access controls, masking), and relevant privacy regulations (e.g., GDPR, CCPA).
Containerization and Orchestration: Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
What's in it For You:
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us: Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law.
More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 2 months ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

We are seeking a Senior Customer Data Analyst to join our Customer Data Audience Operations Team within the Marketing & Data organization at Thomson Reuters. Based in Hyderabad, India, this role will support our Customer Data Platform (CDP) operations, helping marketers leverage customer data effectively through audience segmentation and activation.
About the Role: In this role as a Senior Customer Data Analyst, you will: Develop a comprehensive understanding of the CDP data structure, data models, tables, and available data types. Process and fulfill audience segmentation requests from marketing teams through our Workfront ticketing system. Create audience segments in Treasure Data CDP and push them to appropriate activation channels. Collaborate with marketing teams to understand their segmentation needs and provide data-driven solutions. Maintain documentation of segment creation processes and audience definitions. Monitor segment performance and provide recommendations for optimization. Stay current with AI capabilities within Treasure Data's AI Foundry to enhance segmentation strategies. Assist in troubleshooting data issues and ensuring data quality within segments.
About You: You're a fit for the role of Senior Customer Data Analyst if your background includes: Bachelor's degree in Computer Science, Information Technology, Engineering, Statistics, Mathematics, or a related field. 6-8 years of experience in data analysis, data management, or related roles. Proficiency in SQL query writing and data manipulation. Basic understanding of marketing technology platforms (CDP, CRM, Marketing Automation). Ability to translate business requirements into technical specifications. Strong attention to detail and problem-solving skills. Excellent communication skills in English, both written and verbal. Experience with Treasure Data/Snowflake/Eloqua/Salesforce. Knowledge of AI/ML concepts and applications in marketing.
Understanding of data privacy regulations and compliance requirements. Experience with data visualization tools. Basic programming skills (Python, R, etc.). Data analysis and interpretation. Familiarity with cloud-based data platforms. Understanding of relational databases. Microsoft Office Suite (especially Excel). Curiosity and eagerness to learn, a detail-oriented approach, the ability to work in a fast-paced environment, team collaboration, and time management and prioritization abilities.
Shift Timings: 2 PM to 11 PM (IST). Work from office 2 days a week (mandatory).
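The audience-segmentation work described above amounts to expressing a segment definition as a query over customer profiles. A minimal sketch, assuming a hypothetical `customers` table; in a CDP such as Treasure Data the same logic would run as SQL against unified profile tables rather than sqlite3:

```python
import sqlite3

# Hypothetical customer profiles; columns are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE customers (email TEXT, country TEXT, last_active_days INTEGER, opted_in INTEGER)"
)
cur.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)", [
    ("a@example.com", "IN", 5, 1),
    ("b@example.com", "IN", 120, 1),   # lapsed: outside the activity window
    ("c@example.com", "US", 3, 0),     # not opted in
])

# Segment definition: opted-in customers in India active in the last 30 days.
segment = [row[0] for row in cur.execute("""
    SELECT email FROM customers
    WHERE country = 'IN' AND opted_in = 1 AND last_active_days <= 30
""")]
```

The resulting list is what would be pushed to an activation channel such as an email or ad platform.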

Posted 2 months ago

Apply

4.0 - 9.0 years

22 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Description - Workday Prism: A role focused on designing, developing, and maintaining data visualizations, reports, and dashboards using Workday's native analytics tool, Prism Analytics, to extract actionable insights from HR and business data. The role requires a strong understanding of Workday data structures and reporting capabilities, plus the data analysis skills to translate complex business needs into effective reports for stakeholders. Expertise in Workday HCM modules (core HR, payroll, time tracking, etc.) and understanding of Workday data structures. Deep understanding of Prism features, including data blending, calculated fields, data visualization options, and reporting capabilities. Ability to interpret data, identify trends, and draw meaningful insights from complex datasets. Familiarity with SQL, data manipulation techniques, and potentially other data integration tools. Ability to effectively communicate complex data findings to both technical and non-technical audiences.
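The data-blending and calculated-field ideas above can be sketched in plain Python; the departments, figures, and field names are hypothetical and do not reflect the Workday Prism API:

```python
# Two hypothetical HR sources to blend: headcount per department and payroll totals.
headcount = [
    {"dept": "Sales", "employees": 20},
    {"dept": "Engineering", "employees": 50},
]
payroll = {"Sales": 200000.0, "Engineering": 750000.0}

# Blend the sources on department, then derive a calculated field (average pay),
# the same join-then-compute pattern Prism's data blending and calculated
# fields perform inside Workday.
blended = [
    {**row,
     "total_pay": payroll[row["dept"]],
     "avg_pay": payroll[row["dept"]] / row["employees"]}
    for row in headcount
]
```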

Posted 2 months ago

Apply

1.0 - 4.0 years

14 - 19 Lacs

Mumbai

Work from Office

Overview:
Development of Scalable and Efficient AI-Driven Solutions: Create AI solutions that address complex business challenges. Ensure solutions are scalable to handle large data volumes and efficient in performance.
Development and Maintenance of Index Data Platforms: Build and sustain platforms that generate and manage index data. Integrate data from various sources, maintain data accuracy, and ensure reliability.
Development of Data Science-Based Solutions for Quality Checks of Data Points: Ensure the accuracy and reliability of delivered data through advanced data science techniques. Implement metrics, detect anomalies, and validate data points to maintain high data quality.
Responsibilities:
1. Development of Scalable and Efficient AI-Driven Solutions: Create AI/ML solutions that address complex business challenges. Understand business use cases and identify where and how AI can add value. Design, train, and validate machine learning models tailored to specific business needs. Design solutions leveraging generative AI and LLMs. Build cost-effective, scalable, and high-quality data solutions to be used for critical index products. Ensure solutions can handle large volumes of data and scale with business growth. Optimize models and algorithms for performance and resource utilization.
2. Development of AI and Data Science-Based Solutions for Quality Checks of Data Points Delivered Downstream: Ensure the accuracy and reliability of data through advanced data science techniques. Define and implement metrics to assess data quality. Develop models to detect anomalies and inconsistencies in data. Create automated processes to validate data points against predefined standards. Continuously refine quality-check methods based on feedback and new insights.
Qualifications:
Artificial Intelligence (AI) and Machine Learning (ML): Understanding of machine learning algorithms, deep learning, neural networks, and reinforcement learning. Understanding of and experience with generative AI. Proficiency in working with large language models like Gemini, GPT, BERT, and their variants. Experience in fine-tuning and deploying these models for various applications. Familiarity with frameworks like TensorFlow, PyTorch, and Keras.
Programming Languages:
Python: Proficiency in Python for data manipulation, model development, and automation. Experience with libraries such as NumPy, pandas, scikit-learn, and matplotlib.
Java (good to have): Strong skills in Java for building scalable and efficient applications. Knowledge of Java frameworks like Spring.
Oracle (good to have): Expertise in Oracle databases, including SQL and PL/SQL. Experience in database design, optimization, and management.
Data Engineering and Cloud Solutions: Proficiency in cloud platforms like Azure or Google Cloud, leveraging cloud services for flexibility, scalability, and cost-efficiency. Experience with data warehousing solutions and other cloud-based data storage solutions. Experience with Kubernetes and service- or event-driven architectures.
What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed.
Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 2 months ago

Apply

1.0 - 5.0 years

1 - 3 Lacs

Coimbatore

Work from Office

Strong proficiency with advanced Excel functions. Experience transforming large datasets into usable formats. Strong problem-solving skills with excellent communication skills. Comfortable working in both individual and team settings with minimal supervision.
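The advanced Excel skills this role asks for (lookup functions, pivot tables) map directly onto pandas operations, which is how such work is often scaled beyond spreadsheet limits. A minimal sketch with hypothetical data: `merge` plays the role of VLOOKUP and `pivot_table` the role of a pivot table.

```python
import pandas as pd

# Hypothetical order and price tables (illustrative only)
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "sku": ["A", "B", "A", "B"],
    "qty": [5, 3, 2, 7],
})
prices = pd.DataFrame({"sku": ["A", "B"], "unit_price": [10.0, 20.0]})

# VLOOKUP-style join, then a derived column
merged = orders.merge(prices, on="sku", how="left")
merged["amount"] = merged["qty"] * merged["unit_price"]

# Pivot-table-style summary: total amount per SKU
pivot = merged.pivot_table(index="sku", values="amount", aggfunc="sum")
```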

Posted 2 months ago

Apply

3.0 - 7.0 years

22 - 27 Lacs

Hyderabad

Work from Office

Entity :- Accenture Strategy & Consulting Team :- Strategy & Consulting - Global Network Practice :- Supply Chain Analytics Title :- Ind & Func AI Decision Science Consultant Job location :- Bangalore/Gurgaon/Hyderabad/Mumbai Explore an Exciting Career at Accenture Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Strategy and Consulting. The candidate should have a good understanding of statistical/analytical/optimization methods and approaches, be able to bring meaningful data-driven insights supported by statistical concepts, and apply them across the wider supply chain area. The candidate is expected to use data science skills to solve clients' business problems in the supply chain area, and should combine a functional skill set and supply chain domain knowledge with the ability to apply data science skills to supply chain problems. Additionally, the role requires contributing towards asset development initiatives. Mandatory Skills: Must have: Proficiency in data modeling developed through client projects. Extensive use of data-driven techniques, including exploratory data analysis and data pre-processing, to solve business problems. Proficient in using Python/PySpark programming for data manipulation, data visualization, and machine learning models, with good hands-on experience. 
Proficiency in any one of the cloud solutions: Azure, GCP, or AWS. Proficiency in SQL for data preparation, manipulation, and descriptive analysis. Proficient in the supply chain domain. Excellent written and oral communication skills. Good to have: Experience in Simulation and Optimization. Visualization packages like Tableau/Power BI. Exposure to tools like BY/Anaplan/o9/Kinaxis/SAP IBP. Exposure to client interaction. Exposure to business platforms (o9/Kinaxis/BY). Qualifications - Experience and Education: 5-6 years of experience in Machine Learning and/or Optimization Techniques. Master’s degree in technology, engineering, or a quantitative field (e.g. MSc in Statistics and Operations Research, M Tech in Industrial Engineering, Applied Math/Statistics, Computer Science). Certifications in any one or two of the following areas will be an added advantage: Python, AI/ML, Optimization, Simulation, any of the cloud platforms (Azure/GCP/AWS).
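The SQL data-preparation skills this role names (joins, aggregation, descriptive analysis) can be sketched end to end with Python's built-in sqlite3 module. The tables and values below are hypothetical, used only to show the join-then-aggregate pattern.

```python
import sqlite3

# In-memory database with two hypothetical supply-chain tables
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (id INTEGER, product_id INTEGER, qty INTEGER);
CREATE TABLE products (id INTEGER, name TEXT);
INSERT INTO orders VALUES (1, 10, 2), (2, 10, 3), (3, 20, 1);
INSERT INTO products VALUES (10, 'widget'), (20, 'gadget');
""")

# Join the fact table to its dimension, then aggregate per product
rows = cur.execute("""
    SELECT p.name, SUM(o.qty) AS total_qty
    FROM orders o JOIN products p ON o.product_id = p.id
    GROUP BY p.name ORDER BY total_qty DESC
""").fetchall()
```

The same JOIN/GROUP BY shape carries over unchanged to warehouse engines such as those on Azure, GCP, or AWS.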

Posted 2 months ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Gurugram

Work from Office

Entity :- Accenture Strategy & Consulting Team :- Global Network Data & AI Practice :- Supply Chain Analytics Title :- Ind & Func AI Decision Science Consultant Job location :- Bangalore/Gurgaon/Hyderabad/Mumbai Accenture's Global Network Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition. As part of our Data & AI practice, you will join a worldwide network of smart and driven colleagues experienced in leading AI/ML/statistical tools, methods, and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Explore an Exciting Career at Accenture Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Strategy and Consulting. The candidate should have a good understanding of statistical/analytical/optimization methods and approaches, be able to bring meaningful data-driven insights supported by statistical concepts, and apply them across the wider supply chain area. The candidate is expected to use data science skills to solve clients' business problems in the supply chain area, and should combine a functional skill set and supply chain domain knowledge with the ability to apply data science skills to supply chain problems. 
Additionally, the role requires contributing towards asset development initiatives. Qualifications - Experience and Education: 5-6 years of experience in Machine Learning and/or Optimization Techniques. Master's degree in technology, engineering, or a quantitative field (e.g. MSc in Statistics and Operations Research, M Tech in Industrial Engineering, Applied Math/Statistics, Computer Science). Certifications in any one or two of the following areas will be an added advantage: Python, AI/ML, Optimization, Simulation, any of the cloud platforms (Azure/GCP/AWS). Mandatory Skills: Must have: Proficiency in data modeling developed through client projects. Extensive use of data-driven techniques, including exploratory data analysis and data pre-processing, to solve business problems. Proficient in using Python/PySpark programming for data manipulation, data visualization, and machine learning models, with good hands-on experience. Proficiency in any one of the cloud solutions: Azure, GCP, or AWS. Proficiency in SQL for data preparation, manipulation, and descriptive analysis. Proficient in the supply chain domain. Excellent written and oral communication skills. Good to have: Experience in Simulation and Optimization. Visualization packages like Tableau/Power BI. Exposure to tools like BY/Anaplan/o9/Kinaxis/SAP IBP. Exposure to client interaction. Exposure to business platforms (o9/Kinaxis/BY).
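The machine-learning side of this role often starts with something as simple as fitting a trend to demand history. A minimal sketch with hypothetical, perfectly linear weekly demand, using NumPy's least-squares polynomial fit:

```python
import numpy as np

# Hypothetical weekly demand with a linear trend (illustrative data)
weeks = np.arange(10)
demand = 50 + 3 * weeks  # exactly linear, so the fit recovers the trend

# Least-squares line fit: returns [slope, intercept]
slope, intercept = np.polyfit(weeks, demand, 1)
forecast_week_12 = slope * 12 + intercept
```

Real supply-chain forecasting adds seasonality, noise, and model selection on top, but the fit-then-extrapolate loop is the same.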

Posted 2 months ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Delhi / NCR

Hybrid

8+ years of experience in data engineering or a related field. Strong expertise in Snowflake including schema design, performance tuning, and security. Proficiency in Python for data manipulation and automation. Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.). Experience with DBT for data transformation and documentation. Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect). Strong SQL skills and experience with large-scale data sets. Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
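The orchestration frameworks this role mentions (Airflow, Prefect) are built around one core idea: tasks form a directed acyclic graph and run in dependency order. A minimal sketch of that idea using only the standard library's graphlib; the task names are hypothetical and this is not the Airflow API.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on,
# mirroring an ELT pipeline: extract -> transform -> test -> load
dag = {
    "extract": set(),
    "transform": {"extract"},
    "test": {"transform"},
    "load": {"transform", "test"},
}

# A valid execution order respecting every dependency edge
order = list(TopologicalSorter(dag).static_order())
```

Schedulers add retries, parallelism, and state tracking on top, but the topological ordering above is the contract they all enforce.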

Posted 2 months ago

Apply