Jobs
Interviews

1052 ETL Processes Jobs - Page 2

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 6.0 years

0 Lacs

Jaipur, Rajasthan

On-site

As a Data Engineer at our company, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure to ensure accurate, timely, and accessible data for our organization's growth and data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory and Databricks for data ingestion, transformation, and movement.
- Develop and optimize ETL processes to facilitate efficient data flow and transformation.
- Maintain Azure Data Lake solutions for efficient storage and retrieval of large datasets.
- Collaborate with Azure Synapse Analytics to build scalable data warehousing solutions for advanced analytics and reporting.
- Integrate various data sources into MS-Fabric, ensuring data consistency, quality, and accessibility.
- Optimize data processing workflows and storage solutions to enhance performance and reduce costs.
- Manage and optimize SQL and NoSQL databases for high-performance queries and data storage.
- Implement data quality checks and monitoring processes to ensure data accuracy and consistency.
- Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
- Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
- 2-4 years of experience in data engineering or a related field.

Technical Skills:
- Proficiency in Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
- Experience with Microsoft Fabric is a plus.
- Strong SQL skills and familiarity with data warehousing (DWH) concepts.
- Knowledge of data modeling, ETL processes, and data integration.
- Hands-on experience with ETL tools and frameworks like Apache Airflow and Talend.
- Familiarity with big data technologies such as Hadoop and Spark.
- Experience with cloud platforms like AWS, Azure, Google Cloud, and associated data services.
- Familiarity with data visualization tools like Power BI and programming languages such as Python, Java, or Scala.
- Experience with schema design and dimensional data modeling.

Analytical Skills:
- Strong problem-solving abilities and attention to detail.

Communication:
- Excellent verbal and written communication skills to explain technical concepts to non-technical stakeholders.

Education:
- Bachelor's degree in computer science, engineering, mathematics, or a related field. Advanced degrees or certifications are a plus.

Interested candidates can share their CV at sulabh.tailang@celebaltech.com.
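The ingest/transform/load pattern this role describes can be sketched in plain Python. This is a deliberately simplified stand-in for an ADF/Databricks pipeline, not the employer's actual code; the order schema and the required-amount quality rule are hypothetical:

```python
from datetime import date

# Raw records as they might arrive from a source system (hypothetical schema).
RAW_ROWS = [
    {"order_id": "1001", "amount": "250.00", "order_date": "2024-03-01"},
    {"order_id": "1002", "amount": "", "order_date": "2024-03-02"},  # missing amount
    {"order_id": "1003", "amount": "99.50", "order_date": "2024-03-02"},
]

def transform(rows):
    """Cast types and split rows into clean vs. rejected by a quality rule."""
    clean, rejected = [], []
    for row in rows:
        if not row["amount"]:            # data-quality check: amount is required
            rejected.append(row)
            continue
        clean.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return clean, rejected

def load(rows, target):
    """Append transformed rows to the target store (a list standing in for a table)."""
    target.extend(rows)
    return len(rows)

warehouse = []
clean, rejected = transform(RAW_ROWS)
loaded = load(clean, warehouse)
```

In a real pipeline the quality check and the load step would be separate, monitored activities; collapsing them here just makes the flow visible end to end.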

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad, Telangana, India

On-site

Apollo TeleHealth Services is looking for a Data Analyst to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Managing master data, including creation, updates, and deletion.
- Managing users and user roles.
- Providing quality assurance of imported data, working with quality assurance analysts if necessary.
- Commissioning and decommissioning of data sets.
- Processing confidential data and information according to guidelines.
- Helping develop reports and analysis.
- Managing and designing the reporting environment, including data sources, security, and metadata.
- Supporting the data warehouse in identifying and revising reporting requirements.
- Supporting initiatives for data integrity and normalization.
- Assessing, testing, and implementing new or upgraded software, and assisting with strategic decisions on new systems.
- Generating reports from single or multiple systems.
- Troubleshooting the reporting database environment and reports.
- Evaluating changes and updates to source production systems.
- Training end-users on new reports and dashboards.
- Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Panna, Madhya Pradesh

On-site

Role Overview: You will be responsible for Test Data Management, Data Governance, and Data Engineering with a focus on Informatica Data Masking tools. Your role will involve working with Informatica TDM, ILM, and Persistent Data Masking to ensure data security and compliance standards are met. Additionally, you will be involved in data sub-setting, synthetic data generation, and data archiving using Informatica tools. Your expertise in RDBMS, complex SQL queries, ETL processes, data warehousing concepts, and data migration testing will be essential for this role. Familiarity with cloud platforms and data security in cloud environments will be advantageous.

Key Responsibilities:
- Utilize Informatica Data Masking tools (TDM, ILM) for Test Data Management.
- Implement static and dynamic data masking across RDBMS, files, and applications.
- Perform data sub-setting, synthetic data generation, cloning, and archiving using Informatica tools.
- Work with Informatica Developer, Metadata Manager, and Model Repository Service.
- Write complex SQL queries for RDBMS (Oracle, SQL Server, DB2, PostgreSQL, MySQL).
- Ensure compliance with data privacy frameworks and standards.
- Understand ETL processes, data warehousing concepts, and data migration testing.
- Explore cloud platforms such as AWS, Azure, and GCP for data security in cloud environments.

Qualifications Required:
- 7+ years of experience in Test Data Management, Data Governance, or Data Engineering.
- 3+ years of hands-on experience with Informatica Data Masking tools (TDM, ILM).
- Strong understanding of Informatica TDM, ILM, and Persistent Data Masking.
- Proficiency in Informatica Data Engineering Integration (DEI), Informatica Data Quality (IDQ), and PowerCenter.
- Knowledge of RDBMS (Oracle, SQL Server, DB2, PostgreSQL, MySQL) and the ability to write complex SQL queries.
- Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.

Note: The company mentioned in the job description is Wipro, a modern digital transformation partner with a focus on reinvention and empowerment. They encourage constant evolution and offer a purpose-driven work environment that supports individual reinvention and career growth. Applications from people with disabilities are explicitly welcome.
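Static data masking of the kind this role covers can be illustrated with a small Python sketch. This uses plain hashlib rather than Informatica TDM, and the customer table and salt value are hypothetical; the point is that deterministic masking maps equal inputs to equal masks, which preserves referential integrity across masked tables:

```python
import hashlib

def mask_value(value: str, salt: str = "test-env") -> str:
    """Deterministically mask a PII value; equal inputs yield equal masks."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASK_" + digest[:10]

def mask_rows(rows, pii_columns):
    """Return copies of the rows with the listed PII columns masked."""
    return [
        {col: (mask_value(val) if col in pii_columns else val)
         for col, val in row.items()}
        for row in rows
    ]

customers = [
    {"id": 1, "name": "Asha Rao", "city": "Pune"},
    {"id": 2, "name": "Asha Rao", "city": "Jaipur"},
]
masked = mask_rows(customers, pii_columns={"name"})
```

A dedicated tool adds format-preserving masking, subsetting, and audit trails on top of this basic idea.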

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Role Overview: As a Senior Manager Product Tooling & Data (Data Focused) in the Group Investment Management (GIM) Team at London Stock Exchange Group (LSEG), your primary responsibility will be to interpret and utilize data science and analytics to support the team in predicting and managing programme execution risks. Your role will involve developing machine learning models and early warning indicators to identify and mitigate delivery risks across a multi-million-pound change portfolio. By transforming complex programme data into predictive insights, you will drive strategic investment decisions and contribute to fostering a culture of delivery excellence and transparency within the organization.

Key Responsibilities:
- Lead the design and implementation of machine learning models to predict programme delivery risks and financial overruns.
- Integrate insights into existing dashboards and visualizations to effectively communicate with senior leadership and ensure actionable and trackable risk metrics.
- Collaborate with senior stakeholders to understand data needs and translate them into technical solutions.
- Conduct data cleaning, preprocessing, and integration from multiple sources, including internal databases and APIs.
- Contribute to the development of Group-wide delivery framework and assurance standards.
- Stay updated with advancements in AI/ML and apply them to enhance risk prediction and decision-making.
- Support the annual investment planning process through data-driven analysis and scenario planning.

Qualifications Required:
- Proficiency in Python or R, with experience in libraries such as scikit-learn, TensorFlow, or similar.
- Experience in LLM prompt engineering for feature extraction from unstructured data.
- Strong SQL and data wrangling skills, including experience with ETL processes.
- Solid understanding of statistical analysis and hypothesis testing.
- Experience in building and deploying machine learning models in a business context.
- Ability to communicate technical insights to non-technical audiences using visualization tools like Tableau and matplotlib.
- Familiarity with fine-tuning small LLM models for automating topic extraction from unstructured data.
- Knowledge of cloud platforms, especially AWS Bedrock and the Snowflake data platform.
- Experience with enterprise programme management tools like Clarity, Asana, and JIRA.
- Familiarity with SAP ERP and Workday systems data.
- Experience in financial services or programme delivery environments.
- Strong communication and collaboration skills.
- Self-starter with the ability to manage multiple priorities in a fast-paced environment.
- Commitment to continuous learning and innovation.

[Note: Additional details about the company's values and purpose have been omitted from the Job Description as they are not directly related to the role.]
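As a rough illustration of the "early warning indicator" idea in this role, here is a minimal Python sketch. This is not LSEG's actual model; the programme fields and the 0.2 threshold are hypothetical. It flags a programme when budget burn runs well ahead of milestone progress:

```python
def risk_indicators(programme):
    """Compute simple early-warning indicators from programme status data."""
    budget_burn = programme["spend_to_date"] / programme["budget"]
    schedule_progress = programme["milestones_done"] / programme["milestones_total"]
    # Flag when money is burning much faster than milestones are delivered.
    burn_gap = budget_burn - schedule_progress
    return {
        "budget_burn": round(budget_burn, 2),
        "schedule_progress": round(schedule_progress, 2),
        "at_risk": burn_gap > 0.2,
    }

prog = {"budget": 1_000_000, "spend_to_date": 650_000,
        "milestones_done": 3, "milestones_total": 10}
flags = risk_indicators(prog)
```

A trained model would learn such thresholds from historical programme outcomes rather than hard-coding them; the rule form just shows what an indicator consumes and emits.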

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

Role Overview: As an Insurance Domain Business Analyst with experience in data projects at NTT DATA, you will be responsible for leading data migration projects within the organization. Your role will require a deep understanding of the insurance industry, data management principles, and hands-on experience in executing successful data migration initiatives.

Key Responsibilities:
- Policy Administration:
  - Handle quoting, rating, underwriting, policy issuance, endorsements, renewals, and cancellations.
  - Exposure to tools: Guidewire PolicyCenter, Duck Creek.
  - Work on policy admin data components: product models, customer attributes, risk factors.
- Billing Management:
  - Manage the entire billing and collections cycle.
  - Integrations worked on: ERP systems, financial reporting tools.
- Claims Management:
  - Support intake, triage, segmentation, loss assessment, and settlement.
  - Work on technologies: NLP, OCR, ML for fraud detection and predictive analytics.
  - Tools used: ClaimCenter, RPA bots, virtual chatbots.
- Reinsurance:
  - Strong functional exposure desired.
- Data & Analytics:
  - Support reporting, compliance, and operational insights, with a focus on troubleshooting data lineage.
  - Work on components: data lakes, warehouses, marts, cubes.
  - Handle use cases such as actuarial triangles, regulatory dashboards, and KPI tracking.
- Integration & Ecosystem:
  - Connect core systems with third-party services and internal platforms.
  - Work with APIs for ADAS, OEMs, and external data sources.
  - Exposure to platforms: Unqork, OutSystems, SmartComm, ServiceNow.

Qualifications:
- Bachelor's degree in Information Technology, Computer Science, or a related field; advanced degree preferred.
- Minimum of 5-10 years of experience in the insurance domain with a focus on data migration projects.
- Strong knowledge of insurance products, underwriting, claims, and regulatory requirements.
- Proficiency in data migration tools and techniques, including experience in ETL processes.
- Excellent analytical and problem-solving skills with a keen attention to detail.
- Strong communication and presentation skills to interact with various stakeholders.

Company Details: NTT DATA is a $30 billion trusted global innovator of business and technology services. Serving 75% of the Fortune Global 100, NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. With diverse experts in more than 50 countries and a robust partner ecosystem, NTT DATA offers services including business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. As one of the leading providers of digital and AI infrastructure globally, NTT DATA, a part of NTT Group, invests over $3.6 billion annually in R&D to support organizations and society in confidently moving into the digital future. Visit us at us.nttdata.com.

Posted 4 days ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: Senior ODI Developer (OCI PaaS/IaaS Expertise)

Role Overview: We are seeking a highly skilled Senior ODI Developer with strong hands-on experience in SQL, PL/SQL, and Oracle Data Integrator (ODI) projects, particularly on OCI (Oracle Cloud Infrastructure) PaaS or IaaS platforms. The ideal candidate will design, implement, and optimize ETL processes, leveraging cloud-based solutions to meet evolving business needs. Prior experience in banking or insurance projects is a significant advantage.

Skills and Qualifications:

Mandatory Skills:
- Strong hands-on experience with Oracle Data Integrator (ODI) development and administration.
- Proficiency in SQL and PL/SQL for complex data manipulation and query optimization.
- Experience deploying and managing ODI solutions on OCI PaaS/IaaS environments.
- Deep understanding of ETL processes, data warehousing concepts, and cloud data integration.

Preferred Experience:
- Hands-on experience in banking or insurance domain projects, with knowledge of domain-specific data structures.
- Familiarity with OCI services like Autonomous Database, Object Storage, Compute, and Networking.
- Experience in integrating on-premise and cloud-based data sources.

Other Skills:
- Strong problem-solving and debugging skills.
- Excellent communication and teamwork abilities.
- Knowledge of Agile methodologies and cloud-based DevOps practices.

Education and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in ODI development, with at least 2 years of experience in OCI-based projects.
- Domain experience in banking or insurance is an added advantage.

Key Responsibilities:
- Design, develop, and deploy ETL processes using Oracle Data Integrator (ODI) on OCI PaaS/IaaS.
- Configure and manage ODI instances on OCI, ensuring optimal performance and scalability.
- Develop and optimize complex SQL and PL/SQL scripts for data extraction, transformation, and loading.
- Implement data integration solutions, connecting diverse data sources like cloud databases, on-premise systems, APIs, and flat files.
- Monitor and troubleshoot ODI jobs running on OCI to ensure seamless data flow and resolve any issues promptly.
- Collaborate with data architects and business analysts to understand integration requirements and deliver robust solutions.
- Conduct performance tuning of ETL processes, SQL queries, and PL/SQL procedures.
- Prepare and maintain detailed technical documentation for developed solutions.
- Adhere to data security and compliance standards, particularly in cloud-based environments.
- Provide guidance and best practices for ODI and OCI-based data integration projects.

Career Level - IC3
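One recurring ETL step in roles like this, the incremental load, can be sketched in plain Python. This toy upsert/merge by key stands in for what an ODI mapping or a SQL MERGE would do; the record fields are hypothetical:

```python
def upsert(target, incoming, key="id"):
    """Merge incoming records into the target by key: update matching
    rows, insert new ones (the core of an incremental-load pattern)."""
    by_key = {row[key]: row for row in target}
    for row in incoming:
        # Existing fields are kept; fields present in the incoming row win.
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return list(by_key.values())

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
incoming = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
merged = upsert(target, incoming)
```

In a real warehouse this runs set-based in the database, typically driven by a change-capture timestamp, rather than row by row in application code.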

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: DataStage Techno Functional Lead

Role Overview: We are seeking a highly skilled DataStage Techno Functional Lead with strong hands-on experience in SQL, PL/SQL, and DataStage projects. The ideal candidate will design, implement, and optimize ETL processes to meet evolving business needs. Prior experience in FCCM or other AML projects is a significant advantage.

Skills and Qualifications:

Mandatory Skills:
- Strong hands-on experience with DataStage development and administration.
- Proficiency in SQL and PL/SQL for complex data manipulation and query optimization.
- Experience deploying and managing DataStage solutions.
- Deep understanding of ETL processes, data warehousing concepts, and data integration.
- Domain experience in FCCM or other AML-based projects.

Preferred Experience:
- Hands-on experience in banking domain projects, with knowledge of domain-specific data structures.
- Experience in integrating on-premise data sources.

Other Skills:
- Strong problem-solving and debugging skills.
- Excellent communication and teamwork abilities.
- Knowledge of Agile methodologies and DevOps practices.

Education and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6 to 10 years of experience in DataStage development-based projects.

Key Responsibilities:
- Design, develop, and deploy ETL processes using DataStage.
- Develop and manage DataStage jobs, sequences, routines, and parameters.
- Develop and optimize complex SQL and PL/SQL scripts for data extraction, transformation, and loading.
- Implement data integration solutions, connecting diverse data sources such as on-premise systems, APIs, and flat files.
- Monitor and troubleshoot DataStage jobs to ensure seamless data flow and resolve any issues promptly.
- Collaborate with data architects and business analysts to understand integration requirements and deliver robust solutions.
- Conduct performance tuning of ETL processes, SQL queries, and PL/SQL procedures.
- Prepare and maintain detailed technical documentation for developed solutions.
- Adhere to data security and compliance standards.
- Provide guidance and best practices for DataStage projects.

Career Level - IC3

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Vadodara, Gujarat

On-site

At Rearc, we are dedicated to empowering engineers like you to create exceptional products and experiences by providing you with the best tools possible. We value individuals who think freely, challenge the norm, and embrace alternative problem-solving approaches. If you are driven by the desire to make a difference and solve complex problems, you'll feel right at home with us.

As a Data Engineer at Rearc, you will be an integral part of our data engineering team, contributing to the optimization of data workflows for efficiency, scalability, and reliability. Your role will involve designing and implementing robust data solutions in collaboration with cross-functional teams to meet business objectives and uphold data management best practices.

**Key Responsibilities:**
- **Collaborate with Colleagues:** Work closely with team members to understand customers' data requirements and contribute to developing tailored data solutions.
- **Apply DataOps Principles:** Utilize modern data engineering tools like Apache Airflow and Apache Spark to create scalable data pipelines and architectures.
- **Support Data Engineering Projects:** Assist in managing and executing data engineering projects, providing technical support and ensuring project success.
- **Promote Knowledge Sharing:** Contribute to the knowledge base through technical blogs and articles, advocating for best practices in data engineering and fostering a culture of continuous learning and innovation.

**Qualifications Required:**
- 2+ years of experience in data engineering, data architecture, or related fields.
- Proven track record in contributing to complex data engineering projects and implementing scalable data solutions.
- Hands-on experience with ETL processes, data warehousing, and data modeling tools.
- Understanding of data integration tools and best practices.
- Familiarity with cloud-based data services and technologies such as AWS Redshift, Azure Synapse Analytics, and Google BigQuery.
- Strong analytical skills for data-driven decision-making.
- Proficiency in implementing and optimizing data pipelines using modern tools and frameworks.
- Excellent communication and interpersonal skills for effective collaboration with teams and stakeholders.

Your journey at Rearc will begin with an immersive learning experience to help you get acquainted with our processes. In the initial months, you will have the opportunity to explore various tools and technologies as you find your place within our team.

Posted 4 days ago

Apply

2.0 - 11.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Database Developer Lead at Droisys, you will be responsible for designing, implementing, and maintaining database solutions to support the organization's data needs. Your expertise in database development, SQL programming, and data modeling will be crucial in this role.

**Key Responsibilities:**
- Design, develop, and maintain relational databases for various applications and systems.
- Perform database tuning and optimization for optimal performance and scalability.
- Develop SQL queries, stored procedures, and functions to support application development.
- Collaborate with cross-functional teams to design database solutions meeting business needs.
- Conduct data modeling for logical and physical database structures.
- Implement and maintain database security measures to protect sensitive data.
- Troubleshoot and resolve database issues promptly.
- Document database design, processes, and procedures.

**Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2-11 years of experience in database development.
- Proficiency in SQL programming and experience with relational database management systems (e.g., MySQL, PostgreSQL, SQL Server).
- Strong understanding of database design principles and data modeling techniques.
- Experience with database tuning, optimization, and performance monitoring.
- Knowledge of database security best practices.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and collaboratively in a team environment.
- Strong communication and interpersonal skills.

**Preferred Qualifications:**
- Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Familiarity with ETL processes and tools.
- Knowledge of cloud-based database services (e.g., AWS RDS, Azure SQL Database).
- Certification in database administration or development is a plus.
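The query development and tuning work described above can be sketched with SQLite as a stand-in RDBMS (the `orders` table and index are hypothetical, and a production system would use MySQL/PostgreSQL/SQL Server as listed):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Schema design: a simple orders table.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 200.0)],
)

# Tuning: an index on the filter column lets lookups avoid a full table scan.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# Parameterized query (placeholders guard against SQL injection).
cur.execute("SELECT COUNT(*), SUM(amount) FROM orders WHERE customer = ?", ("acme",))
count, total = cur.fetchone()
conn.close()
```

The same two habits scale up directly: index the columns you filter or join on, and always bind parameters instead of concatenating SQL strings.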

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and ETL processes using AWS and Snowflake. Your key responsibilities will include:
- Implementing data transformation workflows using DBT (Data Build Tool).
- Writing efficient, reusable, and reliable code in Python.
- Optimizing and tuning data solutions for performance and scalability.
- Collaborating with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensuring data quality and integrity through rigorous testing and validation.
- Staying updated with the latest industry trends and technologies in data engineering.

Qualifications required for this role include:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in AWS and Snowflake.
- Expertise in DBT and Python programming.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
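The "data quality and integrity through rigorous testing" responsibility can be sketched as a small Python validation step, roughly what a dbt test asserts in SQL. The required columns and uniqueness rule here are hypothetical examples:

```python
def validate(rows, required, unique_key):
    """Run basic data-quality checks: required fields present, key unique."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                errors.append(f"row {i}: missing {col}")
        key = row.get(unique_key)
        if key in seen:
            errors.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return errors

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 1, "email": "c@example.com"},
]
problems = validate(rows, required=["id", "email"], unique_key="id")
```

In dbt the same checks would be declared as `not_null` and `unique` tests on the model's schema rather than written by hand.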

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Role Overview: You will be working as an Informatica MDM professional for one of our clients in India, focusing on expanding their teams. The role is based in Bengaluru with a hybrid work model under a C2H employment type. Your main responsibility will be to demonstrate proficiency in Informatica MDM, along with skills in data integration tools, ETL processes, data modeling, database design principles, and performance tuning and optimization of data processes. Familiarity with cloud-based data solutions and architectures will be beneficial.

Key Responsibilities:
- Showcase proficiency in Informatica MDM.
- Good-to-have skills: experience with data integration tools and ETL processes.
- Demonstrate a strong understanding of data modeling and database design principles.
- Implement performance tuning and optimization of data processes.
- Apply familiarity with cloud-based data solutions and architectures.

Qualifications Required:
- Minimum of 5 years of experience in Informatica MDM.
- The position is based in Bengaluru.
- 15 years of full-time education is required.

(Note: Applicants for employment in India should possess work authorization which does not require sponsorship by the employer for a visa.)

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description

Key Responsibilities:
- Analyse Data Requirements in Sales Cycles: Conduct high-level, rapid assessments or in-depth evaluations for proofs of concept (POCs) to identify client data needs early. Serve as the team's data expert, providing actionable advice on feasibility, integration, and optimization, including identifying patterns such as customer churn through data modelling to improve targeting strategies.
- Develop Implementation Specifications: Collaborate with the technical product manager to define detailed data-related specs, ensuring seamless handoff to the implementation team, with a focus on designing and maintaining ETL pipelines that integrate data from various systems like CRM, billing, and operations.
- Conduct Data-Focused Testing and QA: Validate implementations through rigorous testing, ensuring data accuracy, integrity, and performance meet enterprise standards, including processing and cleaning large datasets to improve reporting accuracy and system responsiveness.
- Drive Proactive Data Innovation: Monitor and innovate on data initiatives across R&D teams, identifying opportunities for enhancement. Prepare concise summaries and recommendations for the Director of Enterprise Solution Engineering, such as developing automated reporting solutions that reduce manual effort by significant margins.
- Data Visualization and Reporting: Create dashboards and reports to support sales demos, client presentations, and internal decision-making, including interactive BI dashboards for tracking trends like customer behaviour, usage, and service levels using tools like Tableau.
- Client Collaboration and Training: Work directly with clients to refine data requirements and train internal teams on data tools, best practices, and compliance (e.g., GDPR, data security), while engaging stakeholders to define KPIs and deliver tailored visual analytics solutions.
- Process Optimization: Identify inefficiencies in data workflows, recommend automation or integration strategies, and track metrics to measure solution impact post-implementation, such as building centralized dashboards that streamline operational efficiency and resource utilization.
- Cross-Functional Liaison: Act as a bridge between sales, engineering, and product teams to align on data strategy, including forecasting trends based on industry data benchmarks, and supporting root cause analysis of operational issues with preventive measures.

Required Skills and Qualifications:
- Minimum of 3 years of professional experience in data analysis or related fields, with at least 1 year in designing ETL pipelines, developing BI dashboards, or delivering data-driven insights in enterprise settings.
- Bachelor's degree in a relevant field such as Computer Science, Data Science, Statistics, or Engineering required; Master's degree in Data Analytics or a related discipline preferred.
- Proficiency in data analysis tools such as SQL, Python (including libraries like Pandas, NumPy, Matplotlib, and Seaborn), ETL processes, and visualization platforms (e.g., Tableau, Jupyter Notebook).
- Strong understanding of data governance, security, and compliance in enterprise environments, with experience in data pre-processing, feature engineering, and model evaluation.
- Experience in media intelligence or similar data-intensive industries preferred, including healthcare, energy, or enterprise technology domains, with hands-on work in predictive modelling, time series forecasting, and machine learning-driven dashboards.
- Excellent communication skills for collaborating with technical and non-technical stakeholders, including stakeholder engagement and delivering data-driven decision-making insights.
- Ability to thrive in a fast-paced, innovative setting with a proactive mindset, demonstrated through problem-solving in application support, root cause analysis, and developing automated solutions.
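The churn analysis mentioned in the responsibilities can be boiled down to a minimal Python sketch. The usage baseline/recent fields, the 0.5 threshold, and the customer IDs are hypothetical; a real model would learn from many more signals:

```python
def monthly_churn(active_start, churned):
    """Churn rate = customers lost during the period / customers at period start."""
    if active_start == 0:
        return 0.0
    return churned / active_start

def flag_churn_risk(usage_by_customer, threshold=0.5):
    """Flag customers whose recent usage fell below a fraction of their baseline."""
    return {
        cust: recent < baseline * threshold
        for cust, (baseline, recent) in usage_by_customer.items()
    }

usage = {"c1": (100, 30), "c2": (80, 75)}  # (baseline, recent) activity counts
risk = flag_churn_risk(usage)
rate = monthly_churn(active_start=200, churned=12)
```

Even this simple drop-from-baseline rule is often the first early-warning feature fed into a proper churn model.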
This role will report directly to the Director of Enterprise Solution Engineering and contribute to Meltwater's mission of delivering cutting-edge data solutions.

What We Offer:
- Flexible paid time off options for enhanced work-life balance.
- Comprehensive health insurance tailored for you.
- Employee assistance programs covering mental health, legal, financial, wellness, and behavioural areas to ensure your overall well-being.
- Complimentary Calm App subscription for you and your loved ones, because mental wellness matters.
- An energetic work environment with a hybrid work style, providing the balance you need.
- A family leave program which grows with your tenure at Meltwater.
- An inclusive community and ongoing professional development opportunities to elevate your career.

Our Story

At Meltwater, we believe that when you have the right people in the right environment, great things happen. Our best-in-class technology empowers our 27,000 customers around the world to make better business decisions through data. But we can't do that without our global team of developers, innovators, problem-solvers, and high-performers who embrace challenges and find new solutions for our customers. Our award-winning global culture drives everything we do and creates an environment where our employees can make an impact, learn every day, feel a sense of belonging, and celebrate each other's successes along the way. We are innovators at the core who see the potential in people, ideas and technologies. Together, we challenge ourselves to go big, be bold, and build best-in-class solutions for our customers. We're proud of our diverse team of 2,200+ employees in 50 locations across 25 countries around the world. No matter where you are, you'll work with people who care about your success and get the support you need to unlock new heights in your career. We are Meltwater. We love working here, and we think you will too.

"Inspired by innovation, powered by people."
Equal Employment Opportunity Statement

Meltwater is an Equal Opportunity Employer and Prohibits Discrimination and Harassment of Any Kind: At Meltwater, we are dedicated to fostering an inclusive and diverse workplace where every employee feels valued, respected, and empowered. We are committed to the principle of equal employment opportunity and strive to provide a work environment that is free from discrimination and harassment. All employment decisions at Meltwater are made based on business needs, job requirements, and individual qualifications, without regard to race, colour, religion or belief, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, veteran status, or any other status protected by the applicable laws and regulations. Meltwater does not tolerate discrimination or harassment of any kind, and we actively promote a culture of respect, fairness, and inclusivity. We encourage applicants of all backgrounds, experiences, and abilities to apply and join us in our mission to drive innovation and make a positive impact in the world.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Senior Sales Analytics Specialist at NTT DATA, you will be an advanced subject matter expert responsible for driving the success of sales operations through comprehensive data analysis, valuable insights, and strategic decision support. You will collaborate with cross-functional teams to provide data-driven support for business planning and strategic decision-making by leveraging a deep understanding of the business context.

Key Responsibilities:
- Drive tactical and strategic projects with cross-functional virtual teams to achieve specific business objectives.
- Analyze complex business problems and issues using internal and external data to provide insights to decision-makers.
- Create documented specifications for reports and analysis based on business needs and required or available data elements.
- Define, develop, enhance, and track metrics and dashboard requirements to deliver results and provide insight and recommendations on trends.
- Validate data using advanced data analysis and tools to ensure analytics are valid, meaningful, and actionable.
- Provide strategic decision support to help the team answer strategic questions and make insightful data-driven business decisions.
- Create relevant reports and present trends to convey actionable insights to stakeholders.
- Provide technical advice, consultation, and knowledge to others within the relevant teams.
- Perform any other related tasks as required.

To thrive in this role, you need to have:
- An advanced understanding of data analysis techniques and the ability to uncover strategic insights from data.
- Advanced collaboration skills to work effectively with cross-functional teams and senior management.
- Excellent communication and presentation skills to convey complex data findings in a clear and actionable manner to non-technical stakeholders.
- Proficiency in data analysis tools including advanced Excel, Power BI, and at least one relevant coding language (e.g., DAX, R, Python).
- An understanding of Structured Query Language (SQL) for managing and querying relational databases.
- Knowledge of techniques for transforming and structuring data for analysis and ETL processes.
- An understanding of data security and privacy best practices.

Academic qualifications and certifications:
- Bachelor's degree or equivalent in Data Science or a related field.
- Relevant sales analytics certifications are desirable.

Required experience:
- Advanced demonstrated experience in a sales or marketing function as a data analyst.
- Proven track record in using Power BI, statistical and quantitative analysis techniques, data visualizations, and data analysis.
- Experience in creating and optimizing reports and dashboards for strategic decision support.
- Experience in providing data-driven support for business planning and strategic decision-making.

About NTT DATA: NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With over $3.6 billion invested in R&D annually, NTT DATA serves 75% of the Fortune Global 100 and has diverse experts in more than 50 countries. As a Global Top Employer, NTT DATA offers services including business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is dedicated to helping organizations and society move confidently and sustainably into the digital future.
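The SQL skills this listing asks for boil down to aggregating and slicing relational data for metrics. As a rough, hypothetical sketch of that kind of query, the snippet below totals revenue by region using Python's built-in sqlite3 module; the table name, columns, and sample figures are invented for illustration, not taken from any real system.

```python
# Hypothetical sales-KPI query: aggregate revenue per region with GROUP BY.
# Schema and data are illustrative stand-ins only.
import sqlite3

def revenue_by_region(rows):
    """rows: iterable of (region, amount) tuples; returns {region: total}."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    )
    result = dict(cur.fetchall())
    conn.close()
    return result

sample = [("APAC", 100.0), ("EMEA", 250.0), ("APAC", 50.0)]
assert revenue_by_region(sample) == {"APAC": 150.0, "EMEA": 250.0}
```

In practice the same GROUP BY pattern carries over to warehouse SQL or to a Power BI data source query; only the dialect changes.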

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

Role Overview: At Toyota Connected, you will have the opportunity to change the way the world works, transform the automotive industry, and positively impact others on a global scale. In a collaborative and fast-paced environment, you will focus on continual improvement and work iteratively to deliver exceptional value through connected products and services that amaze and delight customers worldwide. Join us in re-imagining what mobility can be today and for years to come!

Key Responsibilities:
- Design, develop, and maintain automation test frameworks and scripts specifically for API-level testing.
- Validate and verify complex Machine Learning applications, ensuring accurate performance of models and data pipelines in production environments.
- Conduct extensive testing on large-scale data engineering pipelines, including ETL processes, data warehousing solutions, and big data processing frameworks.
- Identify, document, and track software defects, collaborating with developers and product teams for resolution.
- Execute load and performance testing on APIs and data pipelines to ensure scalability and efficiency.
- Collaborate with data engineers, ML engineers, software developers, and product teams to define comprehensive test strategies.
- Participate actively in agile ceremonies, contributing to sprint planning, reviews, and retrospectives.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, or related fields.
- 3+ years of hands-on experience in automation testing, particularly focused on API testing using tools like Postman, REST Assured, or similar.
- Experience with automation frameworks such as Cucumber/Karate for API automation and Selenium/Cypress for web automation.
- Demonstrable experience in testing Machine Learning applications, including model validation, accuracy assessments, and production-readiness.
- Proven expertise in testing large-scale data engineering pipelines involving technologies like Apache Spark, Hadoop, AWS EMR, Kafka, or similar.
- Strong scripting/programming skills in Python, Java, or JavaScript.
- Familiarity with containerization and CI/CD tools (Docker, Kubernetes, Jenkins, GitLab CI/CD).
- Excellent analytical and problem-solving abilities with strong attention to detail.
- Effective communication skills and the ability to clearly document test cases, defects, and technical findings.

Additional Details: Toyota Connected values its employees and offers top-of-the-line compensation, yearly gym membership reimbursement, free catered lunches, and a flexible dress code policy. Employees are entrusted to manage their own time and workload. The company provides an opportunity to work on products that enhance the safety and convenience of millions of customers, along with a cool office space and other benefits. For more information about Toyota Connected, you can visit their Glassdoor page at [TOYOTA Connect Glassdoor Page](https://www.glassdoor.co.in/Reviews/TOYOTA-Connect).
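The API-level testing this role centers on usually includes asserting that a response payload matches an expected schema. Below is a minimal, hypothetical sketch of such a check in plain Python; the field names and payloads are invented, and a real suite would obtain the payload via an HTTP client such as Postman, REST Assured, or requests rather than a literal dict.

```python
# Hypothetical API response-schema check of the kind an automation suite
# might run after fetching a JSON payload. Schema and payloads are invented.

def validate_response(payload, required_fields):
    """Return a list of problems found in an API JSON payload (empty = pass)."""
    problems = []
    for field, expected_type in required_fields.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            problems.append(f"wrong type for {field}: {type(payload[field]).__name__}")
    return problems

schema = {"vehicle_id": str, "latitude": float, "longitude": float}
ok = {"vehicle_id": "V1", "latitude": 35.6, "longitude": 139.7}
bad = {"vehicle_id": "V2", "latitude": "35.6"}

assert validate_response(ok, schema) == []
assert validate_response(bad, schema) == [
    "wrong type for latitude: str",
    "missing field: longitude",
]
```

Checks like this are typically wrapped as pytest cases so each schema violation surfaces as a distinct defect to document and track.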

Posted 4 days ago

Apply

7.0 - 12.0 years

6 - 11 Lacs

bengaluru

Work from Office

Key Responsibilities:
- Lead the design, development, and deployment of end-to-end data solutions on the Azure Databricks platform.
- Work with data scientists, data engineers, and business analysts to design and implement data pipelines and machine learning models.
- Develop efficient, scalable, and high-performance data processing workflows and analytics solutions using Databricks, Apache Spark, and Azure Synapse.
- Manage and optimize Databricks clusters and data pipelines.
- Collaborate with cross-functional teams to gather requirements and deliver optimal solutions.
- Design and implement ETL processes using Databricks Notebooks, Azure Data Factory, and other Azure services.
- Ensure high availability, performance, and security of cloud-based data solutions.
- Implement best practices for data quality, security, and governance.
- Monitor system performance and troubleshoot issues related to Databricks clusters or data pipelines.
- Stay up-to-date with the latest advancements in cloud computing, big data, and machine learning technologies.
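The ETL processes this listing describes follow the same extract-transform-load shape regardless of platform. As a hedged, pure-Python sketch of that shape (on Databricks this would be PySpark reads, DataFrame transforms, and Delta writes instead), the snippet below ingests a small CSV string, drops invalid rows, and "loads" into an in-memory table; all data is illustrative.

```python
# Minimal pure-Python sketch of an extract-transform-load pipeline.
# On Databricks the stages map to spark.read / DataFrame ops / table writes;
# the RAW data here is an invented stand-in.
import csv
import io

RAW = "order_id,amount\n1,100\n2,\n3,250\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows with a missing amount and cast the rest to numeric types.
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def load(rows):
    # Stand-in for writing to a Delta table or warehouse.
    return {r["order_id"]: r["amount"] for r in rows}

table = load(transform(extract(RAW)))
assert table == {1: 100.0, 3: 250.0}
```

Keeping the three stages as separate functions mirrors how Databricks notebooks are commonly organized, which makes each stage independently testable.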

Posted 4 days ago

Apply

7.0 - 10.0 years

6 - 10 Lacs

bengaluru

Work from Office

JLL's mission is to enable our teams to deliver future-proof, consistent, scalable, customer-centric services and experiences. At JLL@Google, we value what makes you unique, and we're committed to giving you the opportunity, knowledge, and tools to own your success. Explore opportunities to advance your career from within, whether you're looking to move up, broaden your experience, or deepen your expertise.

As the business continues to grow, Snr. Tech & BI Lead roles will help underpin the business transformation, supporting steering the technical ship from a data, insights, and reporting perspective. The role is heavily business facing and highly regarded internally. As a Snr. Tech & BI Lead, your insights will be valued and you'll work closely with senior executive stakeholders to help drive decision-making.

We are looking for an engineer working in the Business Intelligence area with:
- Excellent planning and organizational skills to prioritize work and meet deadlines.
- Hands-on experience in Business Intelligence tools and databases.
- A problem-solver mindset with strong analytical skills and experience in solutions delivery for both internal and external clients.
- Experience in using visualization software like Tableau, Power BI, or Google Looker for development.
- Experience with, or willingness to learn, data analytics in cloud platforms like GCP and Azure.

If this describes you, do not hesitate to apply. In this role you will focus on designing and delivering eye-catching dashboard solutions through your technical and business expertise.

Responsibilities:

Data Management
- Design and develop visually appealing and informative dashboards using Tableau or Power BI. Experience with Google Plx, Google's visualization platform, is a plus.
- Collaborate closely with stakeholders to understand their business needs and translate them into actionable data insights.
- Leverage SQL, Databricks, Azure functionalities, and other relevant tools to extract, transform, and load data for analysis.
- Identify trends, patterns, and opportunities within data to support strategic decision-making.
- Provide guidance to clients on leveraging data to improve their operations and achieve their goals.
- Collaborate with the Global COE TaDS team for dashboard deployment as well as ongoing support.

Project Management & Organizational Skills
- Excellent planning and organizational skills to prioritize work and meet tight deadlines.
- Flexible: able to adapt to rapidly changing situations.
- Oversee the dashboard development schedule against milestones.
- Manage any scope changes.
- Communicate any risks identified.

Technology Applications
- Assist users with technology platforms within JLL Technologies (JLLT) and third-party vendors.
- Assist automation and optimization efforts, utilizing app scripts, to enhance productivity and reduce manual effort.
- Collaborate closely with internal stakeholders, including senior leaders and cross-functional teams, to gather requirements and provide technology-related solutions.
- Collaborate with the Global COE TaDS team for application deployment as well as ongoing support.

Problem Solving & Strategic Thinking
- Resourceful: ability to deal with highly ambiguous circumstances in a rapidly changing environment.
- Capacity to solve complex problems effectively.
- Analytical, with a proven ability to solve problems using a quantitative approach.
- Proven ability to employ holistic approaches and look at long-term solutions.
- Open to new ideas and willing to challenge the status quo.

Client Focus
- Proactively develop and manage client partner relationships.
- Work closely with client stakeholders to translate their Objectives and Key Results (OKRs) into programs which deliver the desired results.
- Deliver an exceptional quality of service, as reflected by client feedback.

Requirements:
- 7+ years of hands-on experience with Tableau, Looker, or Power BI.
- Strong analytical skills and problem-solving abilities.
- Knowledge of SQL, Databricks, Alteryx, Azure Data Factory, or similar technologies.
- Knowledge of data modeling, transformation, and ETL processes.
- Understanding of data warehousing and cloud platforms (Azure, GCP, AWS).
- Excellent communication skills, both written and verbal.
- Ability to work collaboratively with stakeholders from diverse backgrounds.
- A passion for learning and staying up-to-date with the latest trends in data analytics.
- Fluency in English is essential.

What you can expect from us: We succeed together and believe the best inspire the best, so we invest in supporting each other, learning together, and celebrating our success. Our Total Rewards program reflects our commitment to helping you achieve your career ambitions, recognizing your contributions, investing in your well-being, and providing competitive benefits and pay.

Location: On-site, Bengaluru, KA
Scheduled Weekly Hours: 40

Posted 4 days ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

lucknow

Remote

Title: Data Engineer (100% Work from Home / Remote)
Work Timings: 1:30 PM – 10:30 PM
Full-time Role

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools that enable the organization to collect, store, and analyze large volumes of data. You will work closely with data scientists, analysts, and other engineers to ensure that our data systems are reliable, efficient, and scalable.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable and reliable data pipelines to process large datasets from various sources.
- Data Modeling: Collaborate with data scientists and analysts to design and implement data models that support their analytical needs.
- ETL Processes: Build and optimize Extract, Transform, Load (ETL) processes to ensure data is accurately ingested and transformed.
- Data Quality: Implement data quality checks and validation processes to ensure the integrity of the data.
- Data Integration: Integrate data from multiple sources, including APIs, flat files, and databases, into a unified data environment.
- Performance Tuning: Monitor and optimize data processing performance to meet the demands of the organization.
- Documentation: Create and maintain documentation for data pipelines, data models, and other related processes.
- Collaboration: Work closely with cross-functional teams to understand data needs and provide solutions that support business goals.
- Compliance and Security: Ensure data handling practices comply with relevant data privacy regulations and security standards.

Qualifications:

Experience: 3+ years of experience in the data engineering domain.

Technical Skills:
- Experience with ETL processes (data extraction, data cleansing, data transformation, data monitoring, data modeling).
- Experience with Microsoft Azure (Azure Data Factory, Azure Databricks, Azure Storage, Azure SQL Database, Azure Synapse).
- Familiarity with Big Data systems (batch and stream processing, Spark, Hive).
- Experience with two or more coding languages (including Python with PySpark).
- Experience with SQL (Spark SQL, T-SQL).
- Familiarity with code versioning (Git, Azure DevOps).
- Experience with various data formats (e.g., CSV, JSON, Parquet, Delta Lake).
- Knowledge of distributed systems concepts and principles (consistency, availability, safety, durability, reliability, fault tolerance).

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced, dynamic environment.
- Eager to learn new things and passionate about technology.
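The "data quality checks and validation processes" the listing mentions usually amount to rule-based scans over each batch: null checks, uniqueness checks, and range checks. The sketch below is a hypothetical, minimal version of such a scan; the rule names, fields, and records are invented for illustration.

```python
# Hypothetical batch data-quality scan: null, uniqueness, and range rules.
# Field names and thresholds are illustrative stand-ins.

def run_quality_checks(records):
    """Return the list of failed rule names for a batch of dict records."""
    ids = [r.get("id") for r in records]
    failures = []
    if any(i is None for i in ids):
        failures.append("null id")
    if len(set(ids)) != len(ids):
        failures.append("duplicate id")
    if any(r.get("amount", 0) < 0 for r in records):
        failures.append("negative amount")
    return failures

good = [{"id": 1, "amount": 10}, {"id": 2, "amount": 0}]
bad = [{"id": 1, "amount": -5}, {"id": 1, "amount": 3}]

assert run_quality_checks(good) == []
assert run_quality_checks(bad) == ["duplicate id", "negative amount"]
```

In a production pipeline the same rules would typically run as a gate between the transform and load stages, failing the batch (or quarantining rows) when any rule trips.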

Posted 4 days ago

Apply

6.0 - 8.0 years

25 - 32 Lacs

bengaluru

Work from Office

MofoHPTech Dynamics is hiring Sr Power BI/Data Engineer | Bengaluru | 6-8 years Experience Apply directly via our ATS link: https://www.prolegion.com/marketplace/e40fd159-3ce3-4b0f-95be-69baf0cdaea8/sr-power-bi-data-engineer?utm_source=naukri

Posted 5 days ago

Apply

6.0 - 8.0 years

24 - 32 Lacs

bengaluru, karnataka, india

Remote

Job Title: Sr. Power BI/Data Engineer
Location: Hybrid, Bengaluru, Karnataka, India
Client Name: MofoHPTech Dynamics

About the Role: We are seeking an experienced Sr. Power BI/Data Engineer to design, implement, and optimize robust data models, pipelines, and reporting solutions. The ideal candidate will bring deep expertise in data architecture, data integration, and BI tools, with strong problem-solving skills and the ability to translate business needs into scalable data strategies.

Key Responsibilities:
- Design, develop, and implement scalable data models and pipelines to support business objectives.
- Collaborate with cross-functional teams to gather requirements and deliver actionable data solutions.
- Manage data integration and ETL processes, ensuring accuracy, performance, and reliability.
- Oversee data warehousing solutions, optimizing for efficiency and availability.
- Develop and maintain Power BI dashboards and reports to deliver clear insights to stakeholders.
- Implement data governance and security measures to protect sensitive information.
- Monitor data health, troubleshoot issues, and maintain high levels of data integrity.
- Mentor junior data engineers and promote best practices in data engineering and BI reporting.
- Stay up to date with emerging data engineering tools, technologies, and trends.

Required Qualifications:
- Proven experience as a Data Engineer with expertise in Power BI.
- Strong knowledge of data architecture, ETL design, and data management principles.
- Hands-on experience in data warehousing solutions and database management systems.
- Ability to translate technical concepts into actionable business insights.
- Strong problem-solving, debugging, and troubleshooting skills.
- Excellent communication skills with the ability to work with both technical and non-technical stakeholders.
- Experience working in a hybrid setup with onsite and remote teams.

Preferred Skills:
- Expertise in SQL, DAX, and data modeling in Power BI.
- Familiarity with Azure Data Factory, Synapse, or other cloud-based data services.
- Experience with Python/Scala/Java for data processing.
- Knowledge of data governance frameworks and compliance standards.
- Leadership experience in mentoring and guiding junior engineers.

Note:
1. Applications will only be accepted through the ATS link provided above. Profiles shared through other means will not be considered.
2. Prolegion does not charge any fee from candidates at any stage. If anyone approaches you for money in exchange for this opportunity, treat it as fraud and report it to us immediately.
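The data modeling this role calls for typically means a star schema: a fact table of transactions joined to dimension tables, over which Power BI measures aggregate. As a rough illustration only (tables, IDs, and amounts are invented), the sketch below computes in plain Python what a simple DAX SUM measure sliced by product category would return.

```python
# Illustrative star-schema aggregation: a sales fact table joined to a
# product dimension, totalled per category. All data is invented.
from collections import defaultdict

dim_product = {1: "Snacks", 2: "Drinks"}        # product_id -> category
fact_sales = [(1, 5.0), (2, 3.0), (1, 2.5)]     # (product_id, amount)

def sales_by_category(fact, dim):
    totals = defaultdict(float)
    for product_id, amount in fact:
        totals[dim[product_id]] += amount
    return dict(totals)

assert sales_by_category(fact_sales, dim_product) == {"Snacks": 7.5, "Drinks": 3.0}
```

Modeling the join in the data layer like this, rather than in the report, is what keeps Power BI measures simple and reusable across dashboards.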

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Metrics Automation Specialist, you will play a crucial role in designing, implementing, and managing automated solutions to gather metrics from various tools like ITSM platforms, Grafana, and other monitoring systems. Your expertise in automation tools, ETL processes, and data visualization platforms will be pivotal in defining metric calculations, calculating KPIs, and ensuring data accuracy and completeness across all sources. Collaborating with stakeholders, you will develop visually compelling dashboards and reports to present insights and trends in a clear and actionable format. Your key responsibilities will include designing and implementing automated workflows for gathering metrics, using ETL tools to extract, transform, and load data, defining metric calculations and KPIs aligned with organizational goals, and creating visually compelling dashboards and reports for stakeholders. Additionally, you will interface with managers across teams to ensure timely data submission, drive accountability for meeting monthly deadlines, and identify opportunities for process improvement and automation. To excel in this role, you should possess proficiency in automation tools like Microsoft Power Automate, Power Query, or similar platforms, experience with ETL processes, strong knowledge of data visualization tools such as Tableau, Microsoft Power BI, or Excel, and familiarity with ITSM tools and monitoring platforms. Strong communication and interpersonal skills, persuasion and negotiation skills, and the ability to explain technical concepts clearly to non-technical audiences are essential. Your proven experience in automating data collection, KPI calculation, and reporting workflows, along with a bachelor's degree in Data Analytics, Computer Science, Information Systems, or a related field, will be valuable assets. 
In summary, your role as a Metrics Automation Specialist will involve automating workflows for metric collection, defining metric calculations and KPIs, creating visually compelling dashboards and reports, engaging with stakeholders to ensure timely data submission, and driving process improvement and automation initiatives to enhance data management and reporting workflows.
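The KPI calculations described above, such as those derived from ITSM ticket data, often reduce to a few arithmetic rules over extracted records. The sketch below is a hypothetical example: mean time to resolve (MTTR) and SLA compliance rate; the field names and the 8-hour SLA threshold are invented for illustration, not taken from any specific ITSM platform.

```python
# Hypothetical ITSM KPI calculation: MTTR and SLA compliance percentage.
# Field names and the SLA threshold are illustrative stand-ins.

def ticket_kpis(tickets, sla_hours=8.0):
    """tickets: dicts with opened_h/resolved_h timestamps in hours."""
    durations = [t["resolved_h"] - t["opened_h"] for t in tickets]
    mttr = sum(durations) / len(durations)
    within_sla = sum(1 for d in durations if d <= sla_hours)
    return {"mttr_hours": mttr, "sla_pct": 100.0 * within_sla / len(durations)}

tickets = [
    {"opened_h": 0.0, "resolved_h": 4.0},   # 4 h, within SLA
    {"opened_h": 1.0, "resolved_h": 13.0},  # 12 h, SLA breach
]
assert ticket_kpis(tickets) == {"mttr_hours": 8.0, "sla_pct": 50.0}
```

In an automated workflow, a calculation like this would sit between the extraction step (e.g. a Power Automate flow pulling ticket exports) and the dashboard refresh.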

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Senior Sales Analytics Specialist at NTT DATA, you will play a crucial role in driving the success of sales operations through comprehensive data analysis, providing valuable insights, and offering strategic decision support. You will be an advanced subject matter expert operating within a multifaceted environment, collaborating with cross-functional teams to achieve specific business objectives. Your primary responsibility will be to provide data-driven support for business planning and strategic decision-making by leveraging a deep understanding of the business context. This will involve analyzing complex business problems, identifying trends and patterns in relevant datasets, and interpreting data to inform business decisions. Key responsibilities include driving tactical and strategic projects with cross-functional teams, creating documented specifications for reports and analysis, defining and tracking metrics and dashboard requirements, and providing technical advice and consultation to relevant teams. You will also be responsible for data validation, strategic decision support, and presenting actionable insights to stakeholders. To excel in this role, you should have an advanced understanding of data analysis techniques, business sales objectives, and market dynamics. Strong collaboration skills, the ability to translate complex data insights into actionable strategies, and excellent communication and presentation skills are essential. Proficiency in data analysis tools such as advanced Excel, PowerBI, and relevant coding languages, as well as knowledge of SQL for managing relational databases, are required. Academic qualifications include a Bachelor's degree or equivalent in Data Science or a related field, with relevant sales analytics certifications being desirable. 
You should have advanced experience in a sales or marketing function as a data analyst, proficiency in Power BI and statistical analysis techniques, and a proven track record in creating and optimizing reports and dashboards for strategic decision support. This role offers remote working opportunities and embraces diversity and inclusion. NTT DATA is an Equal Opportunity Employer committed to fostering a workplace where you can continue to grow, belong, and thrive.

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a highly skilled and experienced Senior Data Engineer at Magna, you will lead our data engineering & analytics team by designing, implementing, and managing our data infrastructure and analytical systems. Your role will involve utilizing your exceptional technical expertise in data modeling, ETL processes, and database management to ensure efficient and accurate data flow using Databricks. You will play a crucial part in leading and managing a team of data engineers, providing technical guidance, mentoring, and support to drive the team's success. Collaborating with cross-functional teams, you will gather requirements, analyze data, and design effective solutions while implementing and enforcing architectural standards and frameworks to maintain a flexible and scalable data environment. Your responsibilities will include designing, developing, and implementing data models, ETL processes, and data pipelines, as well as hands-on development of Python/ Spark-based scripts and applications to support data processing and transformation. Additionally, you will optimize and enhance existing data infrastructure and systems for performance, scalability, and reliability, playing a key role in DevOps activities such as deployment of jobs and required infrastructure setup. To excel in this role, you should possess a degree in computer science or a related field, along with 6-10 years of experience in Data Engineering & Analytics. Your strong understanding of Data warehousing, engineering & ETL concepts, experience in Databricks development using Python, PySpark & DLT Pipelines, and working knowledge of databases and various data storage types will be invaluable. Excellent communication and interpersonal skills, time and project management abilities, goal orientation, and holistic thinking are crucial for success in this role. Experience with Azure / AWS and certifications in relevant data technologies will be advantageous. 
Your skills in team leadership, troubleshooting, analytical thinking, communication, collaboration, and client management will be essential to thrive in this dynamic role at Magna. In this engaging and dynamic environment, you can expect exciting, varied responsibilities and a wide range of development prospects as you contribute to advancing mobility in an expanded transportation landscape. Magna, as a mobility technology company, offers a unique opportunity to innovate and make a meaningful impact with a global, entrepreneurial-minded team. Join us and be part of shaping the future of automotive technologies at Magna.

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Technology Engineer, you will utilize your expertise in Microsoft SQL Server (MSSQL) and hands-on experience with Vector Databases to design, implement, and optimize database solutions for high-performance, large-scale applications.

Key Responsibilities:
- Design and maintain MSSQL databases for high availability, performance, and security.
- Develop complex queries and stored procedures, and implement backup and recovery strategies.
- Work with Vector Databases to support AI/ML and semantic search use cases.
- Optimize data pipelines and collaborate with application development teams.
- Monitor database performance and ensure compliance with security and data governance policies.

Qualifications:
- 5 to 7 years of professional experience in database engineering/administration.
- Strong expertise in MSSQL Server and hands-on experience with at least one Vector Database.
- Proficiency in SQL programming, knowledge of ETL processes and data pipelines, and familiarity with cloud environments and managed database services.
- Experience with AI/ML workloads, semantic search, and embeddings-based data retrieval is a plus.
- Excellent problem-solving, analytical, and communication skills.

It would be beneficial to have knowledge of NoSQL databases, exposure to Python or Java for integration with vector databases, and experience with Kubernetes and containerized database deployments. Staying updated on emerging trends in database technologies, particularly in vector search and AI-driven data management, is also important for this position.
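The embeddings-based retrieval this listing refers to means ranking stored vectors by similarity to a query vector. The sketch below shows the core idea in plain Python with cosine similarity; real vector databases use approximate-nearest-neighbor indexes rather than a full scan, and the tiny 2-D vectors here are invented stand-ins for model embeddings.

```python
# Minimal sketch of vector search: rank stored vectors by cosine similarity
# to a query. Vectors and document IDs are illustrative only.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def search(index, query, top_k=2):
    """index: {doc_id: vector}; returns doc_ids ranked by similarity."""
    ranked = sorted(index, key=lambda d: cosine(index[d], query), reverse=True)
    return ranked[:top_k]

index = {"a": (1.0, 0.0), "b": (0.0, 1.0), "c": (0.7, 0.7)}
assert search(index, (1.0, 0.1), top_k=2) == ["a", "c"]
```

Pairing such a vector store with MSSQL typically means keeping the authoritative records relational and storing only (id, embedding) pairs on the vector side, joining back by id after retrieval.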

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Sales Analytics Specialist at NTT DATA, you will play a crucial role in driving the success of sales operations through comprehensive data analysis, valuable insights, and strategic decision support. Your seasoned expertise will be utilized to provide data-driven support for business planning and strategic decision-making, ultimately contributing to the achievement of specific business objectives. Operating within a multifaceted environment, you will collaborate with cross-functional teams to analyze complex business problems and issues using internal and external data. By identifying trends, patterns, and influences in relevant datasets, you will proactively offer meaningful quantitative analysis to inform business decisions. Your responsibilities will also include creating documented specifications for reports, defining metrics and dashboard requirements, and ensuring data validation using advanced data analysis tools. In addition to your technical skills, you will leverage your seasoned understanding of advanced data analysis techniques and business sales objectives to align data analysis with strategic goals effectively. Your collaboration skills will enable you to work effectively with cross-functional teams and senior management, translating complex data insights into actionable strategies for non-technical stakeholders. Your role will also involve providing technical advice, consultation, and knowledge to relevant teams, creating and presenting reports on trends, and supporting the team in answering strategic questions and making insightful data-driven decisions. You will utilize tools such as advanced Excel, PowerBI, and relevant coding languages, as well as demonstrate expertise in SQL for managing and querying databases. To qualify for this role, you should hold a Bachelor's degree or equivalent in Data Science or a related field, along with relevant sales analytics certifications. 
Your experience should include a proven track record in sales or marketing functions as a data analyst, proficiency in Power BI and statistical analysis techniques, and creating reports that contribute to strategic decision support. As part of a remote working environment at NTT DATA, you will have the opportunity to leverage your skills and expertise to drive innovation, optimize processes, and transform businesses for long-term success. Join us in our commitment to helping organizations move confidently into the digital future, as we strive to make a positive impact on our clients and society.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help the retail business make data-driven decisions. This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights.

Key Responsibilities:
- Utilize a deep understanding of the retail industry to design AI solutions for critical business needs.
- Gather and clean data from various retail sources like sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
- Apply machine learning algorithms for classification, clustering, regression, and deep learning to enhance predictive models.
- Utilize AI-driven techniques for personalization, demand forecasting, and fraud detection.
- Implement advanced statistical methods to optimize existing use cases and develop new products.
- Stay informed about the latest trends in data science and retail technology.
- Collaborate with executives, product managers, and marketing teams to translate insights into business actions.

Professional Skills:
- Strong analytical and statistical skills.
- Expertise in machine learning and AI.
- Experience with retail-specific datasets and KPIs.
- Proficiency in data visualization and reporting tools.
- Ability to work with large datasets and complex data structures.
- Strong communication skills for interaction with technical and non-technical stakeholders.
- Solid understanding of the retail business and consumer behavior.

Technical Skills:
- Programming Languages: Python, R, SQL, Scala
- Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras
- Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn
- Big Data Technologies: Hadoop, Spark, AWS, Google Cloud
- Databases: SQL, NoSQL (MongoDB, Cassandra)

Minimum Qualifications:
- Experience: Minimum 3 years of relevant experience.
- Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.

Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata

Must-Have Skills:
- Solid understanding of retail industry dynamics and key performance indicators.
- Ability to communicate complex data insights to non-technical stakeholders.
- Meticulous in ensuring data quality and consistency.
- Proficiency in Python for data manipulation, statistical analysis, and machine learning.
- Experience with supervised and unsupervised learning algorithms.
- Use of advanced analytics to optimize pricing strategies based on market demand.

Good-to-Have Skills:
- Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud-based platforms.
- Experience with ETL processes and tools like Apache Airflow.
- Familiarity with designing scalable and efficient data pipelines and architecture.
- Experience with data visualization tools like Tableau, Power BI, Matplotlib, and Seaborn.
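The demand forecasting this role mentions is usually benchmarked against a trivial baseline before any model is fitted. As a hedged illustration, the sketch below implements the simplest such baseline, a trailing moving average over weekly unit sales; real retail forecasting would use seasonality-aware models, and the sales series here is invented.

```python
# Hypothetical demand-forecasting baseline: trailing moving average.
# The weekly sales figures are invented stand-ins.

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

weekly_units = [120, 130, 125, 140, 150, 160]
forecast = moving_average_forecast(weekly_units, window=3)
assert forecast == 150.0  # mean of 140, 150, 160
```

A candidate model (e.g. gradient boosting with seasonality features) earns its complexity only if it beats a baseline like this on held-out weeks.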

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies