6.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Requirements VE & Should Cost Engineer Electronics. Capture Manufacturing related commodities by referring various engineering drawings helpful in Should Cost and Inventory requirements. Data extraction from engineering drawings would focus on the ability to extract and interpret data from technical drawings to support tasks like cost estimation, design verification, and material ordering. Have experience with PCB design specification, DFM, DFA. Full understanding of PCB and PCBA manufacturing and assembly processes. Contribute to component library development for schematic symbols and footprints per IPC standards. Conduct teardowns of competing product Present results from ongoing individual workgroup competitive assessments Collaborate with the Advanced Manufacturing & Supply Chain Engineering Global Supply Management divisions in developing and maintaining cost methods and models Collaborate with cross-functional teams within the VECE COE to implement solutions and drive the successful execution of component engineering initiatives
Posted 2 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Chennai
Work from Office
Duties and Responsibilities: Lead the data management and governance track in designing and implementing the data strategy and data architecture underpinning an MDM solution. Assess master data maturity level across the organization, build use cases, and recommend implementation styles for various master data domains. Define, design, and develop the MDM data model, business rules, workflows, and data transformation rules as per business needs. Design and develop ways to ensure master data integrity and data quality in key systems. Design data quality processes to improve data quality through error detection and correction. Ensure quality of master data in key systems, as well as development and documentation of processes with other functional data owners to support ongoing maintenance and data integrity. Provide daily oversight and support for MDM operations. Perform and oversee enhancements and defect fixing in MDM solutions. Collaborate with business users and stakeholders to understand requirements and provide strategic solutions. Ensure data quality and governance standards are maintained. Lead testing and validation of MDM solutions. Document technical specifications and processes. Mentor and guide junior team members. Assist in the development and implementation of MDM projects. Skills: 5+ years of experience in Data Management and Governance, including working with Enterprise Data Warehouses, Data Platforms, and MDM solutions. Proven experience in leading MDM projects. Strong understanding of SAP MDG processes and functionalities. Experience in enhancement and defect fixing in SAP MDG. Excellent consulting and stakeholder management skills. Good knowledge of data governance and data quality principles. Ability to work collaboratively in a team environment. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Ability to work independently with general direction and flexibility in a fast-paced environment. Technical / Functional Skills: SAP MDG: core expertise in SAP Master Data Governance. SAP S/4HANA: experience with SAP's next-generation business suite. ABAP: knowledge of Advanced Business Application Programming for custom development. Fiori: familiarity with SAP Fiori for user experience enhancements. Data Services: understanding of SAP Data Services for data integration and quality. Integration Tools: experience with integration tools and techniques for connecting SAP MDG with other systems. Strong data architecture and database skills, from consuming to rendering, for MDM implementation. Diverse experience in application tools, languages, and frameworks (SQL, Python, Java, etc.). Project Experience: Experience in integrating MDM solutions with SAP ECC and SAP S/4HANA. Experience in implementing multi-level workflows and approval processes within MDM. Experience in design and development of hierarchies, third-party integration, and match-and-merge strategies. Implementation experience in two or more of the core MDM domains such as Vendor, Customer, Product, Material. Candidate should have strong MDM experience with some understanding of Data Governance. Experience architecting an MDM end-to-end implementation covering data extraction, cleansing/profiling, initial data load, centralized creation, and integration with consuming systems.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Jaipur
Work from Office
Job Title: Power BI Developer Location: Jaipur, Rajasthan (Onsite) Work Hours: 5:30 PM - 2:30 AM IST Employment Type: Full-Time / Contract Job Description: We are looking for a skilled Power BI Developer to join our team onsite in Jaipur. The ideal candidate will have strong experience in data cleaning, reporting, and data visualization, along with the ability to work in a fast-paced, EST-aligned environment. Key Responsibilities: Clean, transform, and prepare data from multiple sources for business intelligence reporting. Design, build, and maintain interactive dashboards and reports using Power BI. Work closely with business stakeholders to understand reporting requirements and translate them into technical solutions. Develop and publish Power BI datasets, reports, and dashboards using best practices. Collaborate with the data engineering and analytics teams for seamless data integration. Create ad-hoc and scheduled reports based on business needs. Required Skills: 3+ years of experience working as a Power BI Developer or in a similar BI role. Proficiency in Power BI, DAX, and Power Query. Experience in data cleaning, data modeling, and data transformation. Strong understanding of data visualization principles and best practices. Working knowledge of Tableau is a plus. Solid SQL skills for querying and extracting data. Strong problem-solving and communication skills. Work Schedule: This is a full-time onsite role that requires working during US Eastern Time (8:00 AM to 5:00 PM EST / 5:30 PM to 2:30 AM IST). Location: Jaipur, Rajasthan – candidates must be willing to work from the office.
Posted 2 weeks ago
2.0 - 7.0 years
2 - 6 Lacs
Bharuch, Ankleshwar
Work from Office
DISPLAY - KNOWLEDGE, EXPERIENCE Education qualification: Bachelor's degree in Engineering, Computer Science, Data Science, Information Systems, or a related field. Overall Experience*: Fresher or relevant manufacturing industry experience. Mandatory Experience*: 1. Understanding of statistical methods and basic analysis techniques. 2. Knowledge of data visualization tools. 3. Basic understanding of data cleaning and preprocessing techniques. 4. Good problem-solving skills and attention to detail. 5. Strong communication skills, both verbal and written, to explain data findings clearly. 6. Eagerness to learn new data analysis techniques and tools. OBJECTIVE OF THE ROLE*: To creatively execute designs and drawings of all jobs/projects in a timely manner. ACCOUNTABILITIES OF THE ROLE Responsibility: 1. Assist in collecting, cleaning, and organizing data from various sources. 2. Help in performing basic data analysis, identifying trends, patterns, and outliers, and give meaningful insights to management on the same. 3. Support in creating data visualizations and reports to communicate insights to the team. 4. Assist in preparing dashboards and presentations to present data findings. 5. Collaborate with team members and assist in gathering data requirements for projects. 6. Help in maintaining and updating data systems, ensuring data is properly structured for analysis. 7. Learn and apply data analysis tools and techniques as part of on-the-job training.
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Delhi
On-site
As a Python Developer at Innefu Lab, you will play a crucial role in the software development life cycle, contributing from requirements analysis to deployment. Working in collaboration with diverse teams, you will design and implement solutions that align with client requirements and industry standards. Your responsibilities encompass various key areas: Software Development: You will be responsible for creating, testing, and deploying high-quality Python applications and scripts. Code Optimization: Your role involves crafting efficient, reusable, and modular code while enhancing existing codebases for optimal performance. Database Integration: You will integrate Python applications with databases to ensure data integrity and efficient data retrieval. API Development: Designing and implementing RESTful APIs to enable seamless communication between different systems. Collaboration: Working closely with UI/UX designers, backend developers, and stakeholders to ensure effective integration of Python components. Testing and Debugging: Thoroughly testing applications, identifying and rectifying bugs, and ensuring software reliability. Documentation: Creating and maintaining comprehensive technical documentation for code, APIs, and system architecture. Continuous Learning: Staying updated on industry trends, best practices, and emerging technologies related to Python development. Required Skills: - Proficient in Python, Django, Flask - Strong knowledge of Regular Expressions, Pandas, NumPy - Excellent expertise in web crawling and web scraping - Experience with scraping modules like Selenium, Scrapy, Beautiful Soup, or urllib - Familiarity with text processing, Elasticsearch, and graph-based databases such as Neo4j (optional) - Proficient in data mining, Natural Language Processing (NLP), and Optical Character Recognition (OCR) - Basic understanding of databases - Strong troubleshooting and debugging capabilities - Effective interpersonal, verbal, and written communication skills - Ability to extract data from structured and unstructured sources, analyze text, images, and videos, and utilize NLP frameworks for data enrichment - Skilled in collecting and extracting intelligence from data, utilizing regular expressions, and extracting information from RDBMS databases - Experience in web scraping frameworks like Scrapy for data extraction from websites Join us at Innefu Lab, where innovative offerings and cutting-edge technologies converge to deliver exceptional security solutions. Be part of our dynamic team driving towards excellence and growth in the cybersecurity domain.
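As an illustration of the web-scraping work this role calls for, below is a minimal Python sketch using requests and Beautiful Soup. The URL and the h2 selector are hypothetical placeholders, not a real target site; a production crawler would add politeness controls (robots.txt, rate limiting, retries).

```python
# Minimal scraping sketch; URL and selector are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

def scrape_headings(url: str) -> list[str]:
    """Fetch a page and return the text of its <h2> headings."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

if __name__ == "__main__":
    for heading in scrape_headings("https://example.com/articles"):
        print(heading)
```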
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focussed and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. The person will work on a variety of projects in a highly collaborative, fast-paced environment. The person will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, develop code and perform unit testing. He/she will work closely with the Technical Architect, Business Analyst, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices are in compliance with KPMG's best practices, policies and procedures. This role requires quick ramp-up on new technologies whenever required. Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Role: Azure Data Engineer Location: Bangalore Experience: 4 to 6 years Data Management: Design, implement, and manage data solutions on the Microsoft Azure cloud platform. Data Integration: Develop and maintain data pipelines, ensuring efficient data extraction, transformation, and loading (ETL) processes using Azure Data Factory. Data Storage: Work with various Azure data storage solutions like Azure SQL Database, Azure Data Lake Storage, and Azure Cosmos DB. Big Data Processing: Utilize big data technologies such as Azure Databricks and Apache Spark to handle and analyze large datasets. Data Architecture: Design and optimize data models and architectures to meet business requirements. Performance Monitoring: Monitor and optimize the performance of data systems and pipelines. Collaboration: Collaborate with data scientists, analysts, and other stakeholders to support data-driven decision-making. Security and Compliance: Ensure data solutions comply with security and regulatory requirements. Technical Skills: Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data tools. Analytical Skills: Strong analytical and problem-solving skills. Communication: Excellent communication and teamwork skills. Certifications: Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate are a plus.
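For illustration, here is a minimal PySpark sketch of the kind of extract-transform-load step such a role might run in Azure Databricks. The storage paths, container names, and column names are hypothetical placeholders, not references to any real environment.

```python
# Illustrative ETL step; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV landed in a data lake container (placeholder path)
orders = (spark.read.option("header", True)
          .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Transform: cast amounts and aggregate revenue per day
daily_revenue = (orders
                 .withColumn("amount", F.col("amount").cast("double"))
                 .groupBy("order_date")
                 .agg(F.sum("amount").alias("total_revenue")))

# Load: write the curated output as Parquet for downstream reporting
(daily_revenue.write.mode("overwrite")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/daily_revenue/"))
```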
Posted 3 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set: Technically strong and hands-on. Self-driven. Good client communication skills. Able to work independently and a good team player. Flexible to work in PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.
Posted 3 weeks ago
3.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Requirement: SAP BODS Number of Openings*: 2 ECMS Request No (Sourcing Stage)*: 532551 Duration of Contract*: 6 Total Years of Experience*: 9+ Relevant Years of Experience*: 6-8 Detailed JD (Roles and Responsibilities)*: 1. SAP BODS: should have good experience in Data Extraction, Data Validation, and Data Integration using SAP BODS. 2. Should be proficient in creating jobs, workflows, and dataflows, and should have work experience on transforms. 3. Should have SAP Data Migration experience. 4. Should have experience with LTMC, IDocs, BAPI function modules, etc. Data migration: should have worked on end-to-end migration of objects covering master as well as transactional data. Other points: able to gather requirements independently, develop as per the business requirement, track the deliverables from the team, and prepare and share status reports with the client. Mandatory Skills*: SAP BODS Desired Skills*: Technology SAP BODS Domain*: Consultant Approx. Vendor Billing Rate (Excl.
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set: Technically strong and hands-on. Self-driven. Good client communication skills. Able to work independently and a good team player. Flexible to work in PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.
Posted 3 weeks ago
3.0 - 8.0 years
25 - 30 Lacs
Mumbai
Work from Office
Job Overview PitchBook's Product Owner works collaboratively with key Product stakeholders and teams to deliver the department's product roadmap. This role takes an active part in aligning our engineering activities with Product Objectives across new product capabilities as well as data and scaling improvements to our core technologies, with a focus on AI/ML data extraction, collection, and enrichment capabilities. Team Overview The Data Technology team within PitchBook's Product organization develops solutions to support and accelerate our data operations processes. This domain impacts core workflows of data capture, ingestion, and hygiene across our core private and public capital markets datasets. This role works on our AI/ML Collections Data Extraction & Enrichment teams, closely integrated with Engineering and Product Management to ensure we are delivering against our Product Roadmap. These teams provide backend AI/ML services that power PitchBook's data collection activities and related internal content management systems. Outline of Duties and Responsibilities Be a domain expert for your product area(s) and understand user workflows and needs Actively define backlog priority for your team(s) in collaboration with Product and Engineering Manage delivery of features according to the Product Roadmap Validate the priority and impact of incoming requirements from Product Management and Engineering Break down prioritized requirements into well-structured backlog items for the engineering team to complete Create user stories and acceptance criteria that indicate successful implementation of requirements Communicate requirements, acceptance criteria, and technical details to stakeholders across multiple PitchBook departments Define, create, and manage metrics that represent team performance Manage, track, and mitigate risks or blockers of feature delivery Support execution of AI/ML collections work related to, but not limited to, AI/ML data extraction, collection, and enrichment services. Support PitchBook's values and vision Participate in various company initiatives and projects as requested Experience, Skills and Qualifications Bachelor's degree in Information Systems, Engineering, Data Science, Business Administration, or a related field 3+ years of experience as a Product Manager or Product Owner within AI/ML or enterprise SaaS domains A proven record of shipping high-impact data pipeline or data collection-related tools and services Familiarity with AI/ML workflows, especially within model development, data pipelines, or classification systems Experience collaborating with globally distributed product, engineering and operations teams across time zones Excellent communication skills to drive clarity and alignment between business stakeholders and technical teams Bias for action and a willingness to roll up your sleeves and do what is necessary to meet team goals Experience translating user-centric requirements and specifications into user stories and tasks Superior attention to detail, including the ability to manage multiple projects simultaneously Strong verbal and written communication skills, including strong audience awareness Experience with shared SDLC and workspace tools like JIRA, Confluence, and data reporting platforms Preferred Qualifications Direct experience with applied AI/ML Engineering services.
Strong understanding of supervised and unsupervised ML models, including their training data needs and lifecycle impacts Background in fintech supporting content collation, management, and engineering implementation Experience with data quality measurements, annotation systems, knowledge graphs, and ML model evaluation Exposure to cloud-based ML infrastructure and data pipeline orchestration tools such as AWS SageMaker, GCP Vertex AI, Airflow, and dbt Certifications related to Agile Product Ownership / Product Management such as CSPO, PSPO, POPM are a plus Working Conditions The job conditions for this position are in a standard office setting. Employees in this position use a PC and phone on an ongoing basis throughout the day. This role collaborates with Seattle and New York-based stakeholders, and typical overlap is between 6:30 - 8:30 AM Pacific. Limited corporate travel may be required to remote offices or other business meetings and events. Legal Entity: PitchBook Data, Inc (037_PitchBookDataInc)
Posted 3 weeks ago
3.0 - 5.0 years
4 - 7 Lacs
Noida
Work from Office
Join our Team About this opportunity: We are looking for an experienced RPA/Automation Developer with 3 to 5 years of expertise in UiPath to design, develop, and implement scalable automation solutions. The candidate will work closely with cross-functional teams to automate complex business processes, improve operational efficiency, and ensure seamless integration of RPA solutions. What you will do: Develop, test, deploy, and maintain end-to-end automation workflows using UiPath Studio and Orchestrator. Analyze business processes to identify automation opportunities and collaborate with stakeholders to gather detailed requirements. Build reusable components and libraries to accelerate automation development. Monitor, troubleshoot, and resolve issues related to deployed automation bots. Optimize existing automation solutions for improved performance and scalability. Ensure compliance with coding standards, best practices, and governance policies. Prepare comprehensive technical documentation and participate in knowledge-sharing sessions. Collaborate with IT, business, and QA teams to drive continuous improvement in automation lifecycle management. Mentor junior developers and contribute to team skill development. The skills you bring: Bachelor's degree in Computer Science, Engineering, or a related field. 3 to 5 years of hands-on experience in RPA development using UiPath. Strong proficiency in UiPath Studio, Orchestrator, Robots, and related tools. Solid experience in process analysis, design, and implementation of automation solutions. Basic knowledge of Python is a must. Proficient in scripting languages such as VB.NET/C#. Familiarity with SQL and database querying for data extraction and manipulation. Understanding of REST/SOAP APIs and integration with external systems. Experience working in Agile environments and using DevOps tools is preferred. Strong problem-solving skills with attention to detail. Excellent communication skills to interact effectively with technical and non-technical stakeholders. Preferred Qualifications UiPath Certified Advanced RPA Developer or equivalent certification. Experience with other RPA tools and automation frameworks. Knowledge of cloud platforms (Azure, AWS) and containerization technologies. Exposure to AI/ML integration with RPA is a plus. What happens once you apply? We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferrable skills, and we can support you with the skills that you need to develop. Primary country and city: India (IN) || Noida
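Since the role asks for basic Python alongside SQL-based data extraction, here is a small, hedged Python sketch of the kind of helper an automation workflow might invoke to pull records for a bot to process. The database file, table name, and status value are hypothetical placeholders.

```python
# Hypothetical extraction helper an RPA workflow might call; names are placeholders.
import csv
import sqlite3

def export_pending_invoices(db_path: str, out_csv: str) -> int:
    """Extract pending invoices and write them to CSV for the bot to process."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT invoice_id, vendor, amount FROM invoices WHERE status = 'PENDING'"
        ).fetchall()
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["invoice_id", "vendor", "amount"])
        writer.writerows(rows)
    return len(rows)
```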
Posted 3 weeks ago
8.0 - 13.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Job description Required Skills and Qualifications: Proven experience working with BI tools (strong background with similar BI tools like Power BI, Tableau, etc., but Bold BI preferred). Strong experience connecting to and working with SQL Server, MySQL, PostgreSQL, Oracle, MongoDB, and REST APIs. Python programming skills for developing custom visualizations and data processing tasks. Ability to write and optimize SQL queries for data extraction and transformation. Familiarity with cloud environments (AWS, Azure, GCP) for application & database hosting and integrations. Understanding of ETL processes and data integration strategies. Familiarity with data visualization best practices (chart selection, UX/UI design for dashboards). Strong problem-solving and analytical skills. Ability to work independently as well as part of a team. Role: Software Development - Other Industry Type: Software Product Department: Engineering - Software & QA Employment Type: Full Time, Permanent Role Category: Software Development Education UG: B.Tech/B.E. in Electronics/Telecommunication, Information Technology, Computers Key Skills: Power BI, Bold BI, Data Visualization, Azure, PostgreSQL, Tableau, SQL, REST API, MySQL, MongoDB, Oracle, ETL, AWS, Python
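To make the SQL-extraction-plus-custom-visualization requirement concrete, here is a minimal hedged Python sketch: it pulls an aggregate with a SQL query and renders a simple chart. The sqlite3 connection, table, and columns are placeholders standing in for whichever database (SQL Server, MySQL, PostgreSQL, etc.) a real deployment would use.

```python
# Hypothetical extraction + visualization sketch; database, table, and columns are placeholders.
import sqlite3
import pandas as pd
import matplotlib.pyplot as plt

conn = sqlite3.connect("sales.db")  # stand-in for a SQL Server/MySQL/PostgreSQL connection
df = pd.read_sql_query(
    "SELECT region, SUM(revenue) AS total_revenue FROM sales GROUP BY region",
    conn,
)
conn.close()

df.plot(kind="bar", x="region", y="total_revenue", legend=False)
plt.ylabel("Total revenue")
plt.title("Revenue by region")
plt.tight_layout()
plt.savefig("revenue_by_region.png")
```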
Posted 3 weeks ago
6.0 - 11.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Project description You will be working in a cutting-edge banking environment which is now undergoing a thorough upgrade program. You will be responsible for translating business data and overall data into reusable and adjustable dashboards used by senior business managers. Responsibilities Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting. Build and manage ETL workflows using SSIS to support data integration across multiple sources. Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring. Develop and maintain OLAP cubes using SSAS for multidimensional data analysis. Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions. Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems. Perform performance tuning and query optimization for large datasets and improve system responsiveness. Ensure data quality, consistency, and integrity through robust validation and testing processes. Maintain documentation for data pipelines, ETL jobs, and reporting structures. Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions. Skills Must have At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must). Proficient with Tableau, preferably with at least 4 years of experience creating dashboards. Experience working with businesses and delivering dashboards to senior management. Working within Data Warehouse architecture experience is a must. Exposure to Microsoft BI. Nice to have N/A
Posted 3 weeks ago
4.0 - 8.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Location: Bangalore, KA, IN Company: Alstom We are looking for an experienced Engineer to oversee the process and use of data systems used in Alstom. You will discover efficient ways to organize, store and analyze data with attention to security and confidentiality. A great data engineer/analyst is able to fully grasp the complexity of data management. The ideal candidate will have a strong understanding of databases and data analysis procedures. You will also be tech-savvy and possess excellent troubleshooting skills. The goal is to ensure that Parts Data information flows timely and securely to and from the Orchestra Tool and the tools interfaced with it across the organization. Purpose of the role Reporting to the Engineering Data Shared Services DL, and working closely with other Digital Transformation Teams, Business Process Owners, Data Owners and end users, you will: Be responsible for ensuring consistency of master data in compliance with core business rules. Contribute to defining the data standards and data quality criteria. Manage critical activities in the process. Be the subject matter expert and share knowledge. Responsibilities Create and enforce Standard, Specific & Design parts for effective data management Formulate techniques for quality data collection to ensure adequacy, accuracy and legitimacy of data Devise and implement efficient and secure procedures for data handling and analysis with attention to all technical aspects Support others in the daily use of data systems and ensure adherence to legal and company standards Assist with reports and data extraction when needed Monitor and analyze information and data systems and evaluate their performance to discover ways of enhancing them (new technologies, upgrades etc.) Troubleshoot data-related problems and authorize maintenance or modifications Manage all incoming data files. Continually develop data management strategies. Analyse & validate master data during rollouts. Raise incident tickets and work closely with other IT operations teams to resolve MDM issues. Be resilient and strive to take the team to the next level by highlighting roadblocks to management Critical Challenges Métiers facing transformation challenges while business continuity must be maintained in Regions Complex end-to-end data flows with many cross-data dependencies
Posted 3 weeks ago
4.0 - 7.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Detailed Job Description: We are seeking a highly skilled Data Engineer with strong experience in Azure Data Services, SAP integration, and modern data governance tools like Unity Catalog. The ideal candidate will have a deep understanding of data extraction from enterprise systems, data movement across platforms, and performance-optimized data pipeline development. Implement and manage Unity Catalog within the data ecosystem for data governance, security, and access control. Integrate with SAP systems, including connecting to key master data tables and leveraging SAP APIs for data extraction. Develop and maintain data ingestion pipelines using APIs for real-time and batch data extraction. Work with Azure Blob Storage and Azure Synapse Analytics for staging and transformation of large datasets. Design and implement data movement strategies from Synapse to Microsoft Fabric, leveraging native and custom pipelines. Handle and transform multiple data formats including JSON, XML, and CSV. Build and maintain Azure Data Factory (ADF) pipelines, including setup of Linked Services, Datasets, and Integration Runtime configurations. Optimize pipeline performance through tuning, parallelization, and resource management. Implement robust monitoring and logging mechanisms for data pipelines to ensure reliability, traceability, and operational visibility. Skills Required: Extensive experience in data migration, including SAP to Data Lake, SQL databases, and API-based integrations. Proficient in handling diverse data formats: JSON, CSV, Parquet, XML. Strong hands-on expertise with Azure services: ADLS for scalable storage, ADF for ETL/ELT pipelines, Synapse for analytics and reporting. Strong hands-on expertise with Databricks and Unity Catalog. Strong knowledge of Synapse-to-Fabric data lakes and pipelines.
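As a hedged illustration of the API-based ingestion and format handling this role describes, here is a small Python sketch that pulls a JSON payload from an API and stages it as CSV. The endpoint URL and field structure are hypothetical, and a real pipeline would land the output in Blob Storage/ADLS rather than the local filesystem.

```python
# Hypothetical API ingestion sketch; endpoint and fields are placeholders.
import requests
import pandas as pd

def ingest_to_staging(api_url: str, out_path: str) -> int:
    """Pull a JSON payload from an API and persist it as CSV for staging."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    records = response.json()           # expected shape: a list of dicts
    df = pd.json_normalize(records)     # flatten nested JSON into tabular columns
    df.to_csv(out_path, index=False)
    return len(df)

if __name__ == "__main__":
    staged = ingest_to_staging("https://example.com/api/materials", "materials_staging.csv")
    print(f"Staged {staged} records")
```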
Posted 3 weeks ago
8.0 - 13.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Job description Required Skills and Qualifications: Proven experience working with BI tools (strong background with similar BI tools like Power BI, Tableau, etc., but Bold BI preferred). Strong experience connecting to and working with SQL Server, MySQL, PostgreSQL, Oracle, MongoDB, and REST APIs. Python programming skills for developing custom visualizations and data processing tasks. Ability to write and optimize SQL queries for data extraction and transformation. Familiarity with cloud environments (AWS, Azure, GCP) for application & database hosting and integrations. Understanding of ETL processes and data integration strategies. Familiarity with data visualization best practices (chart selection, UX/UI design for dashboards). Strong problem-solving and analytical skills. Ability to work independently as well as part of a team. Role: Software Development - Other Industry Type: Software Product Department: Engineering - Software & QA Employment Type: Full Time, Permanent Role Category: Software Development Education UG: B.Tech/B.E. in Electronics/Telecommunication, Information Technology, Computers Key Skills: Power BI, Bold BI, Data Visualization, Azure, PostgreSQL, Tableau, SQL, REST API, MySQL, MongoDB, Oracle, ETL, AWS, Python
Posted 3 weeks ago
2.0 - 4.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job Summary We are seeking a highly motivated and experienced Data Analyst with a strong focus on dashboard development to join our growing analytics team in HSR Layout, Bangalore. The ideal candidate will be responsible for transforming raw data into insightful and actionable dashboards using Power BI. You will work closely with stakeholders across various departments to understand their data needs, design effective visualizations, and deliver data-driven insights that contribute to business success. This role requires a strong analytical mindset, proficiency in data manipulation and visualization techniques, and excellent communication skills. Key Responsibilities: Data Collection and Preparation: Gather and clean data from various sources (e.g., databases, spreadsheets, APIs). Perform data profiling, cleaning, and transformation to ensure data quality and integrity. Develop and maintain data pipelines for efficient data extraction, loading, and transformation (ETL) processes. Collaborate with data engineers to optimize data models for reporting and analysis. Dashboard Design and Development: Understand business requirements and translate them into effective and user-friendly dashboards in Power BI. Design and develop interactive dashboards and reports with visually appealing and insightful visualizations (charts, graphs, tables, etc.). Implement data filtering, drill-down capabilities, and other interactive features to enhance user experience. Optimize dashboard performance for speed and efficiency. Ensure consistency in design and branding across all dashboards. Data Analysis and Interpretation: Conduct thorough data analysis to identify trends, patterns, and anomalies. Interpret data and provide meaningful insights to stakeholders. Develop and apply statistical techniques and methodologies for data analysis. Document analysis findings and present them clearly and concisely. Collaboration and Communication: Work closely with business stakeholders to understand their reporting and analytical needs. Effectively communicate technical concepts to both technical and non-technical audiences. Participate in meetings and presentations to share insights and dashboard demonstrations. Collaborate with other analysts and team members to share best practices and knowledge. Maintenance and Support: Monitor and maintain existing dashboards to ensure data accuracy and functionality. Troubleshoot and resolve any issues related to data or dashboards. Implement updates and enhancements to existing dashboards based on evolving business needs. Stay up-to-date with the latest features and capabilities of Power BI. Qualifications and Skills: Bachelor's degree in Computer Science, Statistics, Mathematics, Business Analytics, or a related field. 2-4 years of proven experience as a Data Analyst with a strong focus on dashboard development. Expertise in Microsoft Power BI. Solid understanding of data visualization principles and best practices. Excellent analytical and problem-solving skills with the ability to interpret complex data. Strong communication and presentation skills, with the ability to explain technical concepts to non-technical stakeholders. Ability to work independently and as part of a collaborative team. Strong attention to detail and a commitment to data accuracy.
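For illustration, here is a minimal pandas sketch of the data-cleaning and aggregation step that typically precedes building a dashboard of this kind. The file name, columns, and business rule are hypothetical placeholders.

```python
# Hypothetical cleaning/aggregation sketch; file and column names are placeholders.
import pandas as pd

raw = pd.read_csv("customer_orders.csv")

clean = (
    raw.drop_duplicates()
       .dropna(subset=["order_id", "order_date"])  # discard rows missing key fields
       .assign(
           order_date=lambda d: pd.to_datetime(d["order_date"], errors="coerce"),
           amount=lambda d: pd.to_numeric(d["amount"], errors="coerce").fillna(0),
       )
)

# A simple monthly aggregate a dashboard dataset could be built on
monthly = clean.groupby(clean["order_date"].dt.to_period("M"))["amount"].sum()
monthly.to_csv("monthly_revenue.csv")
```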
Posted 3 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Gurugram
Work from Office
Nutrabay is seeking a highly analytical and detail-oriented Data Analyst to join our team. The ideal candidate will have a strong background in data analysis, visualization, and building robust analytics platforms to support business growth across product, marketing, and engineering teams. You should apply if you have: Proven experience in building analytics platforms from scratch, preferably in a product-based or e-commerce company. Strong hands-on experience in Power BI (DAX, Power Query, performance optimization). Advanced SQL skills, including complex query building and optimization. Expertise in ETL pipelines, data modeling, and data warehousing. Experience working with cloud data platforms like AWS Redshift and Google BigQuery. The ability to interpret and translate large datasets into actionable business insights. A collaborative mindset and excellent communication skills to present data-driven narratives. A strong sense of ownership and curiosity in solving business problems through data. You should not apply if you: Do not have practical experience with data visualization tools like Power BI. Are unfamiliar with writing and optimizing SQL queries. Have never worked on ETL or cloud-based data solutions (Redshift, BigQuery, etc.). Struggle with interpreting business problems into analytical solutions. Are uncomfortable in a high-ownership, fast-paced, product-led environment. Skills Required: Power BI (including DAX, Power Query, report optimization) SQL (data extraction, transformation, performance tuning) ETL Process Design and Data Warehousing AWS Redshift / Google BigQuery Python (preferred, for automation and data wrangling) Data Governance & Security Business Intelligence & Analytical Storytelling Cross-functional stakeholder communication Data-driven decision support and impact measurement What will you do? Build and manage a centralized analytics platform from the ground up. Create insightful dashboards and reports using Power BI to drive strategic decisions. Design and implement ETL processes to integrate data from multiple sources. Ensure the accuracy, completeness, and reliability of all reporting systems. Collaborate with product managers, engineers, marketing, and leadership to define KPIs and data strategies. Conduct deep-dive investigations into customer behavior, performance metrics, and market trends. Develop internal tools and models to track business health and opportunity areas. Drive initiatives to enhance analytics adoption across teams. Own the data governance practices to ensure compliance and data security. Provide thought leadership and mentor junior analysts in the team. Work Experience: 2-4 years of experience in data analytics, business intelligence, or data engineering roles. Prior experience in a product-based company or high-growth e-commerce environment is strongly preferred. Working Days: Monday to Friday Location: Golf Course Road, Gurugram, Haryana (Work from Office) Perks: Opportunity to build a complete analytics infrastructure from scratch. Cross-functional exposure to tech, marketing, and product. Freedom to implement ideas and drive measurable impact. A collaborative work environment focused on growth and innovation. High learning and personal growth opportunity in a rapidly growing D2C company. Why Nutrabay: We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work.
As a part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence and be a part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a place of unique growth opportunity. Here you will learn how to maximise the potential of your available resources. You will get the opportunity to do work that helps you master a variety of transferable skills, or skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty; that will create more loyal employees than a company that simply shells out cash. We trust our employees and their voice and ask for their opinions on important business issues. About Nutrabay: Nutrabay is the largest health & nutrition store in India. Our vision is to keep growing, have a sustainable business model and continue to be the market leader in this segment by launching many innovative products. We are proud to have served over 1 million customers up till now and our family is constantly growing. We have built a complex and high-converting eCommerce system and our monthly traffic has grown to a million. We are looking to build a visionary and agile team to help fuel our growth and contribute towards further advancing the continuously evolving product. Funding: We raised $5 Million in a Series A funding round.
Posted 3 weeks ago
3.0 - 4.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Take ownership of Tableau dashboard designing/reporting and create world-class visualizations to be published to business stakeholders. Partner with business teams to gather requirements and relevant data, and interpret the business needs to identify KPIs. Expertise in BI visualization and ability to come up with industry-leading and insightful stories from data. Demonstrate strong visualization skills using Tableau, with at least 3-4 years of experience working on Tableau/other BI tools in large-scale, high-value implementations. Knowledge of basic Tableau chart types and concepts like actions, filters, reference lines, calculations, parameters etc. Invent creative and innovative ways to answer key business questions by leveraging existing data. Design and implement proof-of-concept solutions and create advanced Tableau visualizations. End-to-end Tableau designer expertise, from Tableau data model publication and interactive dashboard design/development to deployment/maintenance. Advanced Tableau development including data model creation, customized dimensions and advanced statistical calculations. Mandatory skill sets: Clear understanding of data warehousing concepts. Should work closely with a data engineering team to perform data extraction and data transformation processes to create the datasets. Experience in creating and publishing reports on both web and mobile layouts. Able to perform unit testing such as functionality testing and data validation. Preferred skill sets: Report performance optimization and troubleshooting. Clear understanding of UI and UX design. Hands-on working experience in SQL to write queries. Very good communication skills; must be able to discuss requirements effectively with business owners. Manage documentation of dashboards and ensure junior resources are trained on Tableau to build more developers. Years of experience required: 6 to 10 years of experience in Power BI. Education qualification: B.Tech / M.Tech (Computer Science, Mechanical, Mathematics & Scientific Computing etc.) Degrees/Field of Study required: Master of Engineering, Bachelor of Technology. Required Skills: Data Validation, Data Warehousing (DW), Structured Query Language (SQL), Troubleshooting
Posted 3 weeks ago
2.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Do Research, design, develop, and modify computer vision and machine learning algorithms and models, leveraging experience with technologies such as Caffe, Torch, or TensorFlow. - Shape product strategy for highly contextualized applied ML/AI solutions by engaging with customers, solution teams, discovery workshops and prototyping initiatives. - Help build a high-impact ML/AI team by supporting recruitment, training and development of team members. - Serve as evangelist by engaging in the broader ML/AI community through research, speaking/teaching, formal collaborations and/or other channels. Knowledge & Abilities: - Designing integrations of and tuning machine learning & computer vision algorithms - Research and prototype techniques and algorithms for object detection and recognition - Convolutional neural networks (CNN) for performing image classification and object detection - Familiarity with embedded vision processing systems - Open source tools & platforms - Statistical modeling, data extraction, analysis - Construct, train, evaluate and tune neural networks Mandatory Skills: One or more of the following: Java, C++, Python. Deep learning frameworks such as Caffe, Torch, or TensorFlow, and an image/video vision library like OpenCV, Clarifai, Google Cloud Vision etc. Supervised & Unsupervised Learning: developed feature learning, text mining, and prediction models (e.g., deep learning, collaborative filtering, SVM, and random forest) on big data computation platforms (Hadoop, Spark, HIVE, and Tableau). One or more of the following: Tableau, Hadoop, Spark, HBase, Kafka. Experience: 2-5 years of work or educational experience in Machine Learning or Artificial Intelligence - Creation and application of machine learning algorithms to a variety of real-world problems with large datasets - Building scalable machine learning systems and data-driven products working with cross-functional teams - Working with cloud services like AWS, Microsoft, IBM, and Google Cloud - Working with one or more of the following: Natural Language Processing, text understanding, classification, pattern recognition, recommendation systems, targeting systems, ranking systems or similar. Nice to Have: Contribution to research communities and/or efforts, including publishing papers at conferences such as NIPS, ICML, ACL, CVPR, etc. Education: BA/BS (advanced degree preferable) in Computer Science, Engineering or related technical field, or equivalent practical experience. Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law. Product and Services Sales Manager Mandatory Skills: Generative AI. Experience: 3-5 years.
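As an illustration of the CNN image-classification work this role lists, here is a minimal TensorFlow/Keras sketch. The input size, layer widths, and class count are arbitrary placeholders, not a prescribed architecture.

```python
# Minimal CNN classifier sketch; input shape, layer sizes, and class count are arbitrary.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(num_classes: int = 10) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(64, 64, 3)),          # small RGB images
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_cnn().summary()
```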
Posted 3 weeks ago
4.0 - 5.0 years
3 - 7 Lacs
Vadodara
Work from Office
Job Title: Integration Specialist - LIMS Grade: G11A Department/Group: Global IT/Projects Location: Baroda Job Summary The Integration Specialist - LIMS will be responsible for integrating various lab instruments/software with the LIMS application to successfully establish a paperless environment in QC/R&D labs. This role requires a deep understanding of laboratory workflows, instrument connectivity techniques, strong technical skills, and the ability to work closely with cross-functional teams to ensure the LIMS meets the needs of the organization. Job Description Role and Responsibilities Support QC/R&D lab instrument integration with the LIMS application during LIMS implementation at various locations of SUN Pharma. Develop and execute test scripts and relevant documentation required as part of validation activity. Configure LIMS software to meet the specific needs of the instrument integration activity. Provide training and support to end-users, ensuring they are proficient in using the LIMS. Troubleshoot and resolve any issues related to LIMS performance, functionality, and integration with other systems/instruments. Maintain comprehensive documentation of the LIMS implementation process, including user guides and technical manuals. Shall be aware of documentation requirements for the LIMS application as per GMP. Shall be able to create and manage documentation of the LIMS application to ensure the system is in a compliant state. Qualifications and Preference Qualifications Bachelor's degree in Information Technology, Computer Science, or a related field. Minimum of 4-5 years of experience in LIMS implementation and instrument integration activity, preferably in a pharmaceutical or biotech environment. Strong understanding of laboratory processes and workflows. Deep understanding of lab software functionality such as Empower, LabSolutions, Chromeleon. Strong command of data extraction/connectivity methodologies and techniques for port-based instruments such as balances, pH meters, etc. Proficiency in LIMS software (expertise in CaliberLIMS is preferred). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Experience with regulatory compliance requirements (e.g., FDA, GMP, GLP). Preferred Qualifications Advanced degree in Life Sciences, Information Technology, or a related field. Familiarity with laboratory instruments and their integration with LIMS. Experience with Caliber LIMS is preferred. Good documentation skills to create and manage GxP documents.
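For a sense of what port-based instrument connectivity can look like in practice, here is a small hedged Python sketch using the pyserial library to request a reading from a balance over a serial port. The port name, baud rate, command byte, and response format are hypothetical and entirely instrument-specific; a real integration would follow the vendor's interface manual and the site's validation procedures.

```python
# Hypothetical serial-port capture sketch; port, baud rate, and protocol are placeholders.
import serial  # pyserial

def read_balance(port: str = "COM3", baudrate: int = 9600) -> str:
    """Send a print command to a balance and return the raw response line."""
    with serial.Serial(port, baudrate, timeout=2) as conn:
        conn.write(b"P\r\n")  # command assumed for illustration only
        return conn.readline().decode(errors="ignore").strip()

if __name__ == "__main__":
    print("Balance reading:", read_balance())
```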
Posted 3 weeks ago
2.0 - 5.0 years
2 - 6 Lacs
Gurugram
Work from Office
About Us At SBI Card, the motto "Make Life Simple" inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. What's in it for you: SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, inclusive and diverse team culture. Gender-neutral policy. Inclusive health benefits for all - medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits. Commitment to the overall development of an employee through a comprehensive learning & development framework. Role Purpose This role is responsible for day-to-day operations of the function, including catering to data service requests, data observability, analytics, insight generation and dashboard/MIS requirements, while owning the business data within the SBI Card Data Lake. Role Accountability This role will involve development and implementation of BI solutions that support data-driven decision-making for our credit card business. We are looking for a candidate with a strong background in data analytics and a decent understanding of the financial services industry. Key Responsibilities Data Analysis & Reporting: Collect, analyze, and interpret large datasets to generate actionable insights related to credit card products, customer behavior, and financial performance. Develop and maintain regular reports, dashboards, and key performance indicators (KPIs) that track and analyze business performance. Present data insights to management and relevant teams, ensuring clear and effective communication of findings. Data Visualization & Insights Generation: Design and build interactive dashboards and visualizations using BI tools (e.g., Power BI, Tableau) to enhance data accessibility for stakeholders. Perform ad-hoc analysis to identify business opportunities, trends, and potential risks within the credit card portfolio. Work with the Manager and other stakeholders to identify areas for improvement and provide data-driven recommendations. Data Management & Integrity: Ensure data quality and accuracy by following best practices in data governance, validation, and cleansing. Assist in maintaining and enhancing the company's data warehouse and reporting structures, ensuring that all relevant data is up-to-date and accessible. Identify and troubleshoot data discrepancies or issues and provide solutions to ensure data integrity. Collaboration with Cross-Functional Teams: Collaborate with various departments to understand business needs and provide actionable insights. Support in implementing new BI tools, processes, and technologies across the organization. Partner with business teams to define key metrics and develop reporting structures for ongoing performance monitoring.
Continuous Improvement & Learning: Stay updated on the latest trends, tools, and technologies in business intelligence and analytics. Identify opportunities to automate or streamline data collection, analysis, and reporting processes. Contribute to ongoing process improvements and enhance the BI team s capabilities. Measures of Success Deliver data projects MIS, Reports and dashboards on Time and accurately to drive business decision making Deliver Actionable insights to business for decision making Deliver on data extraction and other service tickets within SLA Technical Skills / Experience / Certifications Proficiency in BI tools (e. g. , Power BI, Tableau, QlikView) and data analysis tools (e. g. , SQL, Python, R). Experience in data visualization and storytelling to communicate insights effectively. Strong knowledge of data warehousing, ETL processes, and database management is a plus Strong understanding of financial metrics and KPIs related to credit card businesses (e. g. , delinquency rates, card utilization, profitability). Strategic & Lateral thinking and capability to come up with new ideas Proficiency with statistical and data software languages Competencies critical to the role Ability to multi-task, work with cross-functional teams Communication & Presentation skills. Analytical Skills and an eye for detail Strategic & Lateral thinking and capability to come up with new ideas Commitment towards continuous learning Strong communication skills, with the ability to present findings to both technical and non-technical stakeholders. Qualification Graduate or Postgraduate in Computer Science, Data Science, Statistics, Data Analytics or related fields. Preferred Industry BFSI
Posted 3 weeks ago
1.0 - 7.0 years
20 - 27 Lacs
Mumbai
Work from Office
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank's Risk Central Team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. Job responsibilities Integrate data from various firm sources into the big data warehouse. Investigate data issues and provide support on data issues. Develop automation for data extraction. Design and tune schemas for data landed on the platform. Partner with information modelling teams on firm-wide logical data models. Serve as the primary subject matter expert (SME) for data in the analytics platform. Develop data quality rules and controls for data. Analyze and solve query performance bottlenecks in cloud-based warehouses like Redshift and AWS Glue. Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years of applied experience. Experience in big data technologies - Apache Spark, Hadoop and analytics. Hands-on coding experience in Java/Python. Experience in designing and developing using Redshift. Strong CS fundamentals, data structures, and algorithms with a good understanding of big data. Experience with AWS application development including services such as Lambda, Glue, ECS/EKS. Excellent communication skills are a must for this position. Experience with Unix/Linux and shell scripting, Redshift, Hive. Preferred qualifications, capabilities, and skills Good understanding of data modelling challenges with big data. Good understanding of financial data, especially in front office investment banking, is a major plus. Ability to code in Apache Spark using Scala is an added advantage.
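As a hedged illustration of the data-quality rules and warehouse-loading work described above, here is a small PySpark sketch that runs basic null and duplicate checks before appending data to a curated location. The bucket paths and key column are hypothetical placeholders.

```python
# Hypothetical data-quality gate; paths and key column are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

trades = spark.read.parquet("s3://example-bucket/landing/trades/")

null_keys = trades.filter(F.col("trade_id").isNull()).count()
dupe_keys = (trades.groupBy("trade_id").count()
             .filter(F.col("count") > 1).count())

if null_keys or dupe_keys:
    raise ValueError(f"DQ failure: {null_keys} null keys, {dupe_keys} duplicated keys")

# Only append to the curated zone once the checks pass
trades.write.mode("append").parquet("s3://example-bucket/curated/trades/")
```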
Posted 3 weeks ago
4.0 - 6.0 years
8 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience Range in Required Skills: Windows PowerShell, MySQL Essential Skills: Good and sound knowledge of PowerShell and SQL. Key Responsibilities: Develop, maintain, and troubleshoot scripts using Windows PowerShell for automation and system tasks. Write, optimize, and manage MySQL/SQL queries for data extraction, manipulation, and reporting. Collaborate with cross-functional teams to understand requirements and deliver efficient automated solutions. Ensure high performance, security, and reliability of developed scripts and database procedures. Maintain technical documentation for scripts, databases, and processes. Continuously monitor and enhance system automation and database performance. Location: Chennai, Pune, Bengaluru, Hyderabad
Posted 3 weeks ago
2.0 - 4.0 years
2 - 6 Lacs
Chennai
Work from Office
- Expertise in computer hardware configuration and customization - Good network and connectivity protocol skills - Expertise in sensors and data extraction from devices/sensors Required Candidate profile - Knowledge of Android and iOS based apps on devices - Exposure to IoT and related areas is preferred - Good communication skills
Posted 3 weeks ago