5.0 - 10.0 years
5 - 10 Lacs
Nagpur, Hyderabad, Pune
Work from Office
As a Senior Technical Consultant, you will participate in all aspects of the software development lifecycle, including estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. Working in a team environment, you will take direction from solution architects and leads on development activities. Perficient is always looking for the best and brightest talent, and we need you! We're a quickly growing, global digital consulting leader, and we're transforming the world's largest enterprises and biggest brands. You'll work with the latest technologies, expand your skills, and become part of our global community of talented, diverse, and knowledgeable colleagues.

Requirements:
- 4+ years of experience in monitoring and troubleshooting
- Proven experience with AWS ETL services such as Glue, Lambda, and Step Functions
- Strong understanding of data warehousing concepts and data modeling principles
- Experience with SQL and scripting languages like Python or Bash for data manipulation and automation
- Experience with monitoring tools
- Excellent communication, collaboration, leadership, and problem-solving skills

Responsibilities:
- Implement secure and scalable data pipelines on AWS utilizing services like Glue, Lambda, Step Functions, and Kinesis
- Work with monitoring tools and support developers with scalable solutions; be diligent and proactive in resolving issues on the fly and escalate to the team as needed
- Optimize data pipelines for performance, scalability, and cost-effectiveness
- Hands-on knowledge of MS SQL and AWS Athena
- Implement data quality checks and monitoring to ensure data integrity throughout the ETL process
- Stay up to date on the latest advancements in AWS data services and ETL best practices
- Troubleshoot and resolve complex data pipeline issues
- Willingness to work in rotating shifts
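The data quality checks mentioned in the responsibilities above can be sketched in plain Python. This is a minimal, illustrative harness, not any specific employer's implementation: the field names, check names, and thresholds are assumptions, and a real AWS pipeline would typically surface failures through a monitoring service or a failed Step Functions state rather than a return value.

```python
def run_quality_checks(rows, required_fields, min_rows=1):
    """Run basic data quality checks on a batch of records.

    Returns a dict mapping check name -> passed (bool). Illustrative
    sketch only; check names and thresholds are invented.
    """
    return {
        # the batch must not be unexpectedly empty or truncated
        "row_count": len(rows) >= min_rows,
        # every record must carry all required fields
        "no_missing_fields": all(
            all(r.get(f) is not None for f in required_fields) for r in rows
        ),
        # primary keys must be unique within the batch
        "no_duplicate_ids": len({r.get("id") for r in rows}) == len(rows),
    }

batch = [
    {"id": 1, "amount": 120.5, "currency": "INR"},
    {"id": 2, "amount": 75.0, "currency": "INR"},
]
checks = run_quality_checks(batch, required_fields=["id", "amount", "currency"])
# all three checks pass for this batch
```

In a pipeline, a failed check would gate the downstream load step rather than just being reported.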
Posted 1 week ago
5.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Design, develop, and maintain end-to-end data pipelines on AWS, utilizing serverless architecture. Implement data ingestion, validation, and transformation procedures using AWS services such as Lambda, Glue, Kinesis, SNS, SQS, and CloudFormation. Write orchestration tasks within Apache Airflow. Develop and execute data quality checks using Great Expectations to ensure data integrity and reliability. Collaborate with other teams to understand mission objectives and translate them into data pipeline requirements. Utilize PySpark for complex data processing tasks within AWS Glue jobs.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Strong proficiency in the Python programming language
- Hands-on experience with AWS services
- Experience with serverless architecture and Infrastructure as Code (IaC) using AWS CDK
- Proficiency in Apache Airflow for orchestration of data pipelines
- Familiarity with data quality assurance techniques and tools, preferably Great Expectations
- Experience with SQL for data manipulation and querying
- Strong communication and collaboration skills, with the ability to work effectively in a team environment
- Experience with Data Lakehouse, dbt, or the Apache Hudi data format is a plus

Mandatory Skills: Python, AWS, Infrastructure as Code (IaC) using AWS CDK, Apache Airflow, SQL, Data Lakehouse, dbt, Apache Hudi
Vendor Proposed Rate (as per ECMS system): 12000 INR/day
Work Location: Anywhere in India
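The Great Expectations checks mentioned above boil down to assertions over column values. The sketch below mimics one such expectation in plain Python; the function name and result shape loosely echo the library's `expect_column_values_to_be_between`, but this is an illustrative stand-in, not the real API, and the event data is invented.

```python
def expect_values_between(rows, column, min_value, max_value):
    """Check that every value of `column` lies in [min_value, max_value].

    Returns a Great-Expectations-style result dict: a success flag plus
    the list of offending values (illustrative shape, not the real API).
    """
    unexpected = [row[column] for row in rows
                  if not (min_value <= row[column] <= max_value)]
    return {"success": not unexpected, "unexpected_values": unexpected}

# A batch with one out-of-range latency reading.
events = [{"latency_ms": 40}, {"latency_ms": 55}, {"latency_ms": 101}]
result = expect_values_between(events, "latency_ms", 0, 100)
# result["success"] is False and 101 is reported as unexpected
```

In an Airflow pipeline, a task running such a check would raise on failure so downstream tasks never see bad data.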
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Willing to work from office (Monday to Friday) in the US EST shift (night shift); suitable candidates will be called to the office for a face-to-face interview. Experience: 2+ years in international outbound calling. Language (fluent): English.

Role Description: The Associate is a highly motivated, personable professional and excellent communicator who focuses on recruiting market research participants via phone, carrying out cold calling to engage new respondents to complete M360 Research projects. The Associate provides high-quality customer service to panel members and other research participants by answering questions and resolving enquiries regarding registration, accounts, study participation, compensation, and more.

Duties and Responsibilities

Project Recruitment:
- Recruit M360 Research panellists via phone and carry out cold calling to recruit respondents for allocated quantitative and qualitative projects
- Carry out online desk research for phone numbers of healthcare professionals and call them to recruit them to participate in market research studies
- Ensure project details are clearly communicated to participants when contacting them over the phone
- Inform the line manager of any recruitment issues, delays, and foreseeable problems that could affect the successful delivery of the project
- Provide insightful and relevant feedback on project feasibility based on market intelligence gathered from talking to respondents over the phone
- Complete phone screening of recruited respondents for telephone and in-facility interviews
- Ensure qualitative respondents are scheduled and confirmed over the phone
- Ensure confirmation letters and consent forms are sent, and complete follow-up calls if needed to chase materials
- Ensure daily call-volume and strike-rate targets are achieved
- Strictly follow phone quality communication parameters and guidelines
- Comply with ISO 20252:2019 and ISO 27001 standards as part of job responsibilities

Willing to work in the US EST shift; the role requires you to support the US (shift time 6:00 PM to 3:00 AM IST).

Panel Engagement:
- Call M360 Research registered respondents who have not confirmed their email account to complete the onboarding process
- Call inactive M360 Research respondents and invite them to reactivate their M360 account

Panel Support:
- Provide high-quality professional support to M360 Research members via telephone and email/support ticket communications
- Master and work across multiple systems to investigate, troubleshoot, and handle enquiries and complaints, providing appropriate solutions and alternatives
- Work effectively with our US and EU Operations / Project Management team members to resolve project-specific issues, acting on behalf of the user while balancing user advocacy and company profitability
- Handle all enquiries according to company policy and expectations regarding outcomes, time to resolution, and communication standards
- Complete the verification process for newly registered panellists and carry out data quality checks for existing panellists
- Build and maintain strong relationships with panel members through timely and professional communication, quality problem solving, and creative thinking
- Provide input and assist with updating communication procedures, guidelines, and enhancements to systems and processes
- Complete panel verifications and provide assistance with market research project setup

Experience, Skills and Qualifications:
- Fluent in English
- 1+ year of experience as a telecaller in a US process or international call centre
- Excellent interpersonal and communication skills, both verbal and written
- Strong comfort level with being on the phone
- Able to work as part of a team and show flexibility in the tasks asked of them
- Independently motivated and inspired by working in a dynamic environment
- Comfortable with change, with the ability to seek opportunity in uncertain conditions
- Analytical and a strategic thinker
- Ensures 100% accuracy and demonstrates excellent attention to detail
- Strong organizational skills and the ability to multitask
- Comfortable working the night shift (US EST time zone)
- Willing to work from office

Visit our company website before the interview: www.m360research.com

Qualifications: Any Graduate/Undergraduate (minimum 12th pass)
Posted 1 week ago
5.0 - 10.0 years
35 - 40 Lacs
Mumbai
Work from Office
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Data Platform Engineering Lead at JPMorgan Chase within Asset and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Lead the design, development, and implementation of scalable data pipelines and ETL batches using Python/PySpark on AWS
- Execute standard software solutions, design, development, and technical troubleshooting
- Use infrastructure as code to build applications that orchestrate and monitor data pipelines, programmatically create and manage on-demand compute resources on the cloud, and create frameworks to ingest and distribute data at scale
- Manage and mentor a team of data engineers, providing guidance and support to ensure successful product delivery and support
- Collaborate proactively with stakeholders, users, and technology teams to understand business/technical requirements and translate them into technical solutions
- Optimize and maintain data infrastructure on the cloud platform, ensuring scalability, reliability, and performance
- Implement data governance and best practices to ensure data quality and compliance with organizational standards
- Monitor and troubleshoot applications and data pipelines, identifying and resolving issues in a timely manner
- Stay up to date with emerging technologies and industry trends to drive innovation and continuous improvement
- Add to a team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 5+ years of applied experience
- Experience in software development and data engineering, with demonstrable hands-on experience in Python and PySpark
- Proven experience with cloud platforms such as AWS, Azure, or Google Cloud
- Good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts
- Experience with, or good knowledge of, cloud-native ETL platforms like Snowflake and/or Databricks
- Experience with big data technologies and services like AWS EMR, Redshift, Lambda, and S3
- Proven experience with efficient cloud DevOps practices and CI/CD tools like Jenkins/GitLab for data engineering platforms
- Good knowledge of SQL and NoSQL databases, including performance tuning and optimization
- Experience with declarative infrastructure provisioning tools like Terraform, Ansible, or CloudFormation
- Strong analytical skills to troubleshoot issues and optimize data processes, working independently and collaboratively
- Experience in leading and managing a team/pod of engineers, with a proven track record of successful project delivery

Preferred qualifications, capabilities, and skills:
- Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks is a plus
- Familiarity with data visualization tools and data integration patterns
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Pearson: At Pearson, we're committed to a world that's always learning and to our talented team who makes it all possible. From bringing lectures vividly to life to turning textbooks into laptop lessons, we are always re-examining the way people learn best, whether it's one child in our own backyard or an education community across the globe. We are bold thinkers and standout innovators who motivate each other to explore new frontiers in an environment that supports and inspires us to always be better.

Background Information: Shared Services, a captive unit based out of Noida, enables positive changes in performance and stakeholder engagement through a centralized operating model. Shared Services is a global function supporting Pearson Higher Education. As a team, we manage a variety of data processes to ensure that data is valid, accurate, and compliant with governed rules. We also provide solutioning to business teams if they require changes in the database or in the functionality of any tool. As content continues to proliferate across multiple emerging digital platforms, our team provides resources to enable scalability and cost containment. We also facilitate collaboration between the business and technology teams who contribute to the products.

Role Description: We are seeking a detail-oriented and analytical professional to join our team in the role of Associate, Data Operations. This role is responsible for ensuring the accuracy, consistency, and integrity of data across systems and workflows. The individual will support data lifecycle management, execute operational processes, and collaborate with cross-functional teams to drive data quality, compliance, and timely delivery.

Key Responsibilities:
- Manage end-to-end data entry, updates, and maintenance across internal platforms and systems
- Monitor data quality, identify anomalies or discrepancies, and take corrective actions as needed
- Support the creation, tracking, and maintenance of item/product/master data or other key business datasets
- Partner with cross-functional teams to ensure timely and accurate data inputs aligned with business rules and timelines
- Document and optimize data operational processes to enhance efficiency and consistency
- Conduct routine audits and validation checks to ensure data compliance with internal standards and policies
- Assist in onboarding new tools or systems related to data operations, including testing and training

Education, Qualifications & Functional Competencies:
- Bachelor's degree in Business, Information Systems, Data Science, or a related field
- 4 years of experience in data operations, data management, or related roles
- Strong proficiency in Excel
- Experience with data entry and governance
- Strong attention to detail and a commitment to data accuracy
- Excellent organizational and communication skills
- Ability to work independently as well as part of a team in a fast-paced environment

Core Behavioural Competencies:
Essential:
- Ability to work collaboratively as a team
- Flexibility to adapt to changes and a strong customer focus
- Good personnel management skills with the ability to understand business processes and execute routine work
- Flexibility to work with international teams across multiple time zones
- Confident, enthusiastic, curious, and results-driven
Desired:
- Flexible to change and open to adopting new ways of working
- Able to work with diverse stakeholders of varied cultural backgrounds

Job: Data Engineering
Job Family: TECHNOLOGY
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Summary: We are seeking a passionate and experienced Big Data Engineer with strong hands-on expertise in Scala, Apache Spark, Kafka, Spark Streaming, and SQL. You will be responsible for building and maintaining real-time and batch data processing pipelines that support our data-driven decision-making across business functions.

Key Responsibilities:
- Design and implement scalable and high-performance data processing pipelines using Apache Spark (core and streaming)
- Develop clean and efficient Scala code to handle large-scale data transformations
- Integrate Apache Kafka for real-time data ingestion and streaming workflows
- Write optimized SQL queries for data transformation and analytics
- Collaborate with data scientists, analysts, and other engineers to deliver robust data solutions
- Ensure data quality, reliability, and consistency across all platforms
- Optimize system performance and troubleshoot data-related issues
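The Spark Streaming work described above centers on windowed aggregations over an event stream. As a language-neutral illustration (the role itself calls for Scala), the tumbling-window count below shows the same idea in plain Python over (timestamp, key) events; the event data is invented, and a real Spark job would express this with `window()` plus a keyed aggregation rather than dicts.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed tumbling windows and
    count occurrences per key within each window. Mirrors the shape of
    a streaming windowed aggregation; sketch only, data is invented."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # each event belongs to exactly one non-overlapping window
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in windows.items()}

events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
counts = tumbling_window_counts(events, window_seconds=10)
# window 0 holds {"click": 2, "view": 1}; window 10 holds {"click": 1}
```

A sliding window differs only in that one event can fall into several overlapping windows.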
Posted 1 week ago
0.0 - 5.0 years
2 - 7 Lacs
Pune
Work from Office
It's fun to work at a company where people truly believe in what they are doing!

Job Description:
Job Summary: The Operations Analyst role provides technical support for the full lifecycle of the electronic discovery reference model (EDRM), including ingestion of data, quality control, document production, and document review projects. The position requires attention to detail, multi-tasking, and analytical skills, as well as someone who works well in a team. The candidate must be able to work under the pressure of strict deadlines on multiple projects in a fast-paced environment.

Essential Job Responsibilities:
- Utilize proprietary and third-party eDiscovery software applications for electronic discovery and data recovery processes
- Load, process, and search client data in many different file formats
- Conduct relevant searches of electronic data using proprietary tools
- Work closely with team members to troubleshoot data issues (prior to escalation to operations senior management and/or IT/Development), research software and/or techniques to solve problems, and carry out complex data analysis tasks
- Provide end-user and technical documentation and training for supported applications
- Communicate and collaborate with other company departments
- Generate reports from various database platforms for senior management
- Generate written status reports for clients, managers, and project managers
- Work closely with internal departments on streamlining processes and developing proprietary tools

Qualifications & Certifications:
- Solid understanding of Windows and all MS Office applications is required
- Basic UNIX skills and an understanding of hardware, networking, and delimited files would be an advantage
- Experience with database applications and knowledge of litigation support software is desirable
- Strong analytical and problem-solving skills are essential for this role
- Demonstrated ability to work in a team environment, follow detailed instructions, and meet established deadlines
- A self-starter with the ability to visualize data and software behavior and coordinate the two
- Fluency in English (verbal and written) is required
- Bachelor's degree or final-year student, preferably in a computer/technical or legal field, or an equivalent combination of education and/or experience, required

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Posted 1 week ago
9.0 - 11.0 years
35 - 40 Lacs
Gurugram
Work from Office
Some careers have more impact than others. We are currently seeking an experienced professional to join our team in the role of Assistant Vice President - HBUK Regulatory Reporting.

Principal responsibilities:
- Production and finalization of non-counterparty credit risk (NCCR) risk-weighted assets (RWA) in Finance on the Cloud (FOTC), analyzing period-on-period RWA and Excess EL variances and preparing the consolidated commentary to support the internal review and sign-off process
- Production and finalization of capital and leverage exposures for PRA reporting
- Reconciliation between various systems/templates/cross functions
- Submission of PRA101, COREP Own Funds, Leverage, STDF (Actuals), and Pillar 2 on a periodic basis to the PRA; also required to contribute to external disclosures such as Pillar 3 and the Annual Report and Accounts
- Preparation of senior management review-and-challenge (sign-off) packs to obtain sign-off from business heads (CFO and CRO) for regulatory/external reporting purposes
- Supporting ad-hoc analysis and ensuring timely submission of management reports and statutory reports to different stakeholders
- Understanding of control frameworks and how to implement effective controls in practice; continuous effort is required to ensure adherence to the controls framework and to document limitations and control weaknesses in a timely and effective manner
- Working collaboratively with various stakeholders to remediate data quality issues and system issues and to perform UATs on a timely basis; also working towards automating/smoothing the reporting activities
- Assist the Head of Reporting Operations in developing a deep pool of talent with an understanding of technical financial and regulatory pronouncements
- Provide an understanding of how technical accounting and reporting translates into operational processes
- Promote a culture of continuous innovation, challenge the business on approach, and apply knowledge of relevant latest developments
- Adopt new ways of working, such as Agile, particularly in respect of change activities, and encourage the adoption of new technology within the reporting teams
- Work closely with various stakeholders, such as Data Operations, Finance Change Delivery, and Accounting and Regulatory Policy, to understand, plan, and deliver change initiatives, including new reporting requirements

Requirements:
- Qualified CA (Chartered Accountant), MBA (Finance), or an Engineering degree with an interest in financial services
- Understanding of how to review large data sets and draw out strategic insights, as well as interpret data and identify trends/anomalies, particularly in the context of business, macro-economic, or regulatory drivers, along with the ability to work quickly and accurately to tight deadlines
- Knowledge of and/or interest in Asset Liability Management, FSA Reporting, and relevant regulatory changes in a global environment is advantageous
- Understanding of control frameworks and how to implement effective controls in practice
- Working knowledge of banking systems and understanding of the relevant regulatory reporting framework preferred but not essential
- Good understanding of financial products and general accounting principles, and more generally the banking business, balance sheet, and income statement, preferred but not essential
- Strong attention to detail and a solution-oriented mindset
- Excellent working knowledge of MS products, i.e. PowerPoint and Excel
Posted 1 week ago
0.0 - 5.0 years
14 - 16 Lacs
Bengaluru
Work from Office
A Data Analyst is responsible for gathering, analyzing, and problem solving as it relates to data, types of data, and relationships among data elements within a business system or IT system, combined with business domain expertise. A Data Analyst provides expertise on how business workflows map to data and how data can be integrated to build reusable data products, and serves as a subject matter expert for delivery teams. The Data Analyst will support the analysis and visualization of data to provide valuable insights that drive business decisions. This role requires a strong understanding of data processing, reporting, and visualization tools, as well as a good working knowledge of Microsoft Azure services and the Power Platform. The ideal candidate will have hands-on experience with data modeling, creating dashboards, and leveraging cloud technologies for data management.

Key responsibilities:
- Analyze data to extract meaningful insights, trends, and patterns that can support decision-making across the organization
- Develop and maintain dashboards and visualizations using Power BI, Spotfire, and Tableau to present data insights in an easy-to-understand format for stakeholders
- Work with Azure Synapse Analytics, Azure Data Lake, and Azure SQL Database to store, transform, and retrieve data efficiently
- Assist in data preparation, cleaning, and integration activities for analysis, ensuring high data quality
- Collaborate with data engineers, business stakeholders, and IT teams to understand data requirements and contribute to the delivery of data-driven solutions
- Leverage Power Platform tools to automate tasks, create workflows, and enhance data analysis processes
- Provide support for ongoing analysis and maintain documentation for data processes, reports, and dashboards

Required Qualifications:
- Experience: 0-5 years overall, with proven experience in data analysis, data visualization, or related fields
- Bachelor's degree in Data Analytics, Computer Science, Statistics, or a related field (or equivalent experience)
- Technical Skills: Proficiency in Microsoft Azure services, including Azure Synapse Analytics, Azure Data Lake, and Azure SQL Database; proficiency in data visualization tools such as Power BI, Spotfire, and Tableau; strong skills in data manipulation and transformation using SQL; familiarity with the Microsoft Power Platform, including Power Apps, Power Automate, and Power BI; data modeling techniques, data fluency, and governance; proficiency in oil & gas workflows and data types
- Data Analysis: Understanding of data analysis concepts and experience in data modeling and building insightful reports
- Problem Solving: Strong analytical and problem-solving skills, with attention to detail and accuracy
- Communication Skills: Ability to communicate complex data findings clearly and concisely to non-technical stakeholders
- Team Collaboration: Ability to work effectively as part of a team, sharing knowledge and supporting other team members; excellent communication and collaboration skills, with a demonstrated ability to build trusted working relationships with remote peers and stakeholders (internal and external) across global teams and time zones
- Experience with SQL and data querying languages
- Experience with Python for data wrangling
- Fundamental knowledge of Information Risk Management (IRM)

Preferred Qualifications:
- Microsoft Power BI Data Analyst certification (PL-300)
- Ability to discover and prepare datasets into clean, understandable, usable datasets for data analysis
- Cloud Certifications: certifications in Microsoft Azure (e.g., Azure Data Fundamentals, Azure Data Engineer Associate) are preferred
- Experience with ETL Tools: exposure to ETL tools and data integration techniques
- Advanced Analytics: experience with or knowledge of machine learning or advanced analytics is a plus
- Scripting Languages: basic understanding of scripting languages such as Python or R for data analysis

Chevron participates in E-Verify in certain locations as required by law.
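The SQL manipulation skills listed above come down to queries like the grouped aggregation below. This sketch uses Python's standard-library sqlite3 so it is self-contained; the table and column names are made up purely for illustration (the posting's actual stack is Azure SQL Database).

```python
import sqlite3

# Build a throwaway in-memory table with illustrative sales data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 100.0), ("North", 50.0), ("South", 75.0)],
)

# The kind of aggregation an analyst runs before charting the result.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
# totals == [("North", 150.0), ("South", 75.0)]
conn.close()
```

The same `GROUP BY` shape carries over directly to Azure SQL Database or Synapse; only the connection layer changes.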
Posted 1 week ago
4.0 - 8.0 years
3 - 7 Lacs
Noida, Bengaluru
Work from Office
Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology.

About the Role: The business analyst focuses on data, statistical analysis, and reporting to help investigate and analyze business performance, provide insights, and drive recommendations to improve performance.

Expectations:
1. Drive business insights from data with a focus on driving business-level metrics.
2. Ability to interact with and convince business stakeholders.
3. Develop insightful analysis about the business and its strategic and operational implications.
4. Partner with stakeholders at all levels to establish current and ongoing data support and reporting needs.
5. Analyze data from multiple angles, looking for trends that highlight areas of concern or opportunity.
6. Design, create, and deliver data reports, dashboards, and extracts, and/or deliver presentations answering strategic questions.
7. Identify data needs and drive data quality improvement projects.

Key Skills Required:
1. Ideally 2-5 years of experience working in data analytics and business intelligence; candidates from B2C consumer internet product companies are preferred.
2. Proven work experience with MS Excel, Google Analytics, SQL, Data Studio, and any BI tool, in a business analyst or similar role.
3. Comfortable working in a fast-changing and ambiguous environment.
4. Critical thinking and strong attention to detail.
5. In-depth understanding of datasets, data, and the business.
6. Capable of demonstrating good business judgement.

Education: Applicants must have an engineering academic background with specialization in data science.

Why join us: We aim to bring half a billion Indians into the mainstream economy, and everyone working here is striving to achieve that goal. Our success is rooted in our people's collective energy and unwavering focus on the customers, and that's how it will always be. We are the largest merchant acquirer in India.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it. India's largest digital lending story is brewing here. It is your opportunity to be a part of the story!

Location: Noida, Uttar Pradesh; Bangalore, Karnataka
Posted 1 week ago
7.0 - 9.0 years
9 - 12 Lacs
Chennai
Work from Office
Position Purpose: As a member of the WM IT DCIO team, the candidate will work on WM Data Governance workstreams as prioritized (Data Quality, Data Literacy, Data Lineage/Architecture, Data Privacy by Design, and Data Protection workstreams). Support the WM IT DCIO team in driving data initiatives transversally with WM Leadership, Application Development & Maintenance (ADM), Engineering & Production (E&P), and Security (CISO) teams. Support the development and testing of software/applications for data management. Note: DCIO complements the Chief Data Office (CDO) within the Wealth Management IT organization.

Responsibilities

Direct Responsibilities:
- Work closely with the WM IT DCIO team to execute the Data Governance implementation across data initiatives, e.g., RAMI (Data Retention), Data Privacy by Design, Data Quality, etc.
- Create and test proofs of concept / solutions to support the strategic evolution of the software applications
- Act as the Data Governance SME within Wealth Management, operationally working with the Data Custodian IT Officer (DCIO), DPO (Data Protection Officer), and CDO (Chief Data Officer) teams
- Be hands-on with the development, testing, configuration, and deployment of software systems in the Data Transversal organization
- Operationalize data policies/frameworks, including business glossaries, data dictionaries, data profiling, etc.

Technical & Behavioral Competencies:
- Minimum 7+ years of experience with data expertise (at least 2 of the following: Data Governance, Data Quality, Data Privacy & Protection, Data Management)
- Bachelor's degree in Engineering (Computer Science or Electronics & Communications)

Qualifications:
- Hands-on experience working with data (data profiling, scorecards/BI)
- Previous work in Data Governance and Data Security
- Knowledge of financial services products and applications
- Working knowledge across Excel, SQL, Python, Collibra, Power BI, and cloud
- Plus: Collibra Developer, Ranger Certified, or a similar certification is preferred

Skills required:
- Knowledge of data and the compliance/regulatory environment (global and local data regulations)
- Flexibility and willingness to accept assignments and challenges in a rapidly changing environment
- Understanding of how data is used (e.g., analytics, business intelligence, etc.)
- Working knowledge of the data lifecycle and data transformations / data lineage
- At least 2 of the following: Data Quality, Data Architecture, Database Management, Data Privacy & Protection, Data Security
- Ability to define relevant key performance indicators (KPIs)
- Problem solving and team collaboration
- Self-motivated and results-driven
- Project management and business analysis
- Agile thinking

Transversal skills:
- Proficient in designing new processes and adapting Group IT processes to Wealth Management IT
- Strong communication to collaborate across stakeholders and support change
- Minimum 7 years of experience in Data/Tech
Posted 1 week ago
4.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Responsibilities: Design, develop, and maintain scalable data pipelines using modern frameworks and tools. Build data models and transform raw data into clean, usable formats for analytics and machine learning. Collaborate with analysts, data scientists, and product teams to gather requirements and deliver data solutions. Ensure data quality through testing, validation, and monitoring. Optimize data workflows for cost, performance, and reliability. Document data flows, definitions, and technical decisions clearly and consistently. Contribute to the evolution of data engineering best practices and internal tooling.
Qualifications: 4-6 years of hands-on experience in data engineering or related fields. Proficient in Python or Scala and advanced SQL. Experience with orchestration tools such as Airflow, DBT, or similar. Familiar with modern cloud data platforms (e.g., Snowflake, Databricks, Redshift, BigQuery). Solid understanding of ETL/ELT practices, data modeling, and performance tuning. Exposure to data governance, cataloging, and security concepts. Experience working with cloud services (AWS, Azure, or GCP). Strong problem-solving skills and the ability to work both independently and collaboratively.
Data Engineer, Scala, ETL, Airflow, SQL, DBT
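The "ensure data quality through testing, validation, and monitoring" responsibility above can be sketched as a small pipeline step. This is a hedged, illustrative sketch: the column names (`id`, `amount`) and the range rule are assumptions, not a reference to any specific employer's schema.

```python
# Minimal sketch of a data-quality validation step in a pipeline.
# Column names and validation rules are illustrative assumptions.

def validate_rows(rows, required=("id", "amount"), amount_range=(0, 1_000_000)):
    """Split rows into (valid, rejected), recording a reason per rejection."""
    valid, rejected = [], []
    lo, hi = amount_range
    for row in rows:
        missing = [c for c in required if row.get(c) is None]
        if missing:
            rejected.append((row, f"missing: {missing}"))
        elif not (lo <= row["amount"] <= hi):
            rejected.append((row, "amount out of range"))
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = validate_rows([
    {"id": 1, "amount": 250.0},
    {"id": 2, "amount": None},   # rejected: missing amount
    {"id": 3, "amount": -5.0},   # rejected: out of range
])
```

In a real pipeline the rejected rows would feed the monitoring/alerting side rather than being silently dropped.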
Posted 1 week ago
6.0 - 9.0 years
27 - 42 Lacs
Pune
Work from Office
Job Summary
The Sr. Developer role focuses on designing, developing, and implementing data solutions using IBM InfoSphere DataStage and PL/SQL. With a hybrid work model, the candidate will contribute to optimizing data warehousing processes and ensuring seamless data integration. This position requires a minimum of 6 years of experience in ETL and scheduling basics, with a preference for expertise in the account management and cash management domains.
Responsibilities
Design and develop efficient ETL processes using IBM InfoSphere DataStage to ensure seamless data integration and transformation. Implement data warehousing concepts to optimize storage and retrieval processes, enhancing overall data management efficiency. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for data solutions. Oversee the scheduling of data processes to ensure timely execution and delivery of data insights to stakeholders. Provide technical expertise in PL/SQL to develop robust database solutions that support business operations. Analyze existing data systems and recommend improvements to enhance performance and scalability. Ensure data quality and integrity by implementing rigorous testing and validation procedures throughout the development lifecycle. Troubleshoot and resolve data-related issues to maintain smooth operations and minimize downtime. Document technical processes and solutions to facilitate knowledge sharing and support future development efforts. Stay updated with industry trends and advancements in data integration technologies to drive innovation within the team. Support the hybrid work model by effectively managing tasks both remotely and on-site, ensuring consistent productivity. Communicate effectively in English to collaborate with team members and present findings to stakeholders. Contribute to the company's purpose by developing data solutions that enhance decision-making and impact society positively.
Qualifications
Possess strong technical skills in data warehousing concepts, ETL, and data integration, with hands-on experience in IBM InfoSphere DataStage. Demonstrate proficiency in PL/SQL for developing and optimizing database solutions. Have a solid understanding of scheduling basics to manage data processes efficiently. Experience in the account management and cash management domains is a plus, providing valuable insights into financial data handling. Exhibit excellent communication skills in English, both written and spoken, to collaborate effectively with team members. Adapt to the hybrid work model, showcasing flexibility and productivity in both remote and on-site settings. Show a proactive approach to learning and applying new technologies to enhance data solutions.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
Position Purpose
Located within the RISK Function of BNP Paribas (BNPP), the role of the Data Protection Correspondent (DPC) is to ensure that the components of the operational risk management framework are implemented and operating effectively within ISPL, and to provide RISK ORM management and Business senior management with relevant, synthetic, transparent, exhaustive, and consistent information and a front-to-back view of operational risk across ISPL activities. To achieve this objective, this 2nd line of defense (LOD2) role works closely with RISK ORM Regional and Central teams and with ISPL management and stakeholders. The DPC provides expertise on personal data protection topics in accordance with the relevant RACI. The India DPC must assist the India Data Protection Officer (DPO) in supervising the compliance of projects with legal and regulatory personal data protection requirements throughout the APAC region, as well as with the Group's and APAC personal data protection policies. The RISK ORM ISPL mandate is to independently challenge and supervise the operational risk management framework of ISPL activities, as described in the level 2 procedure "Organizational framework and governance for Operational Risk Management & Permanent Control Framework". This includes control framework adequacy checks, independent challenge, proximity with the business, and contribution to the sign-off process on key decisions. The DPC is to ensure second-level controls by providing the required supervision and assistance to the 1st Line of Defense. Due to the global and regional models applied by the BNP Paribas (BNPP) activities outsourced to ISPL, the role also covers contribution to reviews, control testing, analysis, and reports carried out under the supervision of the APAC DPO Regional teams.
Responsibilities
Direct Responsibilities
To contribute to the realization of relevant personal data protection activities. To guarantee the definition and application of the required norms and methods for a sound apprehension of the company's data protection risks (follow-up of projects, information systems adaptation, design and maintenance of declarations, analysis of subcontractor contracts, follow-up on control-plan reporting, etc.). To provide advice and assistance to ongoing strategic programs. To support the implementation of the privacy strategy defined by the DPO. To assist the DPO in the supervision and monitoring of the implementation of the Group's Data Protection policies and guidelines, bearing the local regulatory requirements in mind, to ensure consistency. To define related action plans and corrections, and to ensure their application. To alert the DPO when the activity is under operational risk (e.g., mismatch between needs and resources), to propose corrective solutions, and to implement those solutions. To contribute to continuous efficiency improvement and to any optimization process. To contribute to operational collaborative activities. To support and assist the APAC DPO team with control campaigns and typical DPO and RISK ORM activities in BAU (e.g., RCSA check & challenge, data breach assessments, project and third-party risk assessment support; see below), but also in case of emergencies and escalated issues. To contribute to permanent control actions. To contribute to performing LOD2 controls and challenging LOD1. To contribute to the check and challenge of the RCSA. To contribute to the RISK ID exercise. To contribute to the OR&C report. To ensure professional network development. To participate in local Data Protection Committees when requested by the DPO. To contribute to the Internal Control Committee. To collaborate with local CROs and RISK teams.
Contributing Responsibilities
To assist the DPO on exchanges with the authorities in charge of the protection of personal data, under the responsibility of the DPO. To assist the DPO in the supervision and implementation of Privacy by Design principles throughout the lifecycle of all projects, activities, products, services, processes, and systems. To contribute to role development by validating data protection requirements for new activities, products, services, or specific operations, and to provide technical assistance. To receive, process, and advise on internal and external local solicitations about data protection. To receive, process, and advise on requests from data subjects, subcontractors, partners, etc. To itemize existing processes and identify breaches of data protection requirements using broad knowledge of APAC-wide local regulation (at minimum: India's new DPDPA and GDPR requirements). To contribute to performing risk assessments on personal data breaches. To assist the DPO in monitoring documentation, e.g.
the RoPA (Register of Processing Activities). To contribute to the identification and notification process for data protection violations according to defined procedures and local legal requirements. To verify the effectiveness of data protection controls and to ensure the expected reporting. To ensure regular reporting to the DPO about the activity. To contribute to the creation and implementation of awareness programs and to the promotion of a culture of protection of personal data within the scope of responsibility.
* DPO may refer to the India DPO, APAC DPO, or Business Line DPO as the case may be, reflecting a matrix organization while maintaining a direct reporting line to the India DPO.
Technical & Behavioral Competencies (Level*: 1 = Deep, 2 = Intermediary, 3 = Basic)
Knowledge (required to exercise the position):
- To know standards and norms about data protection (Level 1)
Know-how (implementation of techniques, methods, and tools to achieve activities):
- Techniques: To know how to assess the maturity level of the existing Data Privacy setup (Level 1)
- Transverse: To hold a professional face-to-face or phone discussion with an overseas colleague (Level 1); To prioritize (Level 1); To efficiently manage several topics at the same time (Level 1); To issue advice / recommendations considering every parameter (Level 1); To communicate efficiently when speaking (Level 1)
- Tools: To work with BNP Paribas tools (e.g. Data Protection Hub, RISK360) (Level 2)
Behavioral and soft skills:
- To efficiently multi-task across topics while maintaining attention to detail / rigor (Level 1)
- To issue advice / recommendations considering all parameters (Level 1)
- To have efficient communication skills, oral & written (Level 1)
- To conceptualize / formalize an idea, a process, or a project (Level 2)
- To work as a team / transversally (Level 1)
- To identify and analyse risks for the activities handled (Level 1)
- To assess and issue an opinion (Level 1)
- To deploy a strategy and define an action plan (Level 2)
- To animate resources and coordinate their intervention (Level 1)
- To show diplomacy so that a message is heard (Level 1)
- To show conviction and generate acceptance from interlocutors (Level 1)
- To anticipate and come up with ideas (Level 2)
- Creativity and innovation (Level 2)
- To show discretion about delicate and / or confidential topics (Level 1)
- Ability to manage conflict (Level 2)
- To integrate the multicultural dimension (Level 1)
Specific Qualifications: Legal background with IAPP Certification (CIPP/E) or equivalent.
Skills Referential. Behavioural Skills (up to 4): Communication skills - oral & written; Attention to detail / rigor; Creativity & Innovation / Problem solving; Client focused. Transversal Skills (up to 5): Analytical ability; Ability to develop and leverage networks; Ability to develop and adapt a process; Ability to understand, explain and support change; Ability to set up relevant performance indicators.
Education Level: Bachelor's Degree or equivalent. Experience Level: At least 5 years.
Business Skills: 1. Data Protection 2. Risk knowledge and awareness 3. Risk anticipation 4. Data quality & security 5. Regulatory 6. Business analytics 7. New technologies and digital law [IT/IP] 8. IT risk and cyber security.
Posted 1 week ago
2.0 - 6.0 years
5 - 8 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Description KPI Partners is seeking an enthusiastic and skilled Data Engineer specializing in STIBO (STEP) development to join our dynamic team. As a pivotal member of our data engineering team, you will be responsible for designing, developing, and implementing data solutions that meet the needs of our clients. This role requires a strong understanding of data management principles along with technical expertise in the STIBO STEP platform. Key Responsibilities - Design and develop data models and solutions using STIBO STEP for effective Master Data Management (MDM). - Collaborate with data architects, data analysts, and business stakeholders to gather requirements and translate them into technical specifications. - Implement and maintain ETL processes for data extraction, transformation, and loading to ensure data integrity and reliability. - Optimize data pipelines and workflows for performance and efficiency. - Monitor data quality and implement best practices for data governance. - Troubleshoot and resolve technical issues related to STIBO STEP development and data processes. - Provide technical support and guidance to team members and stakeholders regarding best practices in data management. Qualifications. - Bachelor’s degree in Computer Science, Information Technology, or a related field. - Proven experience as a Data Engineer or in a similar role, with a focus on STIBO (STEP) development. - Strong understanding of Master Data Management concepts and methodologies. - Proficiency in data modeling and experience with ETL tools and data integration processes. - Familiarity with database technologies such as SQL Server, Oracle, or PostgreSQL. - Excellent problem-solving skills and the ability to work independently as well as part of a team. - Strong communication skills to effectively collaborate with technical and non-technical stakeholders. - Experience with data visualization tools is a plus. What We Offer. 
- Competitive salary and performance-based incentives. - Opportunity to work on innovative projects in a collaborative environment. - Professional development and training opportunities to enhance your skills. - A flexible work environment that promotes work-life balance. - A vibrant company culture that values creativity and teamwork. If you are passionate about data engineering and want to play a crucial role in shaping our clients' data strategies, we would love to hear from you! Apply now to join KPI Partners in delivering impactful data solutions. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 week ago
4.0 - 9.0 years
25 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Exp: 4-7 yrs. Skillset: IDMC (preferred) or any other DG tool, SQL, ETL. Location: Gurgaon (preferred), Bangalore, Pune. NP: URGENT.
Work collaboratively with the customer's onshore team to support the following initiatives: Interface with business stakeholders, understand their data and analytics needs, establish requirements with technical stakeholders, and align on the delivery plan. Understand various data sources around asset classes, portfolios, historical performance, market trends, etc., and develop/enhance data documentation. Help deliver data-driven analysis and recommendations that effectively influence business decisions. Extract data, perform data cleansing / data quality checking tasks, prepare data quality reports, and produce model-ready data.
Candidate Profile: Over 4 years of experience in data analytics, governance, and business analysis. Strong understanding of data analytics and ability to derive actionable insights. Skilled in developing strategic project roadmaps and aligning data initiatives with business goals. Proactive in proposing suggestions and providing regular project updates to stakeholders. Hands-on experience with data governance frameworks; Collibra knowledge helpful but not mandatory. Strong comprehension of metadata strategies and real-world use cases. Excellent communication skills and ability to work across business and technical teams. Familiar with the technology stack: SQL, Snowflake, Power BI. Experience with iceDQ (a plus). Understanding of investment fundamentals is a valuable asset. Detail-oriented, self-motivated, and adept at cross-functional collaboration.
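The "prepare data quality reports" task mentioned above can be sketched without any governance tool. This is an illustrative, tool-agnostic sketch; the column names are assumptions, and a real IDMC or Collibra workflow would produce a far richer report.

```python
# Hedged sketch of a data-quality report: per-column null counts
# plus a duplicate-row count. Column names are illustrative only.
from collections import Counter

def quality_report(rows, columns):
    """Summarize nulls and duplicates over a list of dict records."""
    nulls = Counter()
    seen, duplicates = set(), 0
    for row in rows:
        for col in columns:
            if row.get(col) in (None, ""):
                nulls[col] += 1
        key = tuple(row.get(c) for c in columns)
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"rows": len(rows), "nulls": dict(nulls), "duplicates": duplicates}

report = quality_report(
    [{"asset": "EQ", "nav": 101.2},
     {"asset": "EQ", "nav": 101.2},     # exact duplicate
     {"asset": "", "nav": None}],        # two null-ish fields
    columns=("asset", "nav"),
)
```

The resulting dict is the kind of summary a data-quality report or dashboard would be built from.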
Posted 1 week ago
12.0 - 14.0 years
20 - 27 Lacs
Bengaluru
Work from Office
Job Description & Summary: We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams.
Responsibilities: 1. Design and implement end-to-end data pipelines using cloud-based services (AWS/GCP/Azure) and conventional data processing frameworks. 2. Lead the development of data architecture, ensuring scalability, security, and performance. 3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes. 4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks. 5. Ensure data quality, integrity, and security across all data pipelines and architectures. 6. Provide technical leadership and guidance to junior team members. 7. Design and implement data load strategies, data partitioning, and data storage solutions. 8. Collaborate with stakeholders to understand business requirements and develop data solutions to meet those needs. 9. Work closely with the DevOps team to ensure seamless integration of data pipelines with the overall system architecture. 10. Participate in the design and implementation of CI/CD pipelines for data workflows.
DevOps Requirements: 1. Knowledge of DevOps practices and tools, such as Jenkins, GitLab CI/CD, or Apache Airflow. 2. Experience with containerization using Docker. 3. Understanding of infrastructure as code (IaC) concepts using tools like Terraform or AWS CloudFormation. 4. Familiarity with monitoring and logging tools, such as Prometheus, Grafana, or the ELK Stack.
Requirements: 1. 12-14 years of experience in data architecture, data warehousing, and big data processing for the Senior Data Architect role. 2. Strong expertise in cloud-based technologies (AWS/GCP/Azure) and data processing frameworks (PySpark, Kafka, Flink, Beam, etc.). 3. Experience with data ingestion, data extraction, data warehousing, and data lakes. 4. Strong understanding of performance optimization, data partitioning, and data storage solutions. 5. Excellent leadership and communication skills. 6. Experience with NoSQL databases is a plus.
Mandatory skill sets: 1. Experience with agile development methodologies. 2. Certification in cloud-based technologies (AWS/GCP/Azure) or data processing frameworks. 3. Experience with data governance, data quality, and data security.
Preferred skill sets: Knowledge of Agentic AI and GenAI is an added advantage.
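The "data partitioning" requirement above rests on one core idea: route each record to a fixed partition by hashing a key, so partitions can be stored and processed independently. A minimal, framework-free sketch (the key field `customer_id` and partition count are assumptions; Spark, Kafka, and warehouse engines apply the same principle internally):

```python
# Illustrative hash-partitioning sketch. A stable digest (md5) is used
# instead of the built-in hash(), which is salted per process run.
import hashlib
from collections import defaultdict

def partition_for(key: str, num_partitions: int = 4) -> int:
    """Map a key to a partition index in [0, num_partitions)."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

def partition_records(records, key_field="customer_id", num_partitions=4):
    """Group records into buckets keyed by partition index."""
    parts = defaultdict(list)
    for rec in records:
        parts[partition_for(str(rec[key_field]), num_partitions)].append(rec)
    return dict(parts)

parts = partition_records([{"customer_id": i} for i in range(100)])
```

Because the digest is stable, the same key always lands in the same partition across runs, which is what makes partition pruning and co-located joins possible.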
Posted 1 week ago
5.0 - 8.0 years
5 - 12 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Detailed JD *(Roles and Responsibilities) Understand the current data governance structure of the organization and draft a data governance charter and operating model with roles and responsibilities on levels of operating model. Create a glossary with terms and definitions, mapping between logical elements and physical elements, and simple and complex relations for mapping. Set up Collibra communities, domains, types, attributes, status, articulation, workflow, and customize attribution. Identify and prioritize data domains for data governance based on business value and ease of implementation. Define roles and responsibilities governing communities and assets within the Collibra environment. Recommend and implement workflows to govern metadata. Engage with client SMEs to identify key business terms from shared documents to be showcased in Collibra as part of the business glossary. Identify key attributes like definition, criticality, security classification, purpose, etc., associated with the business terms. Create templates to gather the information required about business term attributes and technical metadata. Automate the manual data demand process by configuring and implementing workflows. Create end-to-end lineage in Collibra DGC based on the analysis performed and display the lineage in visual format for business users. Document best practices and provide training to stakeholders. Mandatory skills* Collibra is the mandatory skill
Posted 1 week ago
6.0 - 10.0 years
22 - 25 Lacs
Mumbai, Hyderabad
Work from Office
About the role
As Master Data Management Manager, you will manage a cluster of technology platforms, continuously evaluate technology solutions, and induct an innovative technology stack to drive business excellence at ICICI Bank. You will work along with cross-functional business teams to create technology solutions by leveraging digital and data capabilities and inducting new-age technologies. In this role, you will have opportunities to ideate, develop, manage, maintain, and improve our digital offerings as well as our internal tools/platforms.
Key Responsibilities
Design and Develop: Design and develop customized MDM code based on specifications. Customize MDM using features such as extensions, additions, business proxies, rules, and services. Support: Design, develop, and conduct unit test cases for new releases of software components within the MDM repository. Be Up-to-Date: Committed to learning and expanding professional and technical knowledge in master data management processes and tools, data modeling, and data integration.
Key Qualifications & Skills
Educational Qualification: B.E/B.Tech/M.E/M.Tech with 6 to 10 years of relevant experience in IBM InfoSphere Master Data Management. Expertise in IBM's MDM version 11.x product, with hands-on implementation experience, preferably in the banking domain. Experience with IBM MDM customization, including extensions, additions, business proxies, SDP, and match-merge rules, as well as the Event Manager. Support: Provide support in understanding OSGi architecture in terms of MDM customization and deployment, and how to troubleshoot failures. Expert Java development background with RSA/RAD, MDM Workbench, SOAP Web Services, XML, XSD, and WSDL. Communication skills: Excellent oral and written communication skills.
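The match-merge rules mentioned above are the heart of MDM: records that represent the same real-world entity are matched on a key and merged into one "golden" record. The sketch below only illustrates the idea in plain Python; it is not IBM MDM's implementation, and matching on normalized name plus date of birth is an assumed rule for demonstration.

```python
# Simplified match-merge sketch (not the IBM MDM engine): match on
# normalized name + DOB, keep the newest non-null value per field.

def match_key(record):
    """Assumed match rule: case/whitespace-normalized name plus DOB."""
    name = " ".join(record["name"].lower().split())
    return (name, record["dob"])

def match_merge(records):
    """Collapse records sharing a match key into golden records."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        key = match_key(rec)
        merged = golden.get(key, {})
        # Later (higher "updated") non-null values win.
        merged.update({k: v for k, v in rec.items() if v is not None})
        golden[key] = merged
    return list(golden.values())

masters = match_merge([
    {"name": "Asha  Rao", "dob": "1990-01-01", "phone": None, "updated": 1},
    {"name": "asha rao", "dob": "1990-01-01", "phone": "98765", "updated": 2},
])
```

Real MDM engines add survivorship rules per attribute, probabilistic matching scores, and stewardship queues for ambiguous matches; the collapse-by-key structure stays the same.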
Posted 1 week ago
5.0 - 15.0 years
7 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Health Care - Data Engineer Architect
Location: Chennai, Bangalore, Hyderabad. Experience: 5-15 years.
Job Summary: We are seeking a highly experienced Healthcare Data Architect to design and manage robust healthcare data systems that meet regulatory requirements, enable interoperability, and support advanced analytics. The ideal candidate will bring over a decade of expertise in healthcare IT, data modeling, and cloud-based solutions to architect scalable, secure systems that serve EHRs, population health, claims, and real-time analytics use cases.
Mandatory Skills: Data Architecture (healthcare domain); HL7, FHIR, EHR/EMR systems; Data Warehousing & Data Lakes; Cloud Platforms (AWS, Azure, GCP); Data Governance & MDM; SQL, NoSQL, Python, Spark.
Key Responsibilities: Architect and implement scalable healthcare data platforms. Ensure compliance with HIPAA, GDPR, and other healthcare regulations. Design data models for EHR, claims, and clinical data. Optimize ETL pipelines and manage data flow integrity. Lead data warehouse and data lake development. Drive interoperability through standards like HL7 and FHIR. Implement data governance and quality frameworks.
Qualifications: Bachelor's or Master's in Computer Science, Information Systems, or a related field. Certifications in AWS/Azure and healthcare standards (HL7/FHIR) preferred. Minimum 10 years in data architecture, with 5+ years in the healthcare domain. Proven track record in implementing full-cycle data solutions and governance.
Technical Skills: SQL, Python, Spark, Java; HL7, FHIR, CCD, JSON, XML; Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory), GCP; BI: Power BI, Tableau; Data modeling tools: Erwin, Enterprise Architect.
Soft Skills: Strong analytical and problem-solving ability. Excellent communication & stakeholder engagement. Team leadership and mentoring. Adaptability in fast-paced environments.
Good to Have: Experience with AI/ML in healthcare pipelines. Familiarity with population health & claims analytics. Regulatory reporting experience (CMS, NCQA).
Benefits: Competitive salary + performance incentives. Comprehensive health insurance & wellness programs. Learning and development allowance. Remote/hybrid flexibility. ESOPs for senior leadership (if applicable).
Key Result Areas (KRAs): Scalable & compliant data architecture delivery. HL7/FHIR integration and uptime. Timely milestone delivery & cross-functional collaboration. Quality, consistency, and governance of healthcare data.
Key Performance Indicators (KPIs): Reduction in ETL/data latency and failures. Improvement in data quality metrics. On-time solution deployment success rate. Audit pass rate and compliance score.
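The FHIR interoperability standard named above exchanges resources as JSON, so a first step in any FHIR-facing pipeline is plain JSON parsing. A hedged sketch using only the standard library; the sample Patient resource is heavily abbreviated compared to real R4 payloads:

```python
# Parse a (simplified) FHIR R4 Patient resource with the stdlib only.
import json

patient_json = """{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}"""

patient = json.loads(patient_json)

# FHIR allows multiple names per patient; take the first entry here.
first_name = patient["name"][0]
full_name = " ".join(first_name["given"] + [first_name["family"]])
# full_name -> "Peter James Chalmers"
```

Production systems would validate against the FHIR schema and handle optional fields; this only shows the resource shape.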
Posted 1 week ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Job Title: Associate Specialist - Data Engineering
Location: Bengaluru. Shift: UK Shift.
About the Role: We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, and a strong command of SQL, Python, and Apache Spark.
Key Responsibilities: Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric. Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources. Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python. Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions. Ensure data quality, integrity, and governance across the data lifecycle. Implement monitoring and alerting for data pipelines to ensure reliability and performance. Work with Azure Synapse Analytics to build data models and enable analytics and reporting. Utilize SQL for querying and managing large datasets efficiently. Participate in data architecture discussions and contribute to technical design decisions.
Required Skills and Qualifications: 4-8 years of experience in data engineering or a related field. Strong proficiency in the Microsoft Azure data ecosystem, including: Azure Data Factory (ADF), Azure Synapse Analytics, Microsoft Fabric, and Azure Databricks. Solid experience with Python and Apache Spark (including PySpark). Advanced skills in SQL for data manipulation and transformation. Experience in designing and implementing data lakes and data warehouses. Familiarity with data governance, security, and compliance standards. Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities. Preferred Qualifications: Microsoft Azure certifications (e.g., Azure Data Engineer Associate). Experience with DevOps tools and CI/CD practices in data workflows. Knowledge of REST APIs and integration techniques. Background in agile methodologies and working in cross-functional teams.
Posted 1 week ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future? If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match. Who we are The Vehicle Data Management and Analytics group is looking for a new member in the area of diagnostics and technical documentation. We are part of the Vehicle Data & AI team within Vehicle Technology at Volvo Group Trucks Technology. What will you do You will work within the SEWS (System Engineering Web Service) Approval Board team. The work will involve technical reviews, user support and maintenance of process documentation related to the diagnostic area and the approval process. Typical focus areas are the documentation of diagnostic objects like parameters, on-vehicle tests, and Diagnostic Trouble Codes (DTC). Our team supports all business areas within Volvo Group in this area. Our aim is to ensure good data documentation quality that is accurate and easily understood as well as easy to maintain. This is done to safeguard excellent diagnostics and data quality within the Volvo Group. 
Core responsibilities
Perform technical reviews and suggest updates (texts and setup). Support our users through the review process. Support in upholding data quality. Create, maintain, and improve guidelines. Continuously drive process and system improvements as a key user. Develop and hold training sessions for users. Coordinate and follow up with various stakeholders.
Required skills
Minimum 5 years of experience within the automotive industry. Excellent technical English language skills. Excellent technical documentation/writing skills. Excellent social and communication skills. Good understanding of the embedded software development process. Good insights into automotive diagnostics and communication protocols. Structured, with good organizational skills. A get-it-done attitude combined with a focus on delivering quality. Able to work both individually and within a team.
Preferred skills and experiences
B.E. in areas like Mechatronics, Electronics Engineering, or similar. Automotive industry product knowledge. Experience within Volvo Group Trucks Technology is an advantage. Experience working with Jira and the Agile process. A pedagogical approach. Experience with ESW (embedded software) development and process is an advantage.
We value your data privacy and therefore do not accept applications via mail.
Who we are and what we believe in
We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment. Group Trucks Technology is seeking talents to help design sustainable transportation solutions for the future. As part of our team, you'll help us by engineering exciting next-gen technologies and contribute to projects that determine new, sustainable solutions. Bring your love of developing systems, working collaboratively, and your advanced skills to a place where you can make an impact. Join our design shift that leaves society in good shape for the next generation.
Posted 1 week ago
3.0 - 8.0 years
15 - 16 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We are looking forward to hiring Snowflake professionals in the following areas:
JD as below: Snowflake, SnowSQL, PL/SQL, any ETL tool. 3+ years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing. Creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience in writing functions and stored procedures. Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications / user stories into technical specifications. Good to have experience in design / development in any ETL tool. Good interpersonal skills, and experience in handling communication and interactions between different teams.
Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
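The star-schema principle named in this posting (fact tables joined to dimension tables) is plain SQL. A hedged sketch using SQLite in place of Snowflake; the join pattern is identical, though Snowflake-specific features (Snowpipe, stages, SnowSQL) are not shown and the tiny schema is an assumption:

```python
# Star-schema sketch: one fact table, one dimension table, aggregate by
# a dimension attribute. SQLite stands in for the warehouse here.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
rows = con.execute("""
    SELECT d.category, SUM(f.amount) AS total
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
# rows -> [('books', 15.0), ('games', 7.5)]
```

In Snowflake the fact table would typically be loaded continuously via Snowpipe and modelled through DBT, but the query shape over fact and dimension tables is the same.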
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Gurugram
Work from Office
Description: Data Analyst II

Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs, and commercial insights into outcomes to address modern market realities. Our Clinical Development model brings the customer and the patient to the center of everything that we do. We are continuously looking for ways to simplify and streamline our work to not only make Syneos Health easier to work with, but to make us easier to work for. Whether you join us in a Functional Service Provider partnership or a Full-Service environment, you'll collaborate with passionate problem solvers, innovating as a team to help our customers achieve their goals. We are agile and driven to accelerate the delivery of therapies, because we are passionate to change lives. Discover what our 29,000 employees, across 110 countries, already know: WORK HERE MATTERS EVERYWHERE.

Why Syneos Health? We are passionate about developing our people through career development and progression; supportive and engaged line management; technical and therapeutic area training; peer recognition; and a total rewards program. We are committed to our Total Self culture, where you can authentically be yourself. Our Total Self culture is what unites us globally, and we are dedicated to taking care of our people. We are continuously building the company we all want to work for and our customers want to work with. Why? Because when we bring together diversity of thoughts, backgrounds, cultures, and perspectives, we're able to create a place where everyone feels like they belong.

Job Responsibilities: Work independently to solve open-ended questions. Design and analyze tests and experiments. Maintain documentation of analytical processes and projects. Build, maintain, and improve performance dashboards, leveraging customer feedback for use and accessibility. Advise clients on relevant best practices and ensure the data is easily retrievable for their review. Support data quality and understanding customer needs as they evolve. Mentor and coach junior team members. Support site advocacy group meetings by inviting PIs, discussing blinded protocols, collecting feedback, and managing scheduling, hosting, and meeting minutes. Develop and manage capabilities decks twice annually, along with bespoke slides and marketing information sheets using Power BI data. Track and analyze business development outcomes through opportunity trackers, monitoring RFP success rates, regulatory approvals, and win rates. Monitor customer satisfaction by reviewing feedback from the EM team and facilitating monthly cross-time-zone communications. Oversee product approval tracking, ensuring visibility into product lifecycle status and final approval outcomes.

Qualifications: Get to know Syneos Health. Over the past 5 years, we have worked with 94% of all novel FDA-approved drugs, 95% of EMA-authorized products, and over 200 studies across 73,000 sites and 675,000+ trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health: http://syneoshealth

Additional Information: Tasks, duties, and responsibilities as listed in this job description are not exhaustive. The Company, at its sole discretion and with no prior notice, may assign other tasks, duties, and job responsibilities. Equivalent experience, skills, and/or education will also be considered, so qualifications of incumbents may differ from those listed in the job description. The Company, at its sole discretion, will determine what constitutes equivalence to the qualifications described above. Further, nothing contained herein should be construed to create an employment contract. Occasionally, required skills/experiences for jobs are expressed in brief terms. Any language contained herein is intended to fully comply with all obligations imposed by the legislation of each country in which the Company operates, including the implementation of the EU Equality Directive, in relation to the recruitment and employment of its employees. The Company is committed to compliance with the Americans with Disabilities Act, including the provision of reasonable accommodations, when appropriate, to assist employees or applicants to perform the essential functions of the job.
Posted 1 week ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
This role supports the Customer Experience and Commercial Transformation (CXCT) organization, driving the data transformation agenda for UNIFY and CXCT. UNIFY is Schneider Electric's global business transformation initiative, integrating business, supply chain, and finance processes into a standardized, simplified global model. This transformation is enabled by the adoption of SAP Public Cloud (market-standard processes), requiring re-engineering of processes, tools, data, and operating models to align with market-standard practices.

CXCT and UNIFY Collaboration: Partner with multiple streams (Domain/Sub-domain Data Teams, Business Process Owners, Central Governance, Data Domain Owners, etc.) to clarify and drive the data transformation for business transformation. Develop and implement a detailed plan that encompasses all data activities aligned with the goals of the transformation program, with a specific focus on unifying ERP systems within SAP S/4HANA.

Master Data Leadership: Serve as a data and technology leader for master data within the transformation program. Act as a subject matter expert for the Order-to-Cash data stream, understanding global and local business flows and related data dependencies. Analyze end-to-end data flows across applications and SAP Public Cloud, identifying impacts and opportunities for optimization.

Data Architecture & Modeling: Enrich and align the target data model with business requirements and best practices. Identify data owners and authoritative sources for each attribute to ensure accountability and traceability. Define critical data elements (CDEs) for the data objects.

Data Quality & Governance: Engage with data domain leaders to define governance requirements and implement tailored governance models. Enforce data governance processes to uphold data integrity, compliance, and security. Support the adoption of SAP Master Data Governance (MDG) and other relevant platforms.

Data Migration: Understand business requirements and support the definition of the data migration strategy at the object level. Define the scope of data objects and attributes to be included in migration activities. Identify data cleansing rules and ensure their application across relevant data sets. Ensure alignment of migration activities with transformation goals and data readiness.

CXCT Data Office Responsibilities: Cascade and adapt the data strategy and vision to all relevant teams, including but not limited to the data domain offices. Clearly communicate the strategy, agenda, priorities, and ways of working to senior leaders, unit members, and relevant stakeholders in the domain data offices. Re-configure, hire, or elevate resources to build the virtual delivery pool for DCR (e.g. tapping into the distributed resources). Establish: Convergence (standardization of data problem solving; data, tech, and people practices; and adoption of a common language and change management); Cross-leverage (leveraging data assets, tech assets, tools, and processes from across business functions and establishing global benchmarks); Collaboration (building the right organizational structure and governance to share knowledge and scale). Initiate data capability building and change management through workshops. Set up or leverage current communities of practice (data governance, master data management) for value cross-pollination.

Human Resources Responsibilities: Support UNIFY data resources identification and hiring; onboard with minimal ramp-up time. Support the build and management of the data operations network in domains and geos, leveraging the data business officer network. Provide mentorship for the purpose of developing a continuous talent pipeline for key roles in the CXCT data offices.

Technical Requirements: Over 10 years of experience in SAP S/4HANA implementation, with a strong understanding of Order-to-Cash business processes and data management.
Expertise in driving the strategy, vision, and mission for data management, including but not limited to master data initiatives. Proven experience in identifying data requirements for relevant SAP business processes and ensuring that the necessary data is available. Hands-on experience in managing SAP data, including data mapping, migration, data quality, and analytics. Familiarity with SAP Master Data Governance and its capabilities is a plus. Experience in developing and implementing data quality improvement frameworks. Understanding of global data privacy and security guidelines and best practices is a plus. Fluent understanding of the technical architecture within a complex application landscape.

Professional Competency Requirements: Excellent interpersonal skills, strong verbal and written communication skills, attention to detail, and an intuitive storytelling ability. Excellent problem-solving, organizational, and analytical skills, with the ability to make high-velocity decisions based on data, industry trends, and stakeholder feedback. A confident, engaging leader, able to communicate a vision and bring people along on the journey. A proactive problem solver with the ability to take initiative and a strong sense of ownership over shaping and executing our analytics transformation initiatives. A change-agent attitude, constantly pushing for new opportunities, more efficient processes, and new perspectives. The successful candidate will have an exceptional track record of working within businesses to tackle large-scale business problems across multiple domains and functions. You must have commercial acumen, the ability to understand business problems, exceptional interpersonal skills, and a strong background in the data field. Furthermore, you will have a demonstrable track record of successfully building and managing scaled, high-quality data solutions for internal or external organizations with high adoption rates. You must be a self-motivated team player who can contribute to the overall business objectives of the organization.
Posted 1 week ago