7.0 - 10.0 years
13 - 18 Lacs
Mumbai
Work from Office
Paramatrix Technologies Pvt. Ltd. is looking for a Cloud Data Architect (Azure & AWS) to join our dynamic team and embark on a rewarding career journey. A Data Architect is responsible for designing, building, and maintaining an organization's data architecture. The role involves:
1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to support data governance and compliance.
The Data Architect should have a strong technical background in data architecture and management as well as excellent communication skills. Strong problem-solving skills and the ability to think critically are also essential for identifying and implementing solutions to complex data issues.
Posted 1 month ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology.

About the team: The Credit Risk Product Team is at the core of our lending operations, ensuring that our risk assessment models are efficient, scalable, and compliant with regulatory frameworks. The team collaborates closely with data scientists, engineers, and business stakeholders to develop and enhance credit risk models, decisioning frameworks, and risk mitigation strategies. By leveraging advanced analytics and machine learning, the team continuously refines underwriting processes to optimize loan performance and reduce defaults.

About the role: We are looking for a motivated and curious AI & Data Product Analyst to join our team. This role requires a deep understanding of real-time and batch data pipelines and data modelling, including data ingestion, preparation, and integration to support various business cases. The analyst will be the subject matter expert on data availability and structure, guiding stakeholders on how best to leverage existing data assets, and will continuously analyze those assets to identify opportunities to optimize cost and processing efficiency. The role also requires a deep understanding of data architecture for conversational AI systems: the analyst will analyze user interactions and product performance to generate actionable insights that drive continuous improvement of AI-driven conversational experiences. You will work closely with data scientists, engineers, product managers, and business stakeholders to understand data requirements, design robust data solutions, and ensure the accuracy and availability of data for our analytical tools.

Expectations:
1. Collaborate with business teams to understand their data and analytical requirements.
2. Work closely with engineering teams to ensure business needs are met through effective feature development.
3. Contribute to the development and optimization of big data architecture and data pipelines for real-time and batch processing.
4. Support the definition and design of data architecture and pipelines required for conversational AI bots, including data ingestion, processing, annotation, and storage.
5. Analyze interaction data to inform the development and refinement of intent taxonomies, entity extraction, and dialogue management based on data-driven insights.
6. Analyze conversational AI interaction data to extract insights on user behavior, intent recognition accuracy, and dialogue effectiveness; collaborate with AI engineers to integrate AI frameworks like LangChain, LangFlow, or other agentic AI platforms into product workflows.
7. Monitor and evaluate AI bot performance metrics, identify data quality issues, and recommend improvements; translate business requirements into technical specifications for data architecture and AI product enhancements.
8. Stay updated on emerging conversational AI frameworks and tools, evaluating their applicability to business needs.

Key skills required:
1. Ideally 2-5 years of experience in data analytics and business intelligence; candidates from B2C consumer internet product companies are preferred.
2. Proven work experience with MS Excel, Google Analytics, SQL, Data Studio, and any BI tool, in a business analyst or similar role.
3. Comfortable working in a fast-changing, ambiguous environment.
4. Critical thinking and strong attention to detail.
5. In-depth understanding of datasets, with strong data and business understanding.
6. Capable of demonstrating good business judgement.

Education: Applicants must have an engineering academic background with specialization in data science.

Why join us: We aim to bring half a billion Indians into the mainstream economy, and everyone working here is striving to achieve that goal. Our success is rooted in our people's collective energy and unwavering focus on the customers, and that's how it will always be. We are the largest merchant acquirer in India.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it. India's largest digital lending story is brewing here. It is your opportunity to be a part of the story!
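For a concrete flavour of the interaction-log analysis this role describes, here is a minimal pandas sketch computing intent-recognition accuracy and session containment; the log file and its columns (session_id, predicted_intent, confirmed_intent, escalated) are illustrative assumptions, not part of the posting.

```python
import pandas as pd

# Hypothetical interaction log; file and column names are assumptions.
logs = pd.read_csv("bot_interactions.csv")  # session_id, predicted_intent, confirmed_intent, escalated

# Intent recognition accuracy: share of turns where the predicted intent
# matched the intent later confirmed by the user or an agent.
intent_accuracy = (logs["predicted_intent"] == logs["confirmed_intent"]).mean()

# Containment rate: share of sessions resolved without escalation to a human.
containment = 1 - logs.groupby("session_id")["escalated"].max().mean()

# Per-intent accuracy highlights where the intent taxonomy needs refinement.
per_intent = (
    logs.assign(correct=logs["predicted_intent"] == logs["confirmed_intent"])
        .groupby("predicted_intent")["correct"]
        .mean()
        .sort_values()
)

print(f"intent accuracy: {intent_accuracy:.2%}, containment: {containment:.2%}")
print(per_intent.head(10))  # weakest intents first
```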
Posted 1 month ago
5.0 - 8.0 years
8 - 18 Lacs
Mumbai, Hyderabad, Pune
Hybrid
Role & responsibilities:
- Design, implement, and manage cloud-based solutions on AWS and Snowflake.
- Work with stakeholders to gather requirements and design solutions that meet their needs.
- Develop and execute test plans for new solutions.
- Oversee and design the information architecture for the data warehouse, including all information structures such as the staging area, data warehouse, data marts, and operational data stores.
- Optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency.
- Bring a deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.
- Significant experience working as a Data Architect, with depth in data integration and data architecture for enterprise data warehouse implementations (conceptual, logical, physical, and dimensional models).
- Maintain documentation: develop and maintain detailed documentation for data solutions and processes.
- Provide training: offer training and leadership to share expertise and best practices with the team.
- Collaborate with and provide leadership to the data engineering team, ensuring that data solutions are developed according to best practices.

Preferred candidate profile:
- Snowflake, DBT, and data architecture design experience in data warehousing.
- Good to have: Informatica or other ETL knowledge or hands-on experience.
- Good to have: Databricks understanding.
- 5+ years of IT experience, with 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake.
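As an illustration of the star-schema work described above, here is a minimal sketch of a dimensional rollup issued through the snowflake-connector-python package; the connection parameters and the fact_sales/dim_* tables are hypothetical.

```python
import snowflake.connector

# Connection parameters are placeholders; real deployments pull them from a vault.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="EDW", schema="MART",
)

# A typical star-schema rollup: a fact table joined to conformed dimensions.
# All table and column names are hypothetical.
sql = """
    SELECT d.calendar_month, p.product_category, SUM(f.net_amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    WHERE d.calendar_year = 2024
    GROUP BY 1, 2
    ORDER BY 1, 2
"""

cur = conn.cursor()
try:
    cur.execute(sql)
    for month, category, revenue in cur.fetchall():
        print(month, category, revenue)
finally:
    cur.close()
    conn.close()
```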
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Mumbai
Work from Office
Position purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models, and supporting the management of metadata. The Data Architect's mission will include a focus on GDPR law, contributing to the privacy impact assessment and the Record of Processes & Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.

Direct responsibilities:
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.
- Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation.
- Structure the information in the information system (using any data modelling tool, such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the information systems of the entity, as defined by the lead data architects.
- Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models, and others, in compliance with the data framework policy.
- Enable people to step logically through the information system (be able to train them to use tools like Abacus).
- Contribute to and enrich the data architecture framework through the material collected during analysis, projects, and IT validations.
- Update all records in Abacus collected from stakeholder interviews and meetings.

Skill areas expected:
- Communicating between the technical and the non-technical: communicates effectively across organisational, technical, and political boundaries, understanding the context; makes complex and technical information and language simple and accessible for non-technical audiences; is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge; able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
- Data modelling (business flows of data in Abacus): produces data models and understands where to use different types of data models; understands different tools and is able to compare different data models; able to reverse engineer a data model from a live system; understands industry-recognized data modelling patterns and standards; understands the concepts and principles of data modelling and is able to produce, maintain, and update relevant data models for specific business needs.
- Data standards (rules defined to manage and maintain data): develops and sets data standards for an organisation; communicates the business benefit of data standards, championing and governing those standards across the organisation; develops data standards for a specific component; analyses where data standards have been applied or breached and undertakes an impact analysis of a breach.
- Metadata management: understands a variety of metadata management tools; designs and maintains the appropriate metadata repositories to enable the organization to understand its data assets; works with metadata repositories to keep them complete, accurate, and up to date. The objective is to manage one's own learning and contribute to domain knowledge building.
- Turning business problems into data design: works with business and technology stakeholders to translate business problems into data designs; creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements; designs data architecture for specific business problems, aligning it to enterprise-wide standards and principles; works within the context of a well-understood architecture and identifies appropriate patterns.

Contributing responsibilities: The Data Architect is expected to apply knowledge and experience of the capability, including tools and techniques, and adopt those most appropriate for the environment. The Data Architect needs knowledge of:
- The functional and application architecture, enterprise architecture, and architecture rules and principles
- The activities of Global Markets and/or Global Banking
- Market meta-models, taxonomies, and ontologies (such as FpML, CDM, ISO 20022)

Additional skill areas:
- Data communication: uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant to business goals; presents, communicates, and disseminates data appropriately and with high impact; able to create basic visuals and presentations.
- Data governance: understands data governance and how it works in relation to other organisational governance structures; participates in or delivers the assurance of a service; understands what data governance is required and contributes to it.
- Data innovation: recognises and exploits business opportunities to ensure more efficient and effective performance of organisations; explores new ways of conducting business and organisational processes; aware of opportunities for innovation with new tools and uses of data.

Technical & behavioural competencies:
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience working with enterprise tools for data cataloging and data management (such as Abacus, Collibra, Alation, etc.).
4. Experience working with BI tools (such as Power BI).
5. Good understanding of Excel (formulas and functions).

Specific qualifications: Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA.

Behavioural skills: communication skills (oral & written); ability to collaborate/teamwork; ability to deliver/results driven; creativity & innovation/problem solving.
Transversal skills: analytical ability; ability to understand, explain, and support change; ability to develop and adapt a process; ability to anticipate business/strategic evolution.

Education level: Bachelor degree or equivalent. Experience level: at least 10 years.
Other/specific qualifications:
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred.
2. DAMA certification (good to have).
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
Position purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models, and supporting the management of metadata. The Data Architect's mission will include a focus on GDPR law, contributing to the privacy impact assessment and the Record of Processes & Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.

Direct responsibilities:
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.
- Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation.
- Structure the information in the information system (using any data modelling tool, such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the information systems of the entity, as defined by the lead data architects.
- Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models, and others, in compliance with the data framework policy.
- Enable people to step logically through the information system (be able to train them to use tools like Abacus).
- Contribute to and enrich the data architecture framework through the material collected during analysis, projects, and IT validations.
- Update all records in Abacus collected from stakeholder interviews and meetings.

Skill areas expected:
- Communicating between the technical and the non-technical: communicates effectively across organisational, technical, and political boundaries, understanding the context; makes complex and technical information and language simple and accessible for non-technical audiences; is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge; able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
- Data modelling (business flows of data in Abacus): produces data models and understands where to use different types of data models; understands different tools and is able to compare different data models; able to reverse engineer a data model from a live system; understands industry-recognized data modelling patterns and standards; understands the concepts and principles of data modelling and is able to produce, maintain, and update relevant data models for specific business needs.
- Data standards (rules defined to manage and maintain data): develops and sets data standards for an organisation; communicates the business benefit of data standards, championing and governing those standards across the organisation; develops data standards for a specific component; analyses where data standards have been applied or breached and undertakes an impact analysis of a breach.
- Metadata management: understands a variety of metadata management tools; designs and maintains the appropriate metadata repositories to enable the organization to understand its data assets; works with metadata repositories to keep them complete, accurate, and up to date. The objective is to manage one's own learning and contribute to domain knowledge building.
- Turning business problems into data design: works with business and technology stakeholders to translate business problems into data designs; creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements; designs data architecture for specific business problems, aligning it to enterprise-wide standards and principles; works within the context of a well-understood architecture and identifies appropriate patterns.

Contributing responsibilities: The Data Architect is expected to apply knowledge and experience of the capability, including tools and techniques, and adopt those most appropriate for the environment. The Data Architect needs knowledge of:
- The functional and application architecture, enterprise architecture, and architecture rules and principles
- The activities of Global Markets and/or Global Banking
- Market meta-models, taxonomies, and ontologies (such as FpML, CDM, ISO 20022)

Additional skill areas:
- Data communication: uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant to business goals; presents, communicates, and disseminates data appropriately and with high impact; able to create basic visuals and presentations.
- Data governance: understands data governance and how it works in relation to other organisational governance structures; participates in or delivers the assurance of a service; understands what data governance is required and contributes to it.
- Data innovation: recognises and exploits business opportunities to ensure more efficient and effective performance of organisations; explores new ways of conducting business and organisational processes; aware of opportunities for innovation with new tools and uses of data.

Technical & behavioural competencies:
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience working with enterprise tools (such as Abacus, Informatica, big data tools, Collibra, etc.).
4. Experience working with BI tools (such as Power BI).
5. Good understanding of Excel (formulas and functions).

Specific qualifications: Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA.

Behavioural skills: communication skills (oral & written); ability to collaborate/teamwork; ability to deliver/results driven; creativity & innovation/problem solving.
Transversal skills: analytical ability; ability to understand, explain, and support change; ability to develop and adapt a process; ability to anticipate business/strategic evolution.

Education level: Bachelor degree or equivalent. Experience level: at least 7 years.
Other/specific qualifications:
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred.
2. DAMA certified.
Posted 1 month ago
7.0 - 10.0 years
30 - 40 Lacs
Pune
Work from Office
Architect Application/Product IV. Minimum education and experience: T4 - Bachelor's degree and 7 years of experience. Direct reports: none. Travel required: minimal.

Essential duties and responsibilities:
- Work with business users and stakeholders to define and analyze problems and provide optimal technical solutions.
- Translate business needs into technical specifications and design functional BI solutions.
- Present architecture and solutions to executive-level audiences.
- Adhere to industry best practices in all phases of design and architecture of the solution.
- Ensure the robustness and reliability of BI solutions during development, testing, and maintenance.
- Document all aspects of the BI system for future upgrades and maintenance.
- Provide guidance to ensure data governance, security, and compliance best practices in the architecture.

Required technical skills:
- Data modeling: expertise in dimensional modeling, normalization/denormalization, and other data modeling techniques.
- ETL processes: proficiency in extract, transform, and load (ETL) processes.
- SQL and database design: strong SQL coding skills and knowledge of database design principles.
- BI platforms: experience with any BI platform, preferably Power BI.
- Data warehousing: knowledge of data warehousing concepts and tools.
- Cloud services: experience with cloud platforms such as AWS, Azure, and Google Cloud.
- Data governance: understanding of data governance and data quality management.
- Scripting languages: proficiency in scripting languages like Python, R, or Java.

Minimum qualifications:
- 8+ years of end-to-end design and architecture of enterprise-level data platforms and reporting/analytical solutions.
- 5+ years of expertise in real-time and batch reporting and analytical solution architecture.
- 4+ years of experience with Power BI, Tableau, or similar technology solutions.

Additional qualifications:
- 8+ years of experience with dimensional modeling and data lake design methodologies.
- 8+ years of experience with relational and non-relational databases (e.g., SQL Server, Cosmos DB).
- Experience working with business stakeholders on requirements and use case analysis.
- Strong communication and collaboration skills with creative problem-solving skills.

Preferred qualifications:
- Bachelor's degree in computer science or equivalent work experience.
- Experience with Agile/Scrum methodology.
- Experience in the tax and accounting domain a plus.
- Azure Data Engineer certification a plus.
Posted 1 month ago
3.0 - 6.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Join us as a Data Engineer II in Bengaluru! Build scalable data pipelines using Python, SQL, AWS, Airflow, and Kafka, and drive real-time and batch data systems across analytics, ML, and product teams. A hybrid work option is available. Required candidate profile: 3+ years in data engineering with strong Python, SQL, AWS, Airflow, Spark, Kafka, Debezium, Redshift, ETL, and CDC experience. Must know data lakes, data warehousing, and orchestration tools.
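For a sense of the orchestration work this role involves, here is a minimal Airflow DAG sketch wiring an hourly CDC-style extract-and-load; the DAG id, task names, and targets are illustrative assumptions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    # Placeholder: pull a batch of change events (e.g. from a Kafka topic
    # fed by Debezium CDC) and stage them to S3. Names are illustrative.
    ...

def load_to_redshift(**context):
    # Placeholder: COPY the staged batch into Redshift.
    ...

with DAG(
    dag_id="orders_cdc_batch",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)
    extract >> load                     # simple linear dependency
```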
Posted 1 month ago
5.0 - 15.0 years
7 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job title: Health Care - Data Engineer Architect. Location: Chennai, Bangalore, Hyderabad. Experience: 5-15 years.

Job summary: We are seeking a highly experienced Healthcare Data Architect to design and manage robust healthcare data systems that meet regulatory requirements, enable interoperability, and support advanced analytics. The ideal candidate will bring over a decade of expertise in healthcare IT, data modelling, and cloud-based solutions to architect scalable, secure systems that serve EHRs, population health, claims, and real-time analytics use cases.

Mandatory skills: data architecture (healthcare domain); HL7, FHIR, EHR/EMR systems; data warehousing and data lakes; cloud platforms (AWS, Azure, GCP); data governance and MDM; SQL, NoSQL, Python, Spark.

Key responsibilities:
- Architect and implement scalable healthcare data platforms
- Ensure compliance with HIPAA, GDPR, and other healthcare regulations
- Design data models for EHR, claims, and clinical data
- Optimize ETL pipelines and manage data flow integrity
- Lead data warehouse and data lake development
- Drive interoperability through standards like HL7 and FHIR
- Implement data governance and quality frameworks

Qualifications: Bachelor's or Master's in Computer Science, Information Systems, or a related field; certifications in AWS/Azure and healthcare standards (HL7/FHIR) preferred.

Technical skills: SQL, Python, Spark, Java; HL7, FHIR, CCD, JSON, XML; cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory), GCP; BI: Power BI, Tableau; data modeling tools: Erwin, Enterprise Architect.

Soft skills: strong analytical and problem-solving ability; excellent communication and stakeholder engagement; team leadership and mentoring; adaptability in fast-paced environments.

Good to have: experience with AI/ML in healthcare pipelines; familiarity with population health and claims analytics; regulatory reporting experience (CMS, NCQA).

Experience: minimum 10 years in data architecture, with 5+ years in the healthcare domain, and a proven track record of implementing full-cycle data solutions and governance.

Benefits: competitive salary + performance incentives; comprehensive health insurance and wellness programs; learning and development allowance; remote/hybrid flexibility; ESOPs for senior leadership (if applicable).

Key result areas (KRAs): scalable and compliant data architecture delivery; HL7/FHIR integration and uptime; timely milestone delivery and cross-functional collaboration; quality, consistency, and governance of healthcare data.

Key performance indicators (KPIs): reduction in ETL/data latency and failures; improvement in data quality metrics; on-time solution deployment success rate; audit pass rate and compliance score.
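To illustrate the interoperability side of this role, here is a small Python sketch that flattens a FHIR R4 Patient resource into an analytics-ready row; the field layout follows the FHIR spec, while the values and the target row shape are made up for illustration.

```python
import json

# A trimmed FHIR R4 Patient resource; field layout follows the FHIR spec,
# the values themselves are invented.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-001",
  "name": [{"family": "Rao", "given": ["Anita"]}],
  "gender": "female",
  "birthDate": "1984-07-12",
  "identifier": [{"system": "urn:oid:1.2.3.4", "value": "MRN-55821"}]
}
"""

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"

# Flatten the nested resource into the kind of row an EHR/claims model expects.
name = patient["name"][0]
row = {
    "patient_id": patient["id"],
    "mrn": patient["identifier"][0]["value"],
    "full_name": f'{" ".join(name["given"])} {name["family"]}',
    "gender": patient.get("gender"),
    "birth_date": patient.get("birthDate"),
}
print(row)
```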
Posted 1 month ago
5.0 - 15.0 years
7 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job title: Data Architect. Location: Chennai, Bangalore, Hyderabad. Experience: 5-15 years.

Job summary: We are seeking a highly experienced and strategic Data Architect to lead the design and implementation of robust data architecture solutions. The ideal candidate will have a deep understanding of data modelling, governance, integration, and analytics platforms. As a Data Architect, you will play a crucial role in shaping the data landscape across the organization, ensuring data availability, consistency, security, and quality.

Mandatory skills: enterprise data architecture and modeling; cloud data platforms (Azure, AWS, GCP); data warehousing and lakehouse architecture; data governance and compliance frameworks; ETL/ELT design and orchestration; Master Data Management (MDM); Databricks architecture and implementation.

Key responsibilities:
- Lead, define, and implement end-to-end modern data platforms on public cloud using Databricks
- Design and manage scalable data models and storage solutions
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to define required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Establish data standards, governance policies, and best practices
- Oversee the integration of new data technologies and tools
- Lead the development of data pipelines, marts, and lakes
- Ensure data solutions are compliant with security and regulatory standards
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Mentor data engineers and developers on best practices

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field; relevant certifications in cloud platforms, data architecture, or governance.

Technical skills: data modelling: conceptual, logical, and physical modelling (ERwin, PowerDesigner, etc.); cloud: Azure (ADF, Synapse, Databricks), AWS (Redshift, Glue), GCP (BigQuery); databases: SQL Server, Oracle, PostgreSQL, NoSQL (MongoDB, Cassandra); data integration: Informatica, Talend, Apache NiFi; big data: Hadoop, Spark, Kafka; governance tools: Collibra, Alation, Azure Purview; scripting: Python, SQL, Shell; DevOps/DataOps practices and CI/CD tools.

Soft skills: strong leadership and stakeholder management; excellent communication and documentation skills; strategic thinking with problem-solving ability; collaborative and adaptive in cross-functional teams. Also valued: experience in AI/ML data lifecycle support; exposure to industry data standards and frameworks (TOGAF, DAMA-DMBOK); experience with real-time analytics and streaming data solutions.

Work experience:
- Minimum 10 years in data engineering, architecture, or related roles
- At least 5 years of hands-on experience in designing data platforms on Azure
- Demonstrated knowledge of 2 full project cycles using Databricks as an architect
- Experience supporting and working with cross-functional teams in a dynamic environment
- Advanced working SQL knowledge and experience working with relational databases and unstructured datasets
- Experience with stream-processing systems such as Storm and Spark Streaming

Benefits: competitive salary and annual performance-based bonuses; comprehensive health and optional parental insurance; retirement savings plans and tax savings plans; work-life balance with flexible work hours.

Key result areas (KRAs): effective implementation of scalable and secure data architecture; governance and compliance adherence; standardization and optimization of data assets; enablement of self-service analytics and data democratization.

Key performance indicators (KPIs): architecture scalability and reusability metrics; time-to-delivery for data initiatives; data quality and integrity benchmarks; satisfaction ratings from business stakeholders.
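As a sketch of the lakehouse platform work this posting centres on, here is a minimal PySpark bronze/silver/gold (medallion) pipeline; it assumes a Databricks/Delta Lake runtime, and the paths, columns, and business rules are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw events as-is. Paths and schema are hypothetical.
bronze = spark.read.json("/lake/landing/orders/")
bronze.write.format("delta").mode("append").save("/lake/bronze/orders")

# Silver: cleanse and conform - drop malformed rows, standardise types.
silver = (
    spark.read.format("delta").load("/lake/bronze/orders")
    .where(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")

# Gold: business-level aggregate ready for BI consumption.
gold = silver.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("revenue"),
)
gold.write.format("delta").mode("overwrite").save("/lake/gold/daily_orders")
```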
Posted 1 month ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Job title: Associate Specialist - Data Engineering. Location: Bengaluru. Shift: UK shift.

About the role: We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, and a strong command of SQL, Python, and Apache Spark.

Key responsibilities:
- Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric.
- Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources.
- Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python.
- Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Ensure data quality, integrity, and governance across the data lifecycle.
- Implement monitoring and alerting for data pipelines to ensure reliability and performance.
- Work with Azure Synapse Analytics to build data models and enable analytics and reporting.
- Utilize SQL for querying and managing large datasets efficiently.
- Participate in data architecture discussions and contribute to technical design decisions.

Required skills and qualifications:
- Background in data engineering or a related field.
- Strong proficiency in the Microsoft Azure data ecosystem, including Azure Data Factory (ADF), Azure Synapse Analytics, Microsoft Fabric, and Azure Databricks.
- Solid experience with Python and Apache Spark (including PySpark).
- Advanced skills in SQL for data manipulation and transformation.
- Experience in designing and implementing data lakes and data warehouses.
- Familiarity with data governance, security, and compliance standards.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

Preferred qualifications:
- Microsoft Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with DevOps tools and CI/CD practices in data workflows.
- Knowledge of REST APIs and integration techniques.
- Background in agile methodologies and working in cross-functional teams.
Posted 1 month ago
2.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Key responsibilities:
- Big data architecture: design, build, and implement scalable big data solutions to process and analyze vast datasets in a timely and efficient manner.
- Data pipeline development: develop ETL (Extract, Transform, Load) pipelines for large-scale data processing, and ensure data pipelines are automated, scalable, and robust enough to handle high volumes of data.
- Distributed systems: work with distributed computing frameworks (e.g., Apache Hadoop, Apache Spark, Flink) to process big datasets across multiple systems and clusters.
- Data integration: integrate data from multiple sources (structured, semi-structured, and unstructured) into a unified data architecture.
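A minimal PySpark sketch of the kind of batch ETL pipeline described above; the input path, schema, and filter logic are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: a raw CSV drop; path and columns are illustrative.
raw = spark.read.option("header", True).csv("hdfs:///ingest/clicks/2024-06-01/")

# Transform: type the columns, filter bot traffic, derive a partition date.
clean = (
    raw.withColumn("ts", F.to_timestamp("ts"))
       .withColumn("event_date", F.to_date("ts"))
       .where(F.col("user_agent").isNotNull() & ~F.col("is_bot").cast("boolean"))
)

# Load: write partitioned Parquet for downstream analytics.
clean.write.mode("overwrite").partitionBy("event_date").parquet("hdfs:///warehouse/clicks/")
```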
Posted 1 month ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Total experience: 10+ years. Relevant experience: 10+ years. Rate: 11000 INR/day. Interview mode: one F2F round; attendance is mandatory (kindly avoid candidates who cannot attend one F2F round). The candidate should be ready to join as a subcontractor. Profiles whose relevant and total years of experience are not as stated will be rejected outright, as will profiles above the quoted rate.

Please treat the below requirement as critical and share two quality profiles that are genuinely interested in a subcon role. Kindly go through the instructions below carefully before submission:
- Vendors should check the requirement clearly and not send profiles based only on a keyword search.
- Vendors should check the availability and interest of the resource to join as a subcontractor.
- Kindly submit profiles within the rate card.
- Ensure there are no ex-Infosys employee profile submissions, as we have a 6-month cooling period.
- We need only the top 1-2 quality profiles; avoid multiple mail threads on profile submission.

ECMS Req #: 514780
Number of openings: 1
Duration of hiring: 12 months
Relevant and total years of experience: 10+
Detailed job description / skill set:
- Create, test, and implement enterprise-level apps with Snowflake
- Design and implement features for identity and access management
- Create authorization frameworks for better access control
- Implement client query optimization and major security competencies with encryption
- Solve performance and scalability issues in the system
- Transaction management with distributed data processing algorithms
- Take ownership right from start to finish
- Migrate solutions from on-premises setups to cloud-based platforms
- Understand and implement the latest delivery approaches based on data architecture
- Project documentation and tracking based on understanding of user requirements
- Perform data integration with third-party tools, including architecting, designing, coding, and testing phases
- Manage documentation of data models, architecture, and maintenance processes
- Continually review and audit data models for enhancement
- Performance tuning, user acceptance training, and application support
- Maintain confidentiality of data
- Risk assessment, management, and mitigation plans
- Regular engagement with teams for status reporting and routine activities
- Migration activities from one database to another, or from on-premises to cloud
Mandatory skills: Snowflake Developer
Vendor billing range in local currency (per day): 11000 INR/day
Work location: Chennai, Hyderabad, Bangalore, or Mysore (Infosys location)
WFO/WFH/Hybrid: Hybrid WFO
Working in shifts beyond standard daylight hours (to avoid confusion post-onboarding): No
Mode of interview: F2F
BG check before or after onboarding: post onboarding
Posted 1 month ago
8.0 - 13.0 years
45 - 50 Lacs
Bengaluru
Work from Office
The Lead Data Scientist will be responsible for organizing and reporting data related to sales numbers, market research, logistics, linguistics, or other behaviors, using technical expertise to ensure that reported data is accurate and high-quality. Data will need to be analyzed, designed, and presented in a way that assists individuals, business owners, and customer stakeholders in making better decisions.

Responsibilities:
- Cross-functional collaboration: collaborate seamlessly with Engineering, Product, and Operations teams to conceptualise, design, and construct data reporting and analytical systems.
- Ideation and analysis: generate ideas for exploratory analysis, actively shaping the trajectory of future projects, and provide insightful recommendations for strategic actions based on data-driven insights.
- Rapid prototyping and product discussions: drive the rapid prototyping of solutions, actively participating in discussions related to product and feature development.
- Dashboard creation and reporting: develop dashboards and comprehensive documentation to effectively communicate results, and regularly monitor key data metrics to facilitate informed decision-making.
- Integration with production systems: collaborate closely with software engineers to deploy and integrate data models into production systems, ensuring scalability, reliability, and efficiency of the integrated solutions.
- Business metrics identification: identify and analyze key business metrics, offering strategic insights, and recommend product features based on the identified metrics to enhance overall product functionality.
- Data infrastructure: lead the design and development of scalable, robust, and efficient data architectures; oversee the development and maintenance of ETL (Extract, Transform, Load) processes to move and transform data from various sources into data warehouses or other storage solutions; ensure data quality and integrity throughout the data pipeline.
- Team leadership: lead a team of data engineers and data analysts, providing mentorship, guidance, and technical expertise; coach and manage the team to deliver using Agile processes and ensure high RoI; participate actively in recruitment and in nurturing engineers as awesome as you.

Skills:
- Exceptional quantitative and problem-solving skills, capable of tackling complex data-driven challenges and formulating effective solutions.
- Proven proficiency in essential data science libraries, including Pandas, NumPy, SciPy, and scikit-learn, for data manipulation, analysis, and modeling.
- In-depth expertise in Python programming and SQL, with a focus on data analysis, model building, and algorithmic implementation.
- Experience with distributed computing frameworks such as Hadoop or Spark for handling large-scale data processing and machine learning tasks.
- Data architecture: designing and implementing robust data architectures that support the organization's needs, including an understanding of different database systems, data lakes, and data warehouses.
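For a flavour of the modelling toolkit this role names (Pandas, scikit-learn), here is a minimal churn-scoring sketch; the dataset, feature names, and target column are assumptions for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical customer-behaviour dataset; file and columns are assumptions.
df = pd.read_csv("customer_metrics.csv")
features = ["orders_90d", "avg_ticket", "days_since_last_order"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")  # headline metric for a stakeholder dashboard
```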
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Chennai
Work from Office
We are looking for a BI Architect with 13+ years of experience to lead the design and implementation of scalable BI and data architecture solutions. The role involves driving data modeling, cloud-based pipelines, migration projects, and data lake initiatives using technologies like AWS, Kafka, Spark, SQL, and Python. Experience with EDW modeling and architecture is a strong plus.

Key responsibilities:
- Design and develop scalable BI and data models to support enterprise analytics.
- Lead data platform migration from legacy BI systems to modern cloud architectures.
- Architect and manage data lakes, batch and streaming pipelines, and real-time integrations via Kafka and APIs.
- Support data governance, quality, and access control initiatives.
- Partner with data engineers, analysts, and business stakeholders to deliver reliable, high-performing data solutions.
- Contribute to architecture decisions and platform scalability planning.

Qualifications:
- 10-15 years of relevant experience, with 10+ years in BI, data engineering, or data architecture roles.
- Proficiency in SQL, Python, Apache Spark, and Kafka.
- Strong hands-on experience with AWS data services (e.g., S3, Redshift, Glue, EMR).
- Track record of leading data migration and modernization projects.
- Solid understanding of data governance, security, and scalable pipeline design.
- Excellent collaboration and communication skills.

Good to have:
- Experience with enterprise data warehouse (EDW) modeling and architecture.
- Familiarity with BI tools like Power BI, Tableau, Looker, or QuickSight.
- Knowledge of lakehouse, data mesh, or modern data stack concepts.
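As a sketch of the real-time integration work mentioned above, here is a minimal Spark Structured Streaming job reading from Kafka; it assumes the spark-sql-kafka package is on the classpath, and the broker address, topic, and event schema are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructType, TimestampType

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

# Assumed shape of the JSON events on the topic.
schema = (
    StructType()
    .add("order_id", StringType())
    .add("amount", DoubleType())
    .add("ts", TimestampType())
)

# Read a Kafka topic; broker address and topic name are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Windowed revenue feeding a near-real-time BI layer.
revenue = events.groupBy(F.window("ts", "5 minutes")).agg(F.sum("amount").alias("revenue"))

query = revenue.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```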
Posted 1 month ago
3.0 - 6.0 years
11 - 15 Lacs
Madurai, Tiruppur, Salem
Work from Office
Req ID: 126281. Remote position: No. Region: Asia. Country: India. State/Province: Chennai. City: Guindy, Chennai.

General overview: Functional area: Engineering. Career stream: Design Software Engineering. Job code: SSE-ENG-DSE. Job level: Level 11. IC/MGR: Individual Contributor. Direct/Indirect indicator: Indirect.

Summary: Celestica is looking for skilled and enthusiastic software engineers to join our team in developing cutting-edge data centers that leverage advanced GPU technologies. In this dynamic role, you will build orchestration software for the entire rack, develop integrated visualization tools for rack components, and create comprehensive diagnostics to optimize GPU server utilization. The ideal candidate will have a strong background in orchestration software development and experience creating solutions for the data center industry.

Detailed description:
- Architect and develop a full-stack application to ease the task of designing, deploying, and monitoring a next-generation data centre including GPU/AI compute elements.
- Use cloud-native development methods to support Kubernetes deployments for different scenarios.
- Build template-driven rack design techniques to support various rack element compositions.
- Build scalable software that can gather data from a large number of devices, monitor it, and make it easy to visualize trends.
- Build network validation techniques for GPU-centric traffic patterns.
- Build agile software that can react immediately to operational issues and self-heal the deployments.
- Optimize code for performance, efficiency, and scalability.
- Adopt GenAI tools for development efficiency.
- Work effectively in a team environment, collaborating with engineers and peer functional leads from different disciplines to innovate solutions, triage issues, and speed execution.
- Mentor and coach team members on the technical skills and approaches needed to solve problems.
- Present innovation and value addition from our software in technical forums and customer interactions.

Knowledge/skills/competencies:
- Strong programming skills: extensive programming in Python and Go.
- Database system knowledge: experience with SQL databases like PostgreSQL, NoSQL databases like MongoDB, and TSDBs like Prometheus.
- Kubernetes deployment skills: experience in container orchestration, pod health checks, networking, Helm charts, and deployment strategies.
- Familiarity with UI frameworks, REST API frameworks, and backend-for-frontend design methodologies.
- Debugging and testing skills: ability to identify and resolve software issues.
- Problem-solving skills: strong analytical and problem-solving abilities.
- Experience with data center deployments: prior experience in data center architectures and in developing and maintaining software for deployments is a must.
- Clear communication: proven ability to articulate requirements and vision to large and diverse audiences through written documents, such as architecture specifications, and verbal presentations in technical forums.

Physical demands: Duties of this position are performed in a normal office environment. Duties may require extended periods of sitting and sustained visual concentration on a computer monitor or on numbers and other detailed data. Repetitive manual movements (e.g., data entry, using a computer mouse, using a calculator, etc.) are frequently required. Occasional travel may be required.

Typical experience: 12 to 18 years. Typical education: Bachelor degree, or consideration of an equivalent combination of education and experience; educational requirements may vary by geography.

Notes: This job description is not intended to be an exhaustive list of all duties and responsibilities of the position. Employees are held accountable for all duties of the job. Job duties and the percentage of time identified for any function are subject to change at any time.

Celestica is an equal opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on any protected status (including race, religion, national origin, gender, sexual orientation, age, marital status, veteran or disability status, or other characteristics protected by law). At Celestica we are committed to fostering an inclusive, accessible environment where all employees and customers feel valued, respected, and supported. Special arrangements can be made for candidates who need them throughout the hiring process; please indicate your needs and we will work with you to meet them.

Company overview: Celestica (NYSE, TSX: CLS) enables the world's best brands. Through our recognized customer-centric approach, we partner with leading companies in Aerospace and Defense, Communications, Enterprise, HealthTech, Industrial, Capital Equipment, and Energy to deliver solutions for their most complex challenges. As a leader in design, manufacturing, hardware platform, and supply chain solutions, Celestica brings global expertise and insight at every stage of product development, from drawing board to full-scale production and after-market services, for products ranging from advanced medical devices to highly engineered aviation systems to next-generation hardware platform solutions for the Cloud. Headquartered in Toronto, with talented teams spanning 40+ locations in 13 countries across the Americas, Europe, and Asia, we imagine, develop, and deliver a better future with our customers.

Celestica would like to thank all applicants; however, only qualified applicants will be contacted. Celestica does not accept unsolicited resumes from recruitment agencies or fee-based recruitment services.
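For a taste of the device-telemetry monitoring this role describes, here is a minimal sketch using the prometheus_client library to expose per-GPU metrics for Prometheus to scrape; the metric name, labels, and the random-number collector are placeholders for a real NVML/Redfish poller.

```python
import random
import time

from prometheus_client import Gauge, start_http_server

# Hypothetical per-GPU telemetry exported for Prometheus to scrape.
GPU_UTIL = Gauge("rack_gpu_utilization_percent", "GPU utilization", ["node", "gpu"])

def poll_devices():
    # Placeholder for a real collector (e.g. NVML or Redfish calls).
    for node in ("node-1", "node-2"):
        for gpu in range(8):
            GPU_UTIL.labels(node=node, gpu=str(gpu)).set(random.uniform(0, 100))

if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes http://host:9100/metrics
    while True:
        poll_devices()
        time.sleep(15)
```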
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
At 5X, our top priority is now to build out the platform. Picture every company using 5X as getting a core stack (data ingestion, warehouse, modelling & orchestration, and BI) out of the box, with user permissions and utilization insights. We are looking for Data Engineers with a proven track record of successfully delivering end-to-end data solutions. This is a full-time role.

Responsibilities:
- Lead the analysis of clients' business requirements to deliver strategic data solutions that align with organizational goals.
- Oversee the design and development of data models and advanced Power BI dashboards, ensuring actionable insights for senior business stakeholders.
- Drive the implementation and optimization of complex data models in Power BI, leveraging medallion architecture principles (bronze, silver, gold layers) to ensure scalability and performance.
- Manage end-to-end project delivery, including defining project scope, timelines, and milestones, while ensuring alignment with client expectations and business objectives.
- Establish and enforce best practices in data engineering, ensuring high-quality, reliable, and scalable data solutions that meet industry standards.
- Proactively communicate project updates, risks, and resolutions to stakeholders, driving transparency and trust.
- Lead and mentor a team of data engineers and analysts, fostering a culture of collaboration, innovation, and accountability.
- Facilitate high-level stakeholder conversations, translating technical concepts into business value and managing expectations to ensure successful project outcomes.

Qualifications:
- 6+ years of experience in data engineering, analytics, or related fields, with a proven track record of leading and delivering complex, end-to-end data solutions.
- Demonstrated project management experience: able to ensure timely delivery and manage resources effectively.
- Proven leadership skills, with experience managing and mentoring high-performing teams to achieve project and organizational goals.
- Strong stakeholder management skills, with the ability to engage senior leaders, build relationships, and communicate complex technical concepts clearly.
- Advanced proficiency in SQL and Python for data manipulation, transformation, and analysis.
- Extensive experience with Snowflake and Power BI, including advanced dashboard development and optimization.
- Deep expertise in medallion data architecture (bronze, silver, gold layers) and advanced data modeling techniques.
- Advanced knowledge of DAX calculations and measures for sophisticated analytics and performance optimization.
- Exceptional problem-solving skills, with the ability to navigate ambiguity and drive solutions in a fast-paced environment.
- Outstanding communication and interpersonal skills, with a track record of proactive client engagement and stakeholder management.
- Self-motivated and results-oriented, with a demonstrated ability to unblock challenges, drive progress, and deliver with minimal supervision.

Benefits:
- 100% remote company: we love to give our employees the freedom to choose where they want to work from.
- Wellness: we have monthly wellness programmes and workshops to make sure that all our employees are happy and satisfied.
- Competitive compensation: we offer competitive compensation and meaningful equity.
- Parental leave: we value and support the family planning process and provide paid parental leave.
- Healthcare: we cover health benefits for all employees and their dependents.
- Offsite: one team offsite a year to incredible destinations; check out our recent offsites in Thailand, Sri Lanka, and Bali.

About 5X: 5X is a data and AI platform focused on traditional industries (such as banking, manufacturing, retail, real estate, healthcare, education, and government). Traditional businesses are struggling with data silos and poor data quality slowing the entire business. These businesses use legacy, hard-to-reach systems, or platforms like SAP, Salesforce, and Oracle that make it complex to get data out. 5X is able to extract data from hard-to-reach systems; centralize, clean, structure, and model it; and enable data and agentic GenAI capabilities. Unlike legacy data platform implementations, which take months and hundreds of thousands of dollars, we are able to demonstrate end-to-end use cases in 48 hours at a fraction of the price.

5X was founded in 2020, with a presence in the USA, Singapore, UK, and India. Our global team is 70+ people strong and rapidly growing. We're backed by Flybridge Capital and creators of popular open source projects like Airflow, Superset, and Parquet, as well as founders from companies like Datadog, Astronomer, Mode, and Rudderstack.

Know about the company:
Website: https://5x.co/
LinkedIn: https://www.linkedin.com/company/datawith5X/
Glassdoor: https://www.glassdoor.co.in/Reviews/5X-Reviews-E6110869.htm
5X in 2 minutes: https://www.youtube.com/watch?v=45Ppi00Lw70
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Gurugram
Work from Office
Global Data Steward Role: We are looking for a highly skilled and experienced Global Data Steward to join our team at AXALTA COATING SYSTEMS INDIA PRIVATE LIMITED. The ideal candidate will have 6-8 years of experience in data stewardship.

Roles and responsibilities:
- Develop and implement effective data stewardship strategies to ensure data quality and integrity.
- Collaborate with cross-functional teams to identify and prioritize data requirements.
- Design and maintain scalable and secure data architectures to support business growth.
- Ensure compliance with regulatory requirements and industry standards.
- Provide expert guidance on data management best practices to stakeholders.
- Analyze and resolve complex data-related issues to improve operational efficiency.

Job requirements:
- Strong understanding of data stewardship principles and practices.
- Experience with data governance frameworks and regulations.
- Proficiency in data modeling, warehousing, and analytics tools.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment with multiple priorities.
- Strong problem-solving skills with attention to detail.
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
ECMS Req # / Demand ID: 519826
Number of openings: 1
Duration of project: 12 months
Years of experience: 3-5 total, 4+ relevant
Detailed job description / skill set: Backend Developer (4+ years) with strong expertise in PySpark and SQL technologies, to develop and maintain a high-performance big data architecture. Should have hands-on experience with Hive, Impala, and Airflow, and project experience with agile methodologies.
Mandatory skills: Python, big data, Spark, and good communication skills
Vendor proposed rate (as per ECMS system): 8000 INR/day
Work location: any Infosys DC
Hybrid/remote/WFO: Hybrid
BGV pre/post onboarding: pre-onboarding (final BGV)
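A minimal sketch of a batch job in the PySpark-plus-Hive stack this requisition names; it assumes a configured Hive metastore and that the raw.transactions and curated.daily_merchant_summary tables already exist, with all names being illustrative.

```python
from pyspark.sql import SparkSession

# Hive-backed session; table names are illustrative.
spark = (
    SparkSession.builder.appName("hive-batch-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# A typical batch job in this stack: SQL over Hive tables, written back
# to a partitioned Hive table that Impala can also query.
daily = spark.sql("""
    SELECT txn_date, merchant_id, SUM(amount) AS total_amount, COUNT(*) AS txn_count
    FROM raw.transactions
    WHERE txn_date = '2024-06-01'
    GROUP BY txn_date, merchant_id
""")

daily.write.mode("overwrite").insertInto("curated.daily_merchant_summary")
```

In practice a job like this would be scheduled from an Airflow DAG, with the run date passed in as a parameter rather than hard-coded.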
Posted 1 month ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job description: We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key responsibilities:
- Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks notebooks, jobs, and workflows.
- Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability.
- Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data.
- Implement distributed data processing solutions using Apache Spark/PySpark for large-scale data transformation.
- Collaborate with cross-functional teams, including data scientists, analysts, and product managers, to ensure data is accurate, accessible, and well-structured.
- Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem.
- Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure.
- Conduct code reviews, define coding standards, and promote engineering excellence across the team.
- Mentor and guide junior data engineers, fostering a culture of technical growth and innovation.

Requirements:
- 8+ years of experience in data engineering, with proven leadership in managing data projects and teams.
- Expertise in Python
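To illustrate how the Glue and Lambda pieces of this stack commonly fit together, here is a minimal boto3 sketch of a Lambda handler that starts a Glue job when a file lands in S3; the job name and argument keys are assumptions, while the boto3 calls are standard Glue APIs.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Lambda entry point: start a Glue job when a new file lands in S3."""
    record = event["Records"][0]["s3"]  # standard S3 event shape
    run = glue.start_job_run(
        JobName="orders-etl",  # assumed Glue job name
        Arguments={
            "--input_path": f's3://{record["bucket"]["name"]}/{record["object"]["key"]}',
        },
    )
    return {"JobRunId": run["JobRunId"]}
```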
Posted 1 month ago
4.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
We are in search of a Senior Python Developer to build customer-centric applications that prioritize functionality and efficiency. The role will involve active participation in all stages of the software development lifecycle.
Responsibilities:
- Developing high-quality, scalable software solutions
- Designing individual modules in a project
- Ensuring code quality, security, and performance through rigorous testing and review
- Optimizing code to enhance efficiency and maintainability
- Staying up-to-date on industry trends, emerging technologies, and best practices
- Taking ownership of projects from conception to delivery, ensuring successful outcomes
Required skills and experience:
- Python Programming: Strong proficiency in Python programming and a proven track record of successful projects.
- Java Programming: Understanding of Java programming.
- Databases: Expertise in relational SQL databases, with the ability to design and model for project modules and to develop and optimize stored procedures.
- Web Development: Experience with Python web frameworks such as Django and Flask.
- API Development: Experience with designing and implementing RESTful APIs.
- Cloud Platforms: Familiarity with cloud platforms and containerization technologies (e.g., AWS, Docker, and Kubernetes).
- DevOps: Experience with CI/CD pipelines and tools.
- Software Engineering: Understanding of SDLC principles and methodologies.
- Problem-solving: Ability to analyze problems and find solutions.
- Communication: Strong communication and collaboration skills to work with cross-functional teams.
Mandatory skill sets: Python, SQL, Python Programming, Web Development
Preferred skill sets: Python, SQL, Python Programming, Web Development
Years of experience required: 4 to 8 years
Education qualification: Graduate Engineer or Management Graduate
Degrees/Field of Study required: Bachelor of Engineering, Bachelor in Business Administration
Required Skills: Python (Programming Language), Structured Query Language (SQL)
Additional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
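As a sketch of the RESTful API work this role mentions, here is a minimal Flask service. The /items resource and in-memory store are illustrative assumptions; a real module would sit behind a SQL database as the posting describes.

```python
# Minimal Flask REST sketch; resource names and storage are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
_items: dict[int, dict] = {}   # stand-in for a SQL-backed repository

@app.get("/items/<int:item_id>")
def get_item(item_id: int):
    item = _items.get(item_id)
    return (jsonify(item), 200) if item else (jsonify(error="not found"), 404)

@app.post("/items")
def create_item():
    payload = request.get_json(force=True)
    item_id = len(_items) + 1
    _items[item_id] = {"id": item_id, **payload}
    return jsonify(_items[item_id]), 201

if __name__ == "__main__":
    app.run(debug=True)   # development server only
```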
Posted 1 month ago
1.0 - 4.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We are looking for a dynamic, self-starter Senior BIE for the IES Shopping Analytics and Science Team (AST) to guide the Amazon Bazaar program in India with analytics and data-driven insights. You will be working in one of the world's largest and most complex data warehouse environments. You must have a track record of churning out insights and making actionable recommendations to audiences of varying technical aptitude that directly impact organizational strategic decisions and priorities. The ability to thrive in an ambiguous, fast-moving environment and to prioritize work is essential, as is a mind for innovation and learning through rapidly evolving and new technologies. This role provides an opportunity to develop original ideas, approaches, and solutions in a competitive and ever-changing business climate.
Responsibilities:
- Conduct deep dive analyses of business problem statements and formulate conclusions and recommendations to leadership
- Share written recommendations and insights for key stakeholders that will help shape organizational strategic decisions and priorities
- Contribute to the design, implementation, and delivery of BI solutions for complex and ambiguous problems
- Simplify and automate reporting, audits, and other data-driven activities
- Partner with other BIEs to enhance data infrastructure, data availability, and broad access to customer insights
- Develop and drive best practices in data integrity, consistency, analysis, validations, and documentation
- Learn new technology and techniques to meaningfully support internal stakeholders and process innovation
About the team: IES Shopping Analytics and Science Team (AST) has a vision to embed a data culture deeply in our IES Shopping Experience organization, fostering invention through insights, and building a robust data architecture to support business needs. We spin the insights flywheel by growing a pool of bar-raisers and diverse data professionals, which empowers us to continuously enhance our data capabilities, holistically covering the disciplines of Data Engineering, Business Intelligence, Analytics, and Machine Learning.
Qualifications:
- 10+ years of professional or military experience
- 8+ years of SQL experience
- Experience programming to extract, transform and clean large (multi-TB) data sets
- Experience with the theory and practice of design of experiments and statistical analysis of results
- Experience in scripting for automation (e.g. Python) and advanced SQL skills
- Experience with the theory and practice of information retrieval, data science, machine learning and data mining
- Experience working directly with business stakeholders to translate between data and business needs
- Experience managing, analyzing and communicating results to senior leadership
- Experience working as a BIE in a technology company
- Experience with AWS technologies
- Experience using Cloud Storage and Computing technologies such as AWS Redshift, S3, Hadoop, etc.
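For context, a small sketch of the reporting automation this role calls for: running a parameterized SQL aggregate against a warehouse (Redshift speaks the PostgreSQL protocol, so psycopg2 works) and exporting the result for a dashboard. The connection details, credentials, and table names are placeholders, not details from the posting.

```python
# Hypothetical recurring-metric extract; host, schema, and query are examples.
import csv
import psycopg2

QUERY = """
    SELECT order_date, COUNT(DISTINCT customer_id) AS buyers
    FROM analytics.orders
    WHERE order_date >= %s
    GROUP BY order_date
    ORDER BY order_date;
"""

with psycopg2.connect(host="example-cluster", dbname="dev",
                      user="report_bot", password="***", port=5439) as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY, ("2024-01-01",))   # parameterized, not interpolated
        rows = cur.fetchall()

# Dump to CSV as a simple hand-off point for a BI tool or audit trail.
with open("daily_buyers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["order_date", "buyers"])
    writer.writerows(rows)
```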
Posted 1 month ago
6.0 - 9.0 years
8 - 11 Lacs
Chennai
Work from Office
Job Title: Data Modeller - GCP
Experience: 6-9 Years
Work Type: On-site
Work Location: Chennai (Work from Client Office - Mandatory)
Job Description: We are seeking a skilled Data Modeller with strong experience in data modelling for OLTP and OLAP systems, particularly within Google Cloud Platform (GCP). The ideal candidate will be hands-on with designing efficient, scalable data architectures and have a solid grasp of performance tuning and cloud-based databases.
Key Responsibilities:
- Design and implement Conceptual, Logical, and Physical Data Models for OLTP and OLAP systems
- Apply best practices in data indexing, partitioning, and sharding for optimized performance
- Use data modelling tools (preferably DBSchema) to support and document database design
- Ensure data architecture supports near real-time reporting and application performance
- Collaborate with cross-functional teams to translate business requirements into data structures
- Work with GCP database technologies like AlloyDB, CloudSQL, and BigQuery
- Validate and improve database performance metrics through continuous optimization
Must-Have Skills:
- GCP: AlloyDB, CloudSQL, BigQuery
- Strong hands-on experience with data modelling tools (DBSchema preferred)
- Expertise in OLTP & OLAP data models, indexing, partitioning, and data sharding
- Deep understanding of database performance tuning and system architecture
Good to Have:
- Functional knowledge of the mutual fund industry
- Exposure to data governance and security best practices in the cloud
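To illustrate the partitioning and clustering practices listed above, here is a hedged sketch using the google-cloud-bigquery client. The project, dataset, and field names are hypothetical, and the mutual-fund flavour is only an example.

```python
# Sketch of a date-partitioned, clustered BigQuery table; names are examples.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("fund_id", "STRING"),
    bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.funds.transactions", schema=schema)
# Partitioning by date prunes scanned bytes; clustering orders rows within
# each partition so filters on fund_id touch fewer blocks.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="txn_date"
)
table.clustering_fields = ["fund_id"]

client.create_table(table)  # one-time DDL; raises if the table already exists
```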
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 4+ years of experience in data modeling and data architecture
- Proficiency in data modeling tools (Erwin, IBM InfoSphere Data Architect) and database management systems
- Familiarity with different data models like relational, dimensional and NoSQL databases
- Understanding of business processes and how data supports business decision making
- Strong understanding of database design principles, data warehousing concepts, and data governance practices
Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with a keen attention to detail
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously
- Knowledge of programming languages such as SQL
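As a sketch of the enterprise-search work mentioned above, here is a minimal example with the official elasticsearch-py client; the index name, document, and local endpoint are illustrative assumptions.

```python
# Hypothetical index-and-search round trip against a local Elasticsearch node.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Index a document, then force a refresh so it is immediately searchable.
es.index(index="claims", id="1",
         document={"claim_id": "1", "summary": "water damage to warehouse"})
es.indices.refresh(index="claims")

# Full-text match query; scoring ranks the best lexical matches first.
hits = es.search(index="claims",
                 query={"match": {"summary": "warehouse damage"}})
for hit in hits["hits"]["hits"]:
    print(hit["_id"], hit["_score"], hit["_source"]["summary"])
```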
Posted 1 month ago
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Job Description
We are seeking a highly skilled and experienced Data Architect to design, implement, and manage the data infrastructure. As a Data Architect, you will play a key role in shaping the data strategy, ensuring data is accessible, reliable, and secure across the organization. You will work closely with business stakeholders, data engineers, and analysts to develop scalable data solutions that support business intelligence, analytics, and operational needs.
Key Responsibilities:
- Design and implement effective database solutions (on-prem / cloud) and data models to store and retrieve data for various applications within the FinCrime domain
- Develop and maintain robust data architecture strategies aligned with business objectives
- Define data standards, frameworks, and governance practices to ensure data quality and integrity
- Collaborate with data engineers, software developers, and business stakeholders to integrate data systems and optimize data pipelines
- Evaluate and recommend tools and technologies for data management, warehousing, and processing
- Create and maintain documentation related to data models, architecture diagrams, and processes
- Ensure data security and compliance with relevant regulations (e.g., GDPR, HIPAA, CCPA)
- Participate in capacity planning and growth forecasting for the organization's data infrastructure
- Through various POCs, assess and compare multiple tooling options and deliver use-cases based on an MVP model as per expectations
Requirements
Experience:
- 10+ years of experience in data architecture, data engineering, or related roles
- Proven experience with relational and NoSQL databases
- Experience with FinCrime domain applications and reporting
- Strong experience with ETL tools, data warehousing, and data lake solutions
- Familiarity with other data technologies such as Spark, Kafka, and Snowflake
Skills:
- Strong analytical and problem-solving skills
- Proficiency in data modelling tools (e.g., ER/Studio, Erwin)
- Excellent understanding of database management systems and data security
- Knowledge of data governance, metadata management, and data lineage
- Strong communication and interpersonal skills to collaborate across teams
- Subject matter expertise within FinCrime
Preferred Qualifications
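To illustrate the Spark-plus-Kafka pipeline pattern named in the requirements, here is a minimal Structured Streaming sketch. The broker address, topic, and storage paths are placeholders, and running it requires the spark-sql-kafka connector package on the classpath.

```python
# Hypothetical Kafka -> data lake stream; all endpoints and paths are examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fincrime_txn_stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)

# Kafka delivers raw bytes; cast the value and keep the broker timestamp.
events = stream.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("ingested_at"),
)

# The checkpoint directory gives exactly-once file output across restarts.
(events.writeStream
    .format("parquet")
    .option("path", "/datalake/fincrime/transactions/")
    .option("checkpointLocation", "/checkpoints/fincrime_txn/")
    .start()
    .awaitTermination())
```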
Posted 1 month ago
2.0 - 6.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About The Job
Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients' daily lives, wherever they live, and enabling them to enjoy a healthier life.
As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI), with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health.
The R&D Data Office serves as a cornerstone to this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We sit in partnership with Business and Digital, and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain. As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in the development of sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).
Position Summary
The R&D Data Modeler develops conceptual and logical data models for initiatives, programs, and cross-R&D capabilities. This role is critical for the creation of data models and ontologies, and the upkeep of models even beyond the conclusion of a project. Data modelers will apply and assist in the definition and governance of data modelling and design standards, tools, best practices, and related development of any R&D data capability.
Main Responsibilities:
- Engage in data management and analytics projects; understand, advise, and execute on data flow optimization (e.g., data capture, integration and use across R&D)
- Understand the data-related needs for various cross-R&D capabilities (e.g., data catalog, master data management, etc.) and associated initiatives
- Design conceptual and logical data models and data dictionaries/ontologies to cater to R&D business needs and functional requirements; lead validation of physical data models
- Interact with business, R&D Digital, and other data collaborators to translate needs into data solutions
- Understand market trends for data modelling tools and metadata management capabilities; provide input into the selection of tools and any necessary migration into the company's environment
- Understand data governance policies, standards and procedures for R&D data
- Serve as point of contact for data integration topics within solutions (e.g., technology), from source systems to data consumers; define process, tools, and testing requirements
- Maintain modelling and naming standards, and data harmonization, metadata management, and source-to-target data mapping documentation
- Evaluate and influence projects while serving as "voice of business"; map systems/interfaces for data management, set standards for future state, and close the gap from current to future state
- Serve as technical and data consultant for R&D initiatives with major platform/technology implementations
- Maintain documentation and act as an expert on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for R&D functions
- Educate and guide R&D teams on standards and information management principles, methodologies, best practices, etc.
Deliverables:
- Conduct requirements gathering from business analysts, data scientists, and other stakeholders
- Formulate strategies and query optimizations to enhance data retrieval speed
- Develop complex and scalable data models aligned to the organization's long-term strategic goals
- Formulate data governance frameworks, policies, and standards
- Establish best practices for data modeling, ensuring interoperability among systems, applications and data sources
About you
Experience: 5+ years of experience in business data management, information architecture, technology or another related field
Functional skills:
- Demonstrated ability to understand end-to-end data use and business needs
- Knowledge of R&D data and data domains (e.g., across research, clinical, regulatory, etc.)
- Experience with creating and applying data modelling best practices and naming conventions
- Strong analytical problem-solving skills
- Demonstrated strong attention to detail, quality, time management and customer focus
- Excellent written and oral communication skills
- Strong networking, influencing and negotiating skills and superior problem-solving skills
- Demonstrated willingness to make decisions and to take responsibility for them
- Excellent interpersonal skills (team player)
Technical skills:
- Experience with data management practices and technologies (e.g., Collibra, Informatica, etc.)
- Familiarity with databases (relational, dimensional, NoSQL, etc.) and concepts of data integrity
- Strong knowledge of data architecture (e.g., data lake, data virtualization, hubs, etc.) and modelling (e.g., 3NF, etc.) is required
- Experience in big data infrastructures (e.g., Hadoop, NoSQL, etc.)
- Experience with SDLC and pharma R&D platforms; experience with requirements gathering, system design, and validation/quality/compliance requirements
- Experience managing technology and/or data warehouse projects
- Familiarity with relational databases and entity-relationship data modelling
- Experience with hierarchical data models from conceptualization to database optimization
Education: Bachelor's in Computer Science, Engineering, Mathematics, Statistics, or related; Master's preferred
Languages: English
Pursue Progress. Discover Extraordinary.
Progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.
Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.
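As a small illustration of moving from a conceptual entity-relationship design to a logical model of the kind this role produces, here is a SQLAlchemy sketch; the Study/Subject entities are hypothetical examples, not Sanofi's actual models.

```python
# Hypothetical "a Study has many Subjects" design rendered as a logical model.
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Study(Base):
    __tablename__ = "study"
    study_id = Column(Integer, primary_key=True)
    protocol_code = Column(String(32), unique=True, nullable=False)
    subjects = relationship("Subject", back_populates="study")

class Subject(Base):
    __tablename__ = "subject"
    subject_id = Column(Integer, primary_key=True)
    # The foreign key makes the one-to-many relationship explicit and enforced.
    study_id = Column(Integer, ForeignKey("study.study_id"), nullable=False)
    status = Column(String(16), nullable=False, default="screened")
    study = relationship("Study", back_populates="subjects")
```

The same structure could equally be documented in a modelling tool; code-first models simply keep the data dictionary and the physical schema from drifting apart.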
Posted 1 month ago