
1769 Data Architecture Jobs - Page 11

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 - 8.0 years

8 - 13 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products.

Qualifications: Minimum 4-8 years of experience with Java, Spring Boot, and RESTful web service client/server development. Experience with OpenShift and Angular. Experience with SQL, PL/SQL, and Oracle databases (nice to have). Knowledge of code quality inspection tools, dependency management systems, and software vulnerability detection and remediation. Familiarity with Agile development and sprint ceremonies. Must be detail-oriented, with excellent verbal and written communication skills. Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred).

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 12 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products.

Qualifications: Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred). Skills: Java/J2EE, Spring, Spring Boot, microservices, SQL Server, stored procedures, Linux, OpenShift, Jenkins, Git.

Posted 1 week ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Mandatory skills: full-stack development with .NET Core (or above) and Angular 14/16. Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products.

Qualifications: Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred).

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products.

Qualifications: Minimum 8-12 years of experience with Java, Spring Boot, and RESTful web service client/server development. Expertise in SQL, PL/SQL, and Oracle databases. Expertise in AMQ. Nice to have: OpenShift, Azure DevOps Server, Jenkins CI/CD pipelines. Knowledge of code quality inspection tools, dependency management systems, and software vulnerability detection and remediation. Familiarity with Agile development and sprint ceremonies. Must be detail-oriented, with excellent verbal and written communication skills. Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred).

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Senior QA Automation Engineer with 5+ years of automation experience. Ability to create scenarios and work independently with some help. Coding experience using .NET or Java is a must. Experience querying and creating data in SQL/DB2. Experience with Selenium, Agile, Azure DevOps, and JMeter. Languages known: Java or .NET. Good communication skills. Experience working with GCP and/or OpenShift. Experience creating positive and negative scenarios for automated testing. Ability to work independently. Knowledge of MS SQL and/or DB2 SQL. Familiarity with API testing, REST, Java, .NET, Azure DevOps, Jenkins, security testing, and JMeter.

Qualifications: Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred).
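The "positive and negative scenarios for automated testing" this posting asks for can be sketched with Python's standard unittest module (a generic illustration only; the validate_loan_amount rule and its limits are invented, not part of the posting):

```python
import unittest

def validate_loan_amount(amount):
    """Hypothetical rule under test: amounts must be positive and at most 1,000,000."""
    if not isinstance(amount, (int, float)) or isinstance(amount, bool):
        raise TypeError("amount must be a number")
    return 0 < amount <= 1_000_000

class TestLoanAmount(unittest.TestCase):
    # Positive scenarios: valid inputs are accepted.
    def test_accepts_typical_amount(self):
        self.assertTrue(validate_loan_amount(250_000))

    def test_accepts_boundary_maximum(self):
        self.assertTrue(validate_loan_amount(1_000_000))

    # Negative scenarios: invalid inputs are rejected or raise.
    def test_rejects_zero_and_negative(self):
        self.assertFalse(validate_loan_amount(0))
        self.assertFalse(validate_loan_amount(-5))

    def test_rejects_over_limit(self):
        self.assertFalse(validate_loan_amount(1_000_001))

    def test_rejects_non_numeric(self):
        with self.assertRaises(TypeError):
            validate_loan_amount("a lot")
```

Run with `python -m unittest`; in the posting's stack the same pattern would sit behind Selenium page objects or API clients rather than a local function.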

Posted 1 week ago

Apply

6.0 - 8.0 years

9 - 14 Lacs

Chennai

Work from Office

This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.

Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products.

Qualifications: Minimum 6-8 years of experience with Java, Spring Boot, and RESTful web service client/server development. Proficient with SQL, PL/SQL, and Oracle databases. Proficient with AMQ. Nice to have: OpenShift, Azure DevOps Server, Jenkins CI/CD pipelines. Knowledge of code quality inspection tools, dependency management systems, and software vulnerability detection and remediation. Familiarity with Agile development and sprint ceremonies. Must be detail-oriented, with excellent verbal and written communication skills. Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred).

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Coursera was launched in 2012 by Andrew Ng and Daphne Koller, with a mission to provide universal access to world-class learning. It is now one of the largest online learning platforms in the world, with 175 million registered learners as of March 31, 2025. Coursera partners with over 350 leading universities and industry leaders to offer a broad catalog of content and credentials, including courses, Specializations, Professional Certificates, and degrees. Coursera's platform innovations enable instructors to deliver scalable, personalized, and verified learning experiences to their learners. Institutions worldwide rely on Coursera to upskill and reskill their employees, citizens, and students in high-demand fields such as GenAI, data science, technology, and business. Coursera is a Delaware public benefit corporation and a B Corp. Join us in our mission to create a world where anyone, anywhere can transform their life through access to education. We're seeking talented individuals who share our passion and drive to revolutionize the way the world learns. At Coursera, we are committed to building a globally diverse team and are thrilled to extend employment opportunities to individuals in any country where we have a legal entity. We require candidates to possess eligible working rights and have a compatible timezone overlap with their team to facilitate seamless collaboration. Coursera has a commitment to enabling flexibility and workspace choices for employees. Our interviews and onboarding are entirely virtual, providing a smooth and efficient experience for our candidates. As an employee, we enable you to select your main way of working, whether it's from home, one of our offices or hubs, or a co-working space near you.
Job Overview: Does architecting high-quality and scalable data pipelines powering business-critical applications excite you? How about working with cutting-edge technologies alongside some of the brightest and most collaborative individuals in the industry? Join us in our mission to bring the best learning to every corner of the world! We're looking for a passionate and talented individual with a keen eye for data to join the Data Engineering team at Coursera! Data Engineering plays a crucial role in building a robust and reliable data infrastructure that enables data-driven decision-making, as well as various data analytics and machine learning initiatives within Coursera. In addition, Data Engineering today owns many external-facing data products that drive revenue and boost partner and learner satisfaction. You firmly believe in Coursera's potential to make a significant impact on the world, and align with our core values: Learners first: Champion the needs, potential, and progress of learners everywhere. Play for team Coursera: Excel as an individual and win as a team. Put Coursera's mission and results before personal goals. Maximize impact: Increase leverage by focusing on things that produce bigger results with less effort. Learn, change, and grow: Move fast, take risks, innovate, and learn quickly. Invite and offer feedback with respect, courage, and candor. Love without limits: Celebrate the diversity and dignity of every one of our employees, learners, customers, and partners.

Your responsibilities: Architect scalable data models and construct high-quality ETL pipelines that act as the backbone of our core data lake, with cutting-edge technologies such as Airflow, DBT, Databricks, Redshift, and Spark. Your work will lay the foundation for our data-driven culture. Design, build, and launch self-serve analytics products. Your creations will empower our internal and external customers, providing them with rich insights to make informed decisions.
Be a technical leader for the team. Your guidance in technical and architectural designs for major team initiatives will inspire others. Help shape the future of Data Engineering at Coursera and foster a culture of continuous learning and growth. Partner with data scientists, business stakeholders, and product engineers to define, curate, and govern high-fidelity data. Develop new tools and frameworks in collaboration with other engineers. Your innovative solutions will enable our customers to understand and access data more efficiently, while adhering to high standards of governance and compliance. Work cross-functionally with product managers, engineers, and business teams to enable major product and feature launches.

Your skills: 5+ years of experience in data engineering with expertise in data architecture and pipelines. Strong programming skills in Python. Proficient with relational databases, data modeling, and SQL. Experience with big data technologies (e.g., Hive, Spark, Presto). Familiarity with batch and streaming architectures preferred. Hands-on experience with some of: AWS, Databricks, Delta Lake, Airflow, DBT, Redshift, Datahub, Elementary. Knowledgeable on data governance and compliance best practices. Ability to communicate technical concepts clearly and concisely. Independence and a passion for innovation and learning new technologies.

If this opportunity interests you, you might like these courses on Coursera: Big Data Specialization, Data Warehousing for Business Intelligence, IBM Data Engineering Professional Certificate. #LI-SP2

Coursera is an Equal Employment Opportunity Employer and considers all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, age, marital status, national origin, protected veteran status, disability, or any other legally protected class. For California candidates, please review our CCPA Applicant Notice here. For our global candidates, please review our GDPR Recruitment Notice here.
#LI-Remote
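The extract-transform-load pipelines this posting centres on can be sketched, framework-free, with the standard library (an illustrative toy: in practice the stages would be Airflow tasks or DBT models, and the CSV data and table names here are invented):

```python
import csv
import io
import sqlite3

# Extract: raw enrollment events, as they might arrive from an upstream export.
RAW_CSV = """learner_id,course,completed
1,Big Data Specialization,true
2,Data Warehousing,false
1,Data Warehousing,true
"""

def extract(text):
    # Parse the raw feed into dict records.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Normalize types: the source encodes booleans as "true"/"false" strings.
    for row in rows:
        row["completed"] = row["completed"].lower() == "true"
    return rows

def load(rows, conn):
    # Land the cleaned records in the warehouse table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS enrollments "
        "(learner_id INTEGER, course TEXT, completed INTEGER)"
    )
    conn.executemany(
        "INSERT INTO enrollments VALUES (:learner_id, :course, :completed)", rows
    )
    return conn

conn = load(transform(extract(RAW_CSV)), sqlite3.connect(":memory:"))
completions = conn.execute(
    "SELECT learner_id, SUM(completed) FROM enrollments "
    "GROUP BY learner_id ORDER BY learner_id"
).fetchall()
```

The same three-stage shape scales up directly: each function becomes an orchestrated task, and the in-memory database becomes Redshift or a Delta Lake table.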

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Hyderabad

Work from Office

At Intercontinental Exchange (ICE), we engineer technology, exchanges, and clearing houses that connect companies around the world to global capital and derivative markets. With a leading-edge approach to developing technology platforms, we have built market infrastructure in all major trading centers, offering customers the ability to manage risk and make informed decisions globally. By leveraging our core strengths in technology, we continue to identify new ways to serve our customers and transform global markets. AIP Suites (Data Modernization to Snowflake) builds an analytics-ready data architecture where data from source systems such as PDM (Product Data Management) and RDO is ingested into Snowflake for centralized storage and modeling. These models support ICE BI, which consumes Snowflake data for analytics and dashboarding. This design ensures clean separation between raw ingestion, transformation, analytics, and service-based consumption, supporting scalable and future-proof data-driven operations. ICE Mortgage Technology is seeking a Data Engineer who will be responsible for designing and optimizing SQL queries, developing stored procedures, and participating in the migration and modernization of legacy applications to support IMT (ICE Mortgage Technology) products. The candidate should have a strong background in SQL and stored procedures.

Responsibilities: Provides Snowflake-based data warehouse design and development for projects involving new data integration, migration, and enhancement of existing pipelines. Designs and develops data transformation logic using SQL, Snowflake stored procedures, and Python-based scripts for ETL/ELT workloads. Builds and maintains robust data pipelines to support reporting, analytics, and application data needs. Creates and maintains Snowflake objects like tables, views, streams, tasks, file formats, and external stages.
Participates in project meetings with data engineers, analysts, business users, and product owners to understand and implement technical requirements. Writes technical design documentation based on business requirements and data architecture principles. Develops and/or reviews unit testing protocols for SQL scripts, procedures, and data pipelines using automation frameworks. Completes documentation and procedures for pipeline deployment, operational handover, and monitoring. May mentor or guide junior developers and data engineers. Stays current with Snowflake features, best practices, and industry trends in cloud data platforms. Performs additional related duties as assigned.

Knowledge and Experience: Bachelor's Degree or the equivalent combination of education, training, or work experience. 5+ years of professional experience in data engineering or database development. Strong hands-on experience writing complex SQL queries and stored procedures; with database stored procedures, functions, views, and schema design; and with Streams, Tasks, Time Travel, and Cloning. Proficiency in database performance tuning and optimization (clustering, warehouse sizing, caching, etc.). Experience configuring external stages to integrate with cloud storage (AWS S3, Azure Blob, etc.). Experience writing Python/shell scripts for data processing (where needed). Knowledge of Snowflake and Tidal is an added advantage. Proficiency in using Git and working within Agile/Scrum SDLC environments. Familiarity working in a Software Development Life Cycle (SDLC) leveraging Agile principles. Excellent analytical, decision-making, and problem-solving skills. Ability to multitask in a fast-paced environment with a focus on timeliness, documentation, and communication with peers and business users. Strong verbal and written communication skills to engage both technical and non-technical audiences at various organizational levels.
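The raw-ingestion → transformation → consumption layering this posting describes can be sketched with sqlite3 standing in for Snowflake (all table and column names are invented; Snowflake-specific objects like streams and tasks have no stdlib equivalent):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Raw layer: land source rows untouched (stand-in for a Snowflake stage/stream).
conn.execute("CREATE TABLE raw_products (id INTEGER, name TEXT, price_cents INTEGER)")
conn.executemany("INSERT INTO raw_products VALUES (?, ?, ?)",
                 [(1, "rate lock", 19900), (2, "appraisal", 45000)])

# Transformation layer: a modeled table derived from raw; in Snowflake this might
# be a stored procedure or task writing into a curated schema.
conn.execute("""
    CREATE TABLE products AS
    SELECT id, name, price_cents / 100.0 AS price_usd FROM raw_products
""")

# Consumption layer: a reporting view, the only object BI tools would query.
conn.execute(
    "CREATE VIEW product_report AS "
    "SELECT name, price_usd FROM products ORDER BY price_usd DESC"
)

report = conn.execute("SELECT * FROM product_report").fetchall()
```

Keeping each layer as a separate object is what gives the "clean separation between raw ingestion, transformation, analytics, and service-based consumption" the posting highlights.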

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Visakhapatnam

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities: Provide technical leadership across Big Data and Python-based projects. Architect, design, and implement scalable data pipelines and processing systems. Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization. Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions. Conduct code reviews and mentor junior engineers to improve code quality and skills. Evaluate and implement new tools and frameworks to enhance data capabilities. Troubleshoot complex data-related issues and support production deployments. Ensure compliance with data security and governance standards.
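The "scalable data pipelines and processing systems" at the heart of this role can be illustrated with a tiny generator-based pipeline that streams records lazily and loads them in fixed-size batches (a framework-free sketch; the source, quality rule, and batch size are invented):

```python
from itertools import islice

def read_records(n):
    # Stand-in source: a real pipeline would stream from Kafka, HDFS, etc.
    for i in range(n):
        yield {"id": i, "value": i * 10}

def clean(records):
    # Transformation stage: drop records failing an (invented) quality rule.
    return (r for r in records if r["value"] % 20 == 0)

def batched(records, size):
    # Group records into fixed-size batches for efficient bulk loads.
    it = iter(records)
    while chunk := list(islice(it, size)):
        yield chunk

batches = list(batched(clean(read_records(10)), size=3))
```

Because every stage is a generator, nothing is materialized until the final load, which is the same bounded-memory property a distributed framework like Spark provides at scale.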

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Surat

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities: Provide technical leadership across Big Data and Python-based projects. Architect, design, and implement scalable data pipelines and processing systems. Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization. Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions. Conduct code reviews and mentor junior engineers to improve code quality and skills. Evaluate and implement new tools and frameworks to enhance data capabilities. Troubleshoot complex data-related issues and support production deployments. Ensure compliance with data security and governance standards.

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Varanasi

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities: Provide technical leadership across Big Data and Python-based projects. Architect, design, and implement scalable data pipelines and processing systems. Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization. Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions. Conduct code reviews and mentor junior engineers to improve code quality and skills. Evaluate and implement new tools and frameworks to enhance data capabilities. Troubleshoot complex data-related issues and support production deployments. Ensure compliance with data security and governance standards.

Posted 1 week ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Gurugram

Remote

Job description: Data Modeler, AI/ML Enablement. Remote | Contract/Freelancer | Duration: 1 to 2 months. Start: Immediate | Experience: 8+ years. We're looking for experienced Data Modelers with a strong background in one or more industries: Telecom, Banking/Finance, Media, or Government only.

Key Responsibilities: Design conceptual/logical/physical data models. Collaborate with AI/ML teams to structure data for model training. Build ontologies, taxonomies, and data schemas. Ensure compliance with industry-specific data regulations.

Must-Have Skills & Experience: 7+ years of hands-on experience in data modeling (conceptual, logical, and physical models). Proficiency in data modeling tools like Erwin, ER/Studio, or PowerDesigner. Strong understanding of data domains like customer, transaction, network, media, or case data. Familiarity with AI/ML pipelines and an understanding of how structured data supports model training. Knowledge of data governance, quality, and compliance standards (e.g., GDPR, PCI-DSS). Ability to work independently and deliver models quickly in a short-term contract environment.
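The conceptual/logical/physical modeling progression in the responsibilities can be sketched in Python: dataclasses as a logical model, and a toy mapping that derives the physical DDL (entity and column names are invented; real work would use Erwin, ER/Studio, or PowerDesigner as the posting notes):

```python
from dataclasses import dataclass, fields

# Logical model: entities and a relationship for a hypothetical telecom
# customer domain (all names illustrative).
@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class Transaction:
    transaction_id: int
    customer_id: int  # foreign key -> Customer
    amount: float

def to_ddl(entity):
    """Derive a physical table definition from the logical model (toy type mapping)."""
    sql_types = {int: "INTEGER", str: "TEXT", float: "REAL"}
    cols = ", ".join(f"{f.name} {sql_types[f.type]}" for f in fields(entity))
    return f"CREATE TABLE {entity.__name__.lower()} ({cols})"

ddl = to_ddl(Transaction)
```

The point of the layering is that the logical model stays stable while the physical mapping (types, indexes, partitioning) varies per target platform.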

Posted 1 week ago

Apply

14.0 - 24.0 years

35 - 50 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Key Responsibilities: Platform Architecture Design: Lead the design and architecture of the digital platform, ensuring that the data infrastructure is scalable, secure, and reliable. Focus on utilizing AWS services (e.g., S3, Redshift, Glue, Lambda, Kinesis) and Databricks to build a robust, cloud-based data architecture. Data Integration & ETL Pipelines: Architect and implement ETL/ELT pipelines to integrate data from multiple sources (e.g., transactional databases, third-party services, APIs) into the platform, using AWS Glue, Databricks, and other tools for efficient data processing. Cloud Strategy & Deployment: Implement cloud-native solutions, leveraging AWS tools and Databricks for data storage, real-time processing, machine learning, and analytics. Design the platform to be cost-efficient, highly available, and easily scalable. Data Modelling: Develop and maintain data models for the platform that support business intelligence, reporting, and analytics. Ensure the data model design aligns with business requirements and the overall architecture of the platform. Machine Learning & Analytics Enablement: Work with data scientists and analysts to ensure that the architecture supports advanced analytics and machine learning workflows, enabling faster time to insights and model deployment. Data Security & Governance: Implement data governance frameworks to ensure data privacy, compliance, and security in the digital platform. Use AWS security tools and best practices to safeguard sensitive data and manage access control. Platform Performance & Optimization: Monitor and optimize platform performance, including the efficiency of data processing, data retrieval, and analytics workloads. Ensure low-latency and high-throughput data pipelines. Collaboration & Stakeholder Management: Collaborate closely with stakeholders across data engineering, data science, and business teams to align the platform architecture with business needs and evolving technological requirements.

Skills & Qualifications: Required: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 10+ years of experience in data architecture, data engineering, or a related field, with a strong background in designing scalable, cloud-based data platforms. Extensive experience with AWS services such as S3, Redshift, Glue, Lambda, Kinesis, and RDS, with a deep understanding of cloud architecture patterns. Strong proficiency in Databricks, including experience with Apache Spark, Delta Lake, and MLflow for building data pipelines, managing large datasets, and supporting machine learning workflows. Expertise in data modelling techniques, including designing star/snowflake schemas and dimensional models, and ensuring data consistency and integrity across the platform. Experience with ETL/ELT processes, integrating data from a variety of sources, and optimizing data flows for performance. Proficiency in programming languages such as Python and SQL for data manipulation, automation, and data pipeline development. Strong knowledge of data governance and security practices, including data privacy regulations (GDPR, CCPA) and tools like AWS IAM, AWS KMS, and AWS CloudTrail. Experience with CI/CD pipelines and automation tools for deployment, testing, and monitoring of data architecture and pipelines. Preferred: Experience with real-time streaming data solutions such as Apache Kafka or AWS Kinesis within the Databricks environment. Experience with data lake management, particularly using AWS Lake Formation and Databricks Delta Lake for large-scale, efficient data storage and management.
Soft Skills: Strong communication skills, with the ability to explain complex technical concepts to business leaders and stakeholders. Excellent problem-solving skills with the ability to architect complex, scalable data solutions. Leadership abilities with a proven track record of mentoring and guiding data teams. Collaborative mindset, capable of working effectively with cross-functional teams, including engineering, data science, and business stakeholders. Attention to detail, with a focus on building high-quality, reliable, and scalable data solutions.
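The star-schema expertise this posting lists can be sketched as a minimal fact/dimension pair, with sqlite3 standing in for Redshift or Databricks (table names and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes, one row per product.
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
# Fact table: measures at the grain of one sale, linked to the dimension
# by a surrogate key.
conn.execute("CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL)")

conn.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "shoes"), (2, "hats")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 2, 50.0), (1, 1, 25.0), (2, 3, 30.0)])

# A typical BI query: aggregate the facts, described by dimension attributes.
by_category = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

A snowflake schema differs only in normalizing the dimension further (e.g., a separate category table); the fact table and query pattern stay the same.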

Posted 1 week ago

Apply

13.0 - 20.0 years

20 - 25 Lacs

Mumbai

Work from Office

Tata STRIVE
Tata STRIVE is an initiative of the TCIT, aimed at actively bridging the gap between vocational education and industry needs. Tata STRIVE runs various programmes to skill youth from underprivileged backgrounds, enabling a gainful livelihood for each aspirant, differentiated by its innovations in technology, pedagogy, and methodology.

Role: Lead Strategy & Architecture

Objective: The Lead Strategy & Architecture serves as the channel that enables the organisation's vision to be implemented year on year by prioritizing the immediate annual objectives and ensuring the framework for the same is laid down.

Major Deliverables: Define the IT strategy and roadmap aligned to Tata STRIVE business plans. Conduct independent research and analysis to identify what can be added to Tata STRIVE. Prepare an annual roadmap by prioritizing specific areas of change and performance. Focus on building internal team capabilities and provide process inputs. Design a strategy document that captures the annual roadmap with support and justification for the proposed action plan. Periodically review the roadmap and ensure alignment of objectives at various levels. Introduce new technology concepts for enriching stakeholder experience. Collaborate with internal teams on creating the STRIVE business plan.

Reporting To: Head, Technology & Innovation. Location: Mumbai.

Essential Attributes: Experience managing enterprise architecture (business, information, and technology architecture) for a small or medium organization. Technical knowledge across multiple technologies, tools, and frameworks. Understanding of the skill-development ecosystem. Knowledge of economics. Social and communication skills. Team leadership and training. Project management. People skills. Strategic thinking.

Desired Attributes: Innovation and creativity. Diplomacy and patience. Listening. Mentoring.

Qualification: Bachelor's degree in engineering. A management qualification (MBA) and an enterprise architecture certification (TOGAF) are add-ons.

Desired Experience (years): 15+. No. of direct reports: 2-10

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities: Provide technical leadership across Big Data and Python-based projects. Architect, design, and implement scalable data pipelines and processing systems. Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization. Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions. Conduct code reviews and mentor junior engineers to improve code quality and skills. Evaluate and implement new tools and frameworks to enhance data capabilities. Troubleshoot complex data-related issues and support production deployments. Ensure compliance with data security and governance standards.

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making. We're looking for a candidate with 10-12 years of expertise in data science, data analysis, and visualization, who will act as a Technical Lead to a larger team within the EY GDS Data and Analytics (DnA) practice, working on various Data and Analytics projects.
Your key responsibilities include:
- Understanding of insurance domain knowledge (P&C or Life, or both)
- Being responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Overseeing and governing the expansion of existing data architecture and the optimization of data query performance via best practices
- Working independently and collaboratively
- Implementing business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Working with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Defining and governing data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identifying the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Working proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
Skills and attributes for success include:
- Strong communication, presentation, and team-building skills
- Experience in executing and managing research and analysis of companies and markets
- BE/BTech/MCA/MBA with 8-12 years of industry experience with machine learning, visualization, data science, and related offerings
- At least 4-8 years of experience in BI and Analytics
- Ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture, and data modeling
- Knowledge and experience of at least one insurance domain engagement (Life or Property and Casualty)
- Good experience using CA Erwin or other similar modeling tools
- Strong knowledge of relational and dimensional data modeling concepts
- Experience in data management analysis
- Experience with unstructured data is an added advantage
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud preferred
- Experience, interest, and adaptability to working in an Agile delivery environment

Ideally, you'll also have:
- Good exposure to any ETL tools
- Knowledge of P&C insurance
- Experience leading a team of at least 4 members
- Experience in the Insurance and Banking domains
- Prior client-facing skills; self-motivated and collaborative

What we look for:
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies - and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects.
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Cloud Architect with expertise in Azure and Snowflake, you will be responsible for designing and implementing secure, scalable, and highly available cloud-based solutions on AWS and Azure Cloud. Your role will involve utilizing your experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake Services. Additionally, you will participate in pre-sales activities, including RFP and proposal writing. Your experience with integrating various data sources with Data Warehouse and Data Lake will be crucial for this role. You will also be expected to create Data warehouses and data lakes for Reporting, AI, and Machine Learning purposes, while having a solid understanding of data modelling and data architecture concepts. Collaboration with clients to comprehend their business requirements and translating them into technical solutions that leverage Snowflake and Azure cloud platforms will be a key aspect of your responsibilities. Furthermore, you will be required to clearly articulate the advantages and disadvantages of different technologies and platforms, as well as participate in Proposal and Capability presentations. Defining and implementing cloud governance and best practices, identifying and implementing automation opportunities for increased operational efficiency, and conducting knowledge sharing and training sessions to educate clients and internal teams on cloud technologies are additional duties associated with this role. Your expertise will play a vital role in ensuring the success of cloud projects and the satisfaction of clients.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

The role covers the following support from a Business Analyst (BA) perspective. As a Business Analyst, you will be responsible for understanding data and data architecture, metrics, and their significance in the design and delivery of new dashboards. You should be able to identify and engage data owners. Your role will require you to translate business requirements into technical requirements and conduct process mapping, with experience in Risk Transformation. Additionally, you should possess knowledge of requirement gathering and BRD/FRD documentation. Relevant experience of controls is necessary to ensure the quality and completeness of the solution and its data. You should also have the ability to challenge current practices and enhance dashboard designs through innovative design and technology utilization. Collaboration with cross-line-of-business working groups to design dashboards and approve solutions will be part of your responsibilities. Demonstrable experience in designing dashboards to meet the needs of diverse user groups is essential, along with a solid understanding of risk management and knowledge of the latest dashboard technologies and trends.

Other Desirable Experience:
- Programme delivery on strategic programmes
- Interaction with executive-level management
- TOM development
- Regulatory requirements mapping and gap analysis
- Governance, Reporting, and MI/dashboarding delivery
- Knowledge of Counterparty Credit Risk and exposure calculation methodologies (simulation, aggregation, limit monitoring), and experience in implementing Modelled and Non-Modelled calculation algorithms
- Previous experience in capturing and analyzing the daily movement of EAD numbers for Financing Products, calculating counterparty credit risk
- Experience in validating counterparty exposure on a daily, monthly, and quarterly basis using various metrics, including exposure metrics (PFE, EPE, EEPE, EAD, etc.) and VaR computation using both the Internal Model Method (IMM) and standardized approaches such as CEM
- Hands-on experience of exposure calculation (EAD/PFE) at the portfolio level for both Modelled (IMM) and Non-Modelled (CEM/SA-CCR, Credit VaR, CEF) transactions
- Working knowledge of calculating and reporting default risk for traded products
- Understanding of adjustments at the counterparty level where traded product exposure (derivatives, debt, and equity financing) was found to be erroneous and material, to mitigate the impact on risk monitoring, CVA, and RWA
- Some exposure to credit risk reporting platforms and risk engines

Skills and Qualifications:
- CFA/FRM certification is a plus
- Strong analytical skills and statistical background

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

As the Senior Director - Enterprise Head of Architecture at AstraZeneca, you will be responsible for overseeing the architecture across the Enterprise Capabilities & Solutions (ECS) landscape. Your role will involve collaborating with other Heads of Architecture to align ECS architectures with AstraZeneca's business and IT strategies. You will play a key role in defining AZ standards, patterns, and roadmaps, ensuring that ECS architectures adhere to them. Additionally, you will partner with key business teams, IT Leadership, and other Heads of Architecture to shape AstraZeneca's Digital and Architectural landscape in line with business strategies and visions. Your main accountabilities will include providing Enterprise Design thinking and support to ECS, defining architecture strategies and standards for the ECS Segment, leading the development and execution of Data & Analytics specific architecture capability and strategy, managing a team of Architects across various technology domains, and driving the development of global enterprise standards for the central Architecture & Technology Strategy function. You will also be responsible for analyzing business data and AI priorities & strategies, refining architecture strategies and principles, and ensuring continuous alignment of capabilities with current business priorities and objectives. Furthermore, as a Senior Director, you will act as an authority to the architecture community and the business regarding strategic architecture decisions for ECS. You will lead the development of Architecture Roadmaps and Blueprints, contribute to multi-functional decision-making bodies, and champion ECS EA initiatives across IT and business. 
Your role will involve engaging with external architecture authorities to stay updated with the latest trends and technologies in ECS, acting as a strategic architectural advisor for senior IT and Business management, and ensuring that ECS Architectures align with Global strategies and solutions. To be successful in this role, you should have a Bachelor's degree or equivalent experience in Computer Science or a Data Management related field, extensive experience and knowledge of ECS solutions, the ability to influence and deliver strategic vision, direction, and planning, a blend of data architecture, analysis, and engineering skills, experience in known industry architectural patterns, understanding of cloud-based containerization strategies for hybrid cloud environments, and knowledge of appropriate data structure and technology based on business use case. Desirable skills/experience include a post-graduate degree in MIS or Data Management, extensive experience in a data architect role, experience in Agile data definition scrums, knowledge of Cloud Economics and Forecasting, and experience using metadata cataloguing tools. Join AstraZeneca in our unique and daring world where we redefine our ability to develop life-changing medicines through a combination of brand new science and leading digital technology platforms. Be part of our digital and data-led enterprise journey where you can innovate, take ownership, explore new solutions, experiment with innovative technology, and address challenges in a modern technology environment. If you are ready to make a difference, apply now!

Posted 1 week ago

Apply

18.0 - 22.0 years

0 Lacs

haryana

On-site

You are a highly motivated and experienced Enterprise Architect with a strong focus on Product Engineering, Product Development, and Cloud Native Product Architecture. You will play a critical role in shaping the technical vision and architecture of our product portfolio, ensuring alignment with business strategy and long-term scalability. Collaborating with engineering leads, product managers, engineers, and other stakeholders, you will define and evolve the product architecture roadmap, driving innovation and delivering exceptional customer value. Your major responsibilities will include defining and championing the overall product architecture vision, strategy, and roadmap, considering scalability, performance, security, maintainability, and cost-effectiveness. Leading the design and evolution of key product architectures, providing guidance and direction to development teams on architectural best practices and patterns will be a key aspect of your role. You should have expertise in well-architected patterns across logical and deployment views of the architecture and the ability to apply them appropriately to shape the solution. Strong depth in cloud engineering and full-stack cloud solutions is required, including front-end, back-end, APIs, microservices, workflows, and automation. Researching, evaluating, and recommending appropriate technologies, platforms, and frameworks to support product development and innovation is another important responsibility. You will collaborate effectively with product managers, engineers, business stakeholders, and other architects to ensure alignment on architectural decisions. Establishing and enforcing architectural principles, standards, and guidelines across the product development organization is crucial. You should have a good understanding of AI-driven workflows and automation, as well as data architecture, insights, reporting, and solution architecture. 
Your role will also involve identifying and addressing technical debt within the product architecture, developing strategies for its mitigation, and mentoring and coaching development teams on architectural best practices. Staying abreast of industry trends, emerging technologies, and the competitive landscape is essential to identify opportunities for innovation and improvement. Creating and maintaining architectural documentation, including high-level designs, architectural diagrams, and API specifications, will be part of your responsibilities. You should have a deep understanding of enterprise architecture principles, frameworks, and their application in large-scale product development, as well as a proven ability to define and evolve product architecture roadmaps aligning with business strategy and long-term scalability goals. Extensive experience in building and managing scalable cloud platforms, cloud engineering, cloud-native application development, API management and integration, DevOps, data platforms, AI-enabled automation, and modern full-stack development skills will be required for this role. Additionally, you should be able to evaluate and select appropriate technologies, manage technical debt, lead and mentor engineering teams, and propose pragmatic solutions. With a minimum of 18+ years of experience and credible exposure to Cloud Engineering, Cloud Native Apps and Platforms, and Enterprise Architecture, you should also have a minimum of 10+ years of architecture experience with one major Cloud Platform, preferably Azure Cloud. An architecture certification with one major Cloud Platform, especially Azure Cloud, is highly desirable. Experience within the Creative Production and Creative Technology domain, and a high-level understanding of creative processes, is also highly desirable.

Location: DGS India - Pune - Kharadi EON Free Zone
Brand: Dentsu Creative
Time Type: Full time
Contract Type: Permanent

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

maharashtra

On-site

As an experienced Data Architect with a focus on advanced analytics and Generative AI solutions, your role will involve architecting and delivering cutting-edge analytics and visualization solutions utilizing Databricks, Generative AI frameworks, and modern BI tools. You will be responsible for designing and implementing Generative AI solutions, integrating frameworks like Microsoft Copilot, and developing reference architectures for leveraging Databricks Agent capabilities. In this position, you will lead pre-sales engagements, conduct technical discovery sessions, and provide solution demos. You will collaborate with various stakeholders to align analytics solutions with business objectives and promote best practices for AI/BI Genie and Generative AI-driven visualization platforms. Additionally, you will guide the deployment of modern data architectures that integrate AI-driven decision support with popular BI tools such as Power BI, Tableau, or ThoughtSpot. Your role will also involve serving as a trusted advisor to clients, helping them transform their analytics and visualization strategies with Generative AI innovation. You will mentor and lead teams of consultants, ensuring high-quality solution delivery, reusable assets, and continuous skill development. Staying current on Databricks platform evolution, GenAI frameworks, and next-generation BI trends will be crucial for proactively advising clients on emerging innovations. To be successful in this role, you should have at least 8 years of experience in data analytics, data engineering, or BI architecture roles, with a minimum of 3 years of experience delivering advanced analytics and Generative AI solutions. Hands-on expertise with the Databricks platform, familiarity with Generative AI frameworks, and strong skills in visualization platforms are essential. Pre-sales experience, consulting skills, and knowledge of data governance and responsible AI principles are also required. 
Preferred qualifications include Databricks certifications, certifications in major cloud platforms, experience with GenAI prompt engineering, exposure to knowledge graphs and semantic search frameworks, industry experience in financial services, healthcare, or manufacturing, and familiarity with MLOps and end-to-end AI/ML pipelines. Your primary skills should include Data Architecture, with additional expertise in Power BI, AI/ML Architecture, Analytics Architecture, and BI & Visualization Development. Joining Infogain, a human-centered digital platform and software engineering company, will provide you with opportunities to work on cutting-edge projects for Fortune 500 companies and digital natives across various industries, utilizing technologies such as cloud, microservices, automation, IoT, and artificial intelligence.

Posted 1 week ago

Apply

14.0 - 18.0 years

0 Lacs

maharashtra

On-site

You will be leading the architectural design for a migration project, utilizing Azure services, SQL, Databricks, and PySpark to develop scalable, efficient, and reliable solutions. Your responsibilities will include designing and implementing advanced data transformation and processing tasks using Databricks, PySpark, and ADF. You should have a strong understanding of data integration, ETL, and data warehousing concepts. It will be essential to design, deploy, and manage Databricks clusters for data processing, ensuring performance and cost efficiency. Troubleshooting cluster performance issues when necessary is also part of your role. You will mentor and guide developers on using PySpark for data transformation and analysis, sharing best practices and reusable code patterns. Having experience in end-to-end architecture for SAS to PySpark migration will be beneficial. Documenting architectural designs, migration plans, and best practices to ensure alignment and reusability within the team and across the organization is a key aspect of this position. You should be experienced in delivering end-to-end solutions and effectively managing project execution. Collaborating with stakeholders to translate business requirements into technical specifications and designing robust data pipelines, storage solutions, and transformation workflows will be part of your responsibilities. Supporting UAT and production deployment planning is also required. Strong communication and collaboration skills are essential for this role.

Experience: 14-16 Years

Skills:
Primary Skill: Data Architecture
Sub Skill(s): Data Architecture
Additional Skill(s): ETL, Data Architecture, Databricks, PySpark

About the Company: Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. They engineer business outcomes for Fortune 500 companies and digital natives in various industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. Infogain accelerates experience-led transformation in the delivery of digital platforms. The company is a Microsoft Gold Partner and Azure Expert Managed Services Provider. Infogain, an Apax Funds portfolio company, has offices in multiple locations worldwide.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior Database Administrator at NTT DATA, you will play a crucial role in ensuring the availability, integrity, and performance of complex and critical data assets. Working closely with cross-functional teams, you will support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Your expertise will be instrumental in controlling access to database environments through permissions and privileges.

Key Responsibilities:
- Install, configure, and maintain complex database management systems (DBMS) such as Oracle, MySQL, PostgreSQL, and others.
- Collaborate with software developers/architects to design and optimize database schemas and data models.
- Write database documentation, data standards, data flow diagrams, and standard operating procedures.
- Monitor database performance, identify bottlenecks, and optimize queries for optimal performance.
- Design and implement backup and disaster recovery strategies for data availability and business continuity.
- Work with Change Control and Release Management to commission new applications and customize existing ones.
- Plan and execute database software upgrades and patches to ensure system security and up-to-date functionality.
- Implement security measures to safeguard databases from unauthorized access, breaches, and data loss.
- Conduct security audits and vulnerability assessments to maintain compliance with data protection standards.
- Collaborate with cross-functional teams to support database-related initiatives and provide technical support to end-users.

Knowledge, Skills, and Attributes:
- Proficiency in database administration tasks, SQL, database security, and backup and recovery strategies.
- Ability to monitor database performance, manage multiple projects, and communicate complex IT information effectively.
- Strong problem-solving and analytical skills to troubleshoot database-related issues.
- Familiarity with data architecture, data services, and the application development lifecycle.
- Experience working with unstructured datasets and extracting value from large datasets.

Academic Qualifications and Certifications:
- Bachelor's degree in computer science, engineering, information technology, or a related field.
- Relevant certifications such as MCSE DBA, Oracle Certified Professional, MySQL Database Administrator, or PostgreSQL Certified Professional.
- Completion of database management courses covering database administration, data modeling, SQL, and performance tuning.

Required Experience:
- Demonstrated experience as a Database Administrator within an IT organization.
- Experience with database backup and recovery practices, health assessment reports, and managing databases.

Workplace Type: Hybrid Working

About NTT DATA: NTT DATA is a trusted global innovator of business and technology services, serving Fortune Global 100 clients. Committed to innovation and long-term success, NTT DATA invests in R&D to drive organizations confidently into the digital future. With a diverse global team and extensive partner ecosystem, NTT DATA offers consulting, AI, industry solutions, and application management services. As a leading provider of digital and AI infrastructure, NTT DATA is part of the NTT Group and headquartered in Tokyo. NTT DATA is an Equal Opportunity Employer.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The AIML Architect-Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and technical vision. You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization.

Key Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms for extracting insights from large datasets.
- Optimize data storage and retrieval processes to improve performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Work closely with cross-functional teams to align data workflows with business objectives.
- Conduct technical evaluations and assessments of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship and guidance to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, especially BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience in implementing machine learning solutions in cloud environments.
- Solid programming skills in Python, Java, or Scala.
- Expertise in SQL and other query optimization techniques.
- Experience with big data workloads and distributed computing.
- Familiarity with modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Proven track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Skills: Cloud Computing, SQL Proficiency, Dataflow, AIML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools

Posted 1 week ago

Apply