
1806 Data Architecture Jobs - Page 48

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

6.0 - 11.0 years

4 - 8 Lacs

Chennai

Work from Office

Your Role: Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance. 3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. Builds skills and expertise in the software engineering discipline to reach the standard skill expectations for the applicable role, as defined in Professional Communities. 5. Collaborates and acts as a team player with other software engineers and stakeholders.

Your Profile: 4+ years of experience in data architecture, data warehousing, and cloud data solutions. Minimum 3+ years of hands-on experience with end-to-end Snowflake implementation.
Experience in developing data architecture and roadmap strategies, with the knowledge to establish data governance and quality frameworks within Snowflake. Expertise or strong knowledge of Snowflake best practices, performance tuning, and query optimisation. Experience with cloud platforms like AWS or Azure and familiarity with Snowflake's integration with these environments; strong knowledge of at least one cloud (AWS or Azure) is mandatory. Solid understanding of SQL, Python, and scripting for data processing and analytics. Experience in leading teams and managing complex data migration projects. Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders. Knowledge of new Snowflake features, AI capabilities, and industry trends to drive innovation and continuous improvement. Skills (competencies): Verbal Communication.
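Snowflake performance tuning of the kind this posting asks for often comes down to choosing clustering keys that match the filter columns of common queries. A minimal sketch (the helper function and table names are illustrative, not from the posting) that generates the corresponding Snowflake DDL:

```python
def clustered_table_ddl(table: str, columns: dict, cluster_by: list) -> str:
    """Build a Snowflake CREATE TABLE statement with a clustering key.

    `columns` maps column names to Snowflake types; `cluster_by` lists the
    columns Snowflake should use to co-locate rows into micro-partitions.
    """
    cols = ", ".join(f"{name} {dtype}" for name, dtype in columns.items())
    keys = ", ".join(cluster_by)
    return f"CREATE TABLE {table} ({cols}) CLUSTER BY ({keys})"

ddl = clustered_table_ddl(
    "sales.orders",
    {"order_id": "NUMBER", "order_date": "DATE", "region": "VARCHAR"},
    ["order_date", "region"],  # cluster on the columns most queries filter by
)
print(ddl)
```

Clustering on low-cardinality filter columns improves partition pruning, which is usually the first lever for both query speed and credit cost.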

Posted 2 months ago

Apply

9.0 - 14.0 years

10 - 15 Lacs

Hyderabad

Work from Office

The Product Owner III will be responsible for defining and prioritizing features and user stories, outlining acceptance criteria, and collaborating with cross-functional teams to ensure successful delivery of product increments. This role requires strong communication skills to effectively engage with stakeholders, gather requirements, and facilitate product demos. The ideal candidate should have a deep understanding of agile methodologies, experience in the insurance sector, and the ability to translate complex needs into actionable tasks for the development team. Key Responsibilities: Define and communicate the vision, roadmap, and backlog for data products. Manage team backlog items and prioritize based on business value. Partner with the business owner to understand needs, manage scope, and add/eliminate user stories while heavily influencing an effective strategy. Translate business requirements into scalable data product features. Collaborate with data engineers, analysts, and business stakeholders to prioritize and deliver impactful solutions. Champion data governance, privacy, and compliance best practices. Act as the voice of the customer to ensure usability and adoption of data products. Lead Agile ceremonies (e.g., backlog grooming, sprint planning, demos) and maintain a clear product backlog. Monitor data product performance and continuously identify areas for improvement. Support the integration of AI/ML solutions and advanced analytics into product offerings. Required Skills & Experience: Proven experience as a Product Owner, ideally in data or analytics domains. Strong understanding of data engineering, data architecture, and cloud platforms (AWS, Azure, GCP). Familiarity with SQL, data modeling, and modern data stack tools (e.g., Snowflake, dbt, Airflow). Excellent stakeholder management and communication skills across technical and non-technical teams.
Strong business acumen and the ability to align data products with strategic goals. Experience with Agile/Scrum methodologies and working in cross-functional teams. Ability to translate data insights into compelling stories and recommendations.

Posted 2 months ago

Apply

2.0 - 7.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum of 2+ years of related experience. Experience in modeling and business system design. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well versed in data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting with all levels of the organization. Ability to clearly communicate complex business problems and technical solutions.
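The "data warehouse schemas and OLAP techniques" this role asks for boil down to separating facts from dimensions and rolling facts up along a dimension. A plain-Python sketch (table contents are invented for illustration) of an OLAP roll-up over a tiny star schema:

```python
from collections import defaultdict

# Toy fact table (grain: one row per order line) and a dimension lookup,
# mimicking the fact/dimension split of a warehouse star schema.
dim_product = {1: "Laptop", 2: "Monitor"}
fact_sales = [
    {"product_id": 1, "qty": 2, "amount": 1800.0},
    {"product_id": 2, "qty": 1, "amount": 300.0},
    {"product_id": 1, "qty": 1, "amount": 900.0},
]

def rollup_by_product(facts, dim):
    """Aggregate the fact table to the product grain (an OLAP roll-up)."""
    totals = defaultdict(float)
    for row in facts:
        totals[dim[row["product_id"]]] += row["amount"]
    return dict(totals)

print(rollup_by_product(fact_sales, dim_product))
```

In a real warehouse the same roll-up would be a GROUP BY join between the fact and dimension tables; the in-memory version just makes the shape of the computation explicit.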

Posted 2 months ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Mumbai

Work from Office

As a Consultant, you are responsible for developing application designs and providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Configure DataStax Cassandra per the requirements of the project solution. Design the Cassandra database system in consultation with the data modelers, data architects, and ETL specialists, as well as the microservices/functional specialists, to produce an effective Cassandra database according to the solution and the client's needs and specifications. Interface with functional and data teams to ensure integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in tests or UAT that involve the DataStax Cassandra database. Ensure that the Cassandra database is performant and error-free; this involves troubleshooting and resolving errors and performance issues, as well as planning further database improvements. Ensure the database documentation and operation manual are up to date and usable. Preferred technical and professional experience: Expertise, experience, and deep knowledge in the configuration, design, and troubleshooting of NoSQL server software and related products on cloud, specifically DataStax Cassandra. Knowledge of or experience with other NoSQL/cloud databases.
Installs, configures, and upgrades RDBMS or NoSQL server software and related products on cloud.
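Cassandra data modeling, which this role centers on, is query-first: the partition key decides which node stores a row, and clustering columns decide sort order within a partition. A small sketch (the keyspace, table, and helper are hypothetical) that assembles the corresponding CQL:

```python
def cassandra_table_cql(keyspace, table, columns, partition_keys, clustering_keys):
    """Build a CQL CREATE TABLE with an explicit PRIMARY KEY layout.

    In Cassandra the partition key controls data placement across nodes;
    clustering columns control on-disk sort order within each partition.
    """
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns.items())
    if clustering_keys:
        pk = f"(({', '.join(partition_keys)}), {', '.join(clustering_keys)})"
    else:
        pk = f"({', '.join(partition_keys)})"
    return f"CREATE TABLE {keyspace}.{table} ({cols}, PRIMARY KEY {pk})"

cql = cassandra_table_cql(
    "shop", "orders_by_customer",
    {"customer_id": "uuid", "order_ts": "timestamp", "total": "decimal"},
    partition_keys=["customer_id"],   # all of a customer's orders co-located
    clustering_keys=["order_ts"],     # sorted by time within the partition
)
print(cql)
```

The table name `orders_by_customer` follows the common Cassandra convention of naming tables after the query they serve, since each access pattern typically gets its own table.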

Posted 2 months ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

Kochi

Work from Office

Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Databricks, including Spark-based ETL and Delta Lake. Good-to-have skills: PySpark. Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing Data and Analytics team. The ideal candidate will have deep expertise in Databricks and cloud data warehousing, with a proven track record of designing and building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. This role involves working collaboratively with cross-functional teams to ensure the organization leverages data as a strategic asset. Roles and Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes using Databricks and other modern tools. Architect, implement, and manage cloud-based data warehousing solutions on Databricks (Lakehouse architecture). Develop and maintain optimized data lake architectures to support advanced analytics and machine learning use cases. Collaborate with stakeholders to gather requirements, design solutions, and ensure high-quality data delivery. Optimize data pipelines for performance and cost efficiency. Implement and enforce best practices for data governance, access control, security, and compliance in the cloud. Monitor and troubleshoot data pipelines to ensure reliability and accuracy. Lead and mentor junior engineers, fostering a culture of continuous learning and innovation. Excellent communication skills; ability to work independently and alongside clients based in Western Europe. Professional and Technical Skills: 3.5-5 years of experience in Data Engineering roles with a focus on cloud platforms. Proficiency in Databricks, including Spark-based ETL, Delta Lake, and SQL. Strong experience with one or more cloud platforms (AWS preferred). Hands-on experience with Delta Lake, Unity Catalog, and Lakehouse architecture concepts.
Strong programming skills in Python and SQL; experience with PySpark a plus. Solid understanding of data modeling concepts and practices (e.g., star schema, dimensional modeling). Knowledge of CI/CD practices and version control systems (e.g., Git). Familiarity with data governance and security practices, including GDPR and CCPA compliance. Additional Information: Experience with Airflow or similar workflow orchestration tools. Exposure to machine learning workflows and MLOps. Certification in Databricks or AWS. Familiarity with data visualization tools such as Power BI. Qualification: 3.5-5 years of experience is required. Educational Qualification: Graduation.
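Delta Lake pipelines like the ones this role describes lean heavily on MERGE (upsert) semantics to keep tables idempotent across reruns. A plain-Python sketch of that pattern, with invented sample rows (in Databricks this would be a `MERGE INTO` on a Delta table rather than in-memory dicts):

```python
def upsert(target, updates, key):
    """Sketch of the MERGE (upsert) pattern behind Delta Lake tables:
    rows whose key matches an update are overwritten, new keys are inserted."""
    by_key = {row[key]: dict(row) for row in target}
    for update in updates:
        by_key[update[key]] = {**by_key.get(update[key], {}), **update}
    return sorted(by_key.values(), key=lambda row: row[key])

silver = [{"id": 1, "status": "new"}, {"id": 2, "status": "new"}]
batch = [{"id": 2, "status": "shipped"}, {"id": 3, "status": "new"}]
silver = upsert(silver, batch, "id")
print(silver)
```

Because applying the same batch twice yields the same table, the load step can be retried safely, which is the property Lakehouse pipelines rely on for reliability.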

Posted 2 months ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must-have skills: Informatica Data Quality. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Assist in the design and implementation of data architecture to support data initiatives. Monitor and optimize data pipelines for performance and reliability. Professional & Technical Skills: Must-have skills: Proficiency in Informatica Data Quality. Strong understanding of data integration techniques and ETL processes. Experience with data profiling and data cleansing methodologies. Familiarity with database management systems and SQL. Knowledge of data governance and data quality best practices. Additional Information: The candidate should have a minimum of 3 years of experience in Informatica Data Quality. This position is based at our Hyderabad office. 15 years of full-time education is required.
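Data profiling, listed here alongside Informatica Data Quality, typically starts with per-column statistics such as row, null, and distinct counts. A minimal stand-alone sketch (the rows and helper are illustrative, not tied to any Informatica API):

```python
def profile_column(rows, column):
    """Minimal column profile: row, null, and distinct-value counts —
    the kind of statistics a data-quality profiling step produces."""
    values = [row.get(column) for row in rows]
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

customers = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},       # missing value the profile should flag
    {"id": 3, "email": "a@x.com"},  # duplicate value
]
print(profile_column(customers, "email"))
```

Profiles like this drive the cleansing rules that follow: a high null count suggests a completeness rule, a low distinct count a uniqueness or reference-data rule.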

Posted 2 months ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must-have skills: Data Modeling Techniques and Methodologies. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines; ensure data quality; and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data architecture and collaborating with cross-functional teams to optimize data processes. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead data modeling initiatives to design and implement data structures. Optimize data storage and retrieval processes. Develop and maintain data pipelines for efficient data flow. Professional & Technical Skills: Must-have skills: Proficiency in Data Modeling Techniques and Methodologies. Strong understanding of database management systems. Experience with data warehousing and ETL processes. Knowledge of data governance and compliance. Hands-on experience with data visualization tools. Additional Information: The candidate should have a minimum of 12 years of experience in Data Modeling Techniques and Methodologies. This position is based at our Bengaluru office. 15 years of full-time education is required.
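One data-modeling technique this role would cover is surrogate-key assignment: giving each distinct natural key an integer key so the warehouse is decoupled from source-system identifiers. A small sketch with invented sample rows:

```python
def build_dimension(source_rows, natural_key):
    """Assign integer surrogate keys to distinct natural keys — a standard
    dimensional-modeling step that decouples the warehouse from source IDs."""
    dim, key_map = [], {}
    for row in source_rows:
        nk = row[natural_key]
        if nk not in key_map:          # first sighting of this natural key
            key_map[nk] = len(key_map) + 1
            dim.append({"sk": key_map[nk], **row})
    return dim, key_map

rows = [{"cust_code": "A12"}, {"cust_code": "B07"}, {"cust_code": "A12"}]
dim, key_map = build_dimension(rows, "cust_code")
print(dim)
```

Fact-table loads then look up `key_map` to translate source codes into surrogate keys, so source-system renumbering never ripples through the warehouse.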

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data. Must-have skills: Data Modeling Techniques and Methodologies. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Project Role: Data Architect & Modeler. Project Role Description: Model data and design, build, and lead complex ETL data integration pipelines to meet business process and application requirements. Management Level: 9. Work Experience: 6+ years. Work Location: Any. Must-have skills: Data Architecture Principles. Good-to-have skills: Data Modeling, Data Architect, Informatica PowerCenter, Informatica Data Quality, SAP BusinessObjects Data Services, SQL, PL/SQL, SAP HANA DB, MS Azure, Python, ErWin, SAP PowerDesigner. Job: Data Architect, Modeler, and Data Integration Lead. Key Responsibilities: 1) Build data models, including forward and reverse engineering. 2) Work on data and design analysis, and work with the data analyst team on data model design. 3) Prepare presentations on design, end-to-end flow, and data models. 4) Work on new and existing data models using PowerDesigner and other design tools like Visio. 5) Work with functional SMEs and BAs to review requirements and mapping documents. Technical Experience: 1) Good understanding of ETL design concepts like CDC, SCD, transpose/pivot, updates, and validation. 2) Strong understanding of SQL concepts and data warehouse concepts, with the ability to understand data both technically and functionally. 3) Good understanding of various file formats like XML, delimited, fixed width, etc. 4) Understanding of the concepts of data quality, data cleansing, and data profiling. 5) Python, other new data technologies, and cloud exposure are good to have. 6) An insurance background is a plus.
Educational Qualification: 15 years of full-time education with BE/B.Tech or equivalent. Professional & Technical Skills: Must-have skills: Proficiency in Data Modeling Techniques and Methodologies. Strong understanding of relational and non-relational database design principles. Experience with data integration and ETL processes. Familiarity with data governance and data quality frameworks. Ability to translate business requirements into technical specifications. Qualification: 15 years full-time education.
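Among the ETL design concepts the posting names, SCD (Slowly Changing Dimension) Type 2 is the one with the most moving parts: when a tracked attribute changes, the current row is expired and a new version is inserted. A plain-Python sketch with invented sample data:

```python
from datetime import date

def scd2_apply(dimension, incoming, key, tracked, today):
    """Apply Slowly Changing Dimension Type 2 changes: when a tracked
    attribute changes, expire the current row and insert a new version."""
    out = list(dimension)
    current = {row[key]: row for row in out if row["end_date"] is None}
    for row in incoming:
        cur = current.get(row[key])
        if cur is None:
            out.append({**row, "start_date": today, "end_date": None})
        elif any(cur[attr] != row[attr] for attr in tracked):
            cur["end_date"] = today  # close out the old version
            out.append({**row, "start_date": today, "end_date": None})
    return out

dim = [{"cust_id": 1, "city": "Pune",
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_apply(dim, [{"cust_id": 1, "city": "Mumbai"}],
                 key="cust_id", tracked=["city"], today=date(2024, 6, 1))
print(dim)
```

The history row keeps its original `start_date` and gains an `end_date`, so point-in-time queries can still recover what the dimension looked like on any past date (Type 1, by contrast, would overwrite the city in place).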

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data. Must-have skills: Data Building Tool. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and organizational standards, facilitating smooth data integration and accessibility across different systems. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes within the organization. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate training sessions and workshops to enhance team capabilities. Continuously evaluate and improve data modeling processes to ensure efficiency. Professional & Technical Skills: Must-have skills: Proficiency in Data Building Tool. Strong understanding of data modeling techniques and methodologies. Experience with data integration and ETL processes. Familiarity with database management systems and SQL. Ability to translate business requirements into technical specifications.
Additional Information: The candidate should have a minimum of 7.5 years of experience in Data Building Tool. This position is based in Mumbai. 15 years of full-time education is required.

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data. Must-have skills: Data Building Tool. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and organizational standards, facilitating smooth data integration and accessibility across different systems. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes within the organization. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Facilitate workshops and meetings to gather requirements and feedback from stakeholders. Develop and maintain comprehensive documentation of data models and architecture. Professional & Technical Skills: Must-have skills: Proficiency in Data Building Tool. Strong understanding of data modeling techniques and methodologies. Experience with data integration and ETL processes. Familiarity with database management systems and SQL. Ability to translate business requirements into technical specifications.
Additional Information: The candidate should have a minimum of 12 years of experience in Data Building Tool. This position is based at our Kolkata office. 15 years of full-time education is required.

Posted 2 months ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must-have skills: Data Architecture Principles. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on various data-related tasks and collaborating with teams to optimize data processes. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Develop innovative data solutions to meet business requirements. Optimize data pipelines for efficiency and scalability. Implement data governance policies to ensure data quality and security. Professional & Technical Skills: Must-have skills: Proficiency in Data Architecture Principles. Strong understanding of data modeling and database design. Experience with ETL tools and processes. Knowledge of cloud platforms and big data technologies. Good-to-have skills: Data management and governance expertise. Additional Information: The candidate should have a minimum of 12 years of experience in Data Architecture Principles. This position is based at our Bengaluru office. A 15 years full-time education is required.

Posted 2 months ago

Apply

8.0 - 10.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Job Title: S&CGN - Tech Strategy & Advisory - SAP S/4 MDG - Consultant. Management Level: 9 - Team Lead/Consultant. Location: Bengaluru, BDC7A. Must-have skills: Data Architecture. Good-to-have skills: Knowledge of emerging technologies, cloud computing, and cybersecurity best practices. Job Summary: Strong functional understanding and hands-on experience of SAP MDM/MDG, backed by implementation projects and aligned with SAP MM, SD, FICO, and PP processes. Responsible for process design and configuration; assist with testing and requirements gathering, and ultimately set up fully functional development, test, and production environments for MDG-S, MDG-C, and MDG-MM to deliver MDG objects and integration solutions. Ability to work in a customized SAP environment with integrated non-SAP interfaces. Ability to understand customer demands, challenge requirements in terms of business value and effort/complexity, and translate them into solutions. Adept at developing, delivering, and supporting analytic solutions based on business requirements. An understanding of analytical data modelling and knowledge of data models (attribute, analytical, and calculation) will be appreciated. Good knowledge of SAP services and business processes; knowledge and understanding of SAP Data Warehouse Cloud is valued. Excellent problem-solving skills and a deep understanding of processes from a business and functional perspective. Excellent verbal and written communication skills and proficiency in MS Office applications. Ability to work with clients and teams from multiple geographies. Roles & Responsibilities: Develop and execute technology transformation strategies, oversee implementation projects, and optimize digital capabilities for business efficiency. Professional & Technical Skills: Relevant experience in the required domain. Strong analytical, problem-solving, and communication skills. Ability to work in a fast-paced, dynamic environment. Additional Information: Opportunity to work on innovative projects.
Career growth and leadership exposure. About Our Company: Accenture. Qualification: Experience: 8-10 years. Educational Qualification: Any Degree.

Posted 2 months ago

Apply

5.0 - 10.0 years

6 - 7 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Position: Data Engineer - MS Fabric. Purpose of the Position: As an MS Fabric Data Engineer you will be responsible for designing, implementing, and managing scalable data pipelines. Strong experience in the implementation and management of a Lakehouse using MS Fabric and the Azure tech stack (ADLS Gen2, ADF, Azure SQL). Proficiency in data integration techniques, ETL processes, and data pipeline architectures. Well versed in data quality rules, principles, and implementation. Location: Bangalore/Pune/Nagpur/Chennai. Type of Employment: FTE. Key Result Areas and Activities: 1. Data Pipeline Development & Optimization: Design and implement data pipelines using MS Fabric. Manage and optimize ETL processes for data extraction, transformation, and loading. Conduct performance tuning for data storage and retrieval to enhance efficiency. 2. Data Quality, Governance & Documentation: Ensure data quality and integrity across all data processes. Assist in designing data governance frameworks and policies. Generate and maintain documentation for data architecture and data flows. 3. Cross-Functional Collaboration & Requirement Gathering: Collaborate with cross-functional teams to gather and define data requirements. Translate functional and non-functional requirements into system specifications. 4. Technical Leadership & Support: Provide technical guidance and support to junior data engineers. Participate in code reviews and ensure adherence to coding standards. Troubleshoot data-related issues and implement effective solutions. Technical Experience: Must Have: Proficient in MS Fabric, Azure Data Factory, and Azure Synapse Analytics, with deep knowledge of Fabric components such as Notebooks, Lakehouses, OneLake, Data Pipelines, and Real-Time Analytics. Skilled in integrating Fabric capabilities for seamless data flow, governance, and cross-team collaboration. Strong grasp of Delta Lake, Parquet, distributed data systems, and various data formats (JSON, XML, CSV, Parquet).
Experienced in ETL/ELT processes, data warehousing, data modeling, and data quality frameworks. Proficient in Python, PySpark, Scala, Spark SQL, and T-SQL for complex data transformations. Familiar with Agile methodologies and tools like JIRA, with hands-on experience in monitoring tools and job scheduling. Good To Have: Familiarity with Azure cloud platforms and cloud data services; MS Purview; open-source libraries like Deequ, PyDeequ, and Great Expectations for DQ implementation. Develop data models to support business intelligence and analytics. Experience with Power BI dashboards. Experience with Databricks. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in MS Fabric/ADF/Synapse. Qualities: Experience with or knowledge of Agile software development methodologies. Able to consult, write, and present persuasively.
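The data quality rules the posting emphasizes are usually expressed as named predicates evaluated per row, with failure counts reported per rule. A minimal stand-in (sample data invented) for what frameworks like Great Expectations or PyDeequ do at much larger scale:

```python
def run_dq_checks(rows, rules):
    """Evaluate named row-level data-quality rules and count failures
    per rule — the core loop of a data-quality validation step."""
    return {name: sum(1 for row in rows if not predicate(row))
            for name, predicate in rules.items()}

orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},   # violates non-negativity
    {"order_id": None, "amount": 40.0},  # violates not-null
]
rules = {
    "order_id_not_null": lambda row: row["order_id"] is not None,
    "amount_non_negative": lambda row: row["amount"] >= 0,
}
print(run_dq_checks(orders, rules))
```

In a pipeline, a non-zero failure count would typically quarantine the offending rows or fail the run, depending on the rule's severity.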

Posted 2 months ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Data Architect. Total years of experience: 15+. Relevant years of experience: 8+. Detailed JD (Roles and Responsibilities): Leadership qualities and the ability to lead a team of 8 data engineers plus Power BI resources. Should be able to engage with business users and IT to provide consultation on data and visualization needs. Excellent communication, articulation, and presentation skills. Exposure to data architecture and ETL architecture. Design, develop, and maintain scalable data pipelines using Python, ADF, and Databricks. Implement ETL processes to extract, transform, and load data from various sources into Snowflake. Ensure data is processed efficiently and is made available for analytics and reporting. 8+ years of experience in data engineering, with a focus on Python, ADF, Snowflake, Databricks, and ETL processes. Proficiency in SQL and experience with cloud-based data storage and processing. Strong problem-solving skills and the ability to work in a fast-paced environment. Experience with Agile methodologies and working in a collaborative team environment. Certification in Snowflake, Azure, or other relevant technologies is an added advantage. Bachelor's degree in Computer Science, Engineering, Information Systems, or an equivalent field. Mandatory skills: Python, Snowflake, Azure Data Factory, Databricks, SQL. Desired skills: 1. Strong oral and written communication. 2. Proactive and accountable for deliverable quality and timely submission. Domain: Retail. Work Location: PAN India.
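The extract-transform-load flow this role builds in Python can be sketched as three small composable stages; here the "sources" and "target" are in-memory stand-ins for what would be ADF-orchestrated reads and a Snowflake load in the actual stack:

```python
def extract(source):
    """Extract: read raw records (an in-memory stand-in for source systems)."""
    return list(source)

def transform(rows):
    """Transform: normalise field names and derive a total column."""
    return [{"sku": row["SKU"].lower(), "total": row["qty"] * row["unit_price"]}
            for row in rows]

def load(rows, target):
    """Load: append transformed rows to the target table (a list here)."""
    target.extend(rows)
    return len(rows)  # row count, useful for pipeline audit logging

raw = [{"SKU": "AB-1", "qty": 2, "unit_price": 10.0}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
```

Keeping each stage a pure function of its input is what makes pipelines like this testable and easy to rerun, regardless of whether the stages are later wired together by ADF, Databricks jobs, or Airflow.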

Posted 2 months ago

Apply

12.0 - 17.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Key Result Areas: Architect modern data and analytics solutions using best-of-breed cloud services, specifically from GCP, aligned to client needs, and drive implementation for successful delivery. Demonstrate expertise for client success through delivery support and thought leadership in cloud data architecture, with a focus on GCP data analytics services. Contribute to business growth through presales support for GCP-based solutions. Research and experiment to address unmet needs through innovation. Build reusable knowledge, expertise, and foundational components for cloud data architecture and data engineering, specifically on GCP. Grow and nurture technical talent within the Infocepts GCP community of practice. Must-Have Skills: Deep hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer. Proven ability to design enterprise-grade data lakes, warehouses, and real-time data systems. Strong command of Python and SQL for data engineering and automation tasks. Expertise in building and managing complex ETL/ELT pipelines using tools like Apache Beam or Airflow. Experience in leading teams, conducting code reviews, and engaging with senior stakeholders. Good-to-Have Skills: Familiarity with Terraform or Deployment Manager for GCP resource provisioning. Experience with Kafka, Apache Beam, or similar technologies. Knowledge of data lineage, cataloging, encryption, and compliance frameworks (e.g., GDPR, HIPAA). Exposure to integrating data pipelines with ML models and Vertex AI. Understanding of Looker, Tableau, or Power BI for data consumption. Qualifications: Overall work experience of 12+ years, with a minimum of 3 to 6 years of experience on GCP-related projects. BS degree in IT, MIS, or a business-related functional discipline. Experience with or knowledge of Agile software development methodologies.
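The real-time systems this role designs on Dataflow typically aggregate streams into fixed time windows. A plain-Python sketch of that windowing logic (sample events invented; a production version would use Apache Beam's windowing transforms rather than a dict):

```python
from collections import defaultdict

def fixed_windows(events, window_secs):
    """Group (timestamp, value) events into fixed windows and sum each —
    the core idea of a Beam/Dataflow fixed-window aggregation."""
    totals = defaultdict(int)
    for ts, value in events:
        window_start = ts - ts % window_secs  # floor to the window boundary
        totals[window_start] += value
    return dict(sorted(totals.items()))

# Events at t=0s, 5s, 65s, 70s aggregated into 60-second windows.
events = [(0, 1), (5, 2), (65, 3), (70, 4)]
print(fixed_windows(events, 60))
```

Beam adds what this sketch omits, chiefly watermarks and triggers for deciding when a window's result can be emitted despite late-arriving events.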

Posted 2 months ago

Apply

8.0 - 12.0 years

13 - 17 Lacs

Ahmedabad

Work from Office

Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Requirements:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
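The modeling responsibilities above usually come down to dimensional design: fact tables keyed to conformed dimensions. A small, self-contained sketch using SQLite (the table and column names are invented for illustration; a warehouse like Snowflake or Redshift would use the same shape):

```python
import sqlite3

# Minimal star schema: one fact table joined to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, 2, 40.0), (11, 2, 1, 25.0), (12, 1, 1, 20.0)])

# Typical analytic query: aggregate the fact, sliced by a dimension attribute.
rows = con.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.product_name ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 25.0), ('Widget', 60.0)]
```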

Posted 2 months ago

Apply

12.0 - 16.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Job Title: Principal Architect
Job Location: Hyderabad/Mohali/Kochi
Mode of Work: Hybrid

About the Role
We're looking for a seasoned Principal Architect who thrives at the intersection of client innovation and organizational growth. You'll design transformational enterprise solutions for our clients while simultaneously driving Softobiz's technical evolution, ensuring that every client engagement strengthens our capabilities and every internal innovation enhances our client value. This role demands someone who can seamlessly move between architecting solutions for enterprise clients and shaping the technology roadmap that keeps Softobiz at the industry forefront. Your expertise will directly influence both our clients' competitive advantage and our own market position.

What You'll Do
Solution Architecture & Innovation: Design and deliver enterprise-grade solutions that solve complex business challenges while continuously identifying opportunities to enhance Softobiz's service offerings and technical capabilities. Lead architectural discussions with client executives and internal leadership teams, ensuring that every solution we build advances both client success and our organizational learning.
Technical Strategy & Roadmap Execution: Develop and execute comprehensive technology strategies that serve dual purposes: solving immediate client needs while building reusable capabilities that strengthen our competitive position. Lead the implementation of Softobiz's technical roadmap by driving adoption of new technologies, establishing development standards, and ensuring successful delivery of strategic initiatives. Evaluate emerging technologies through the lens of both client value and internal advancement, then champion their adoption across the organization.
Client Engagement & Internal Leadership: Serve as the primary technical advisor for enterprise clients while mentoring internal teams on advanced architectural patterns and emerging technologies. Your client-facing experience will directly inform our internal training programs, standards development, and capability-building initiatives.
Thought Leadership & Market Intelligence: Build Softobiz's reputation as a technology innovator by translating insights from client engagements into thought leadership content, internal best practices, and new service offerings. Your deep understanding of enterprise challenges will guide both our marketing strategy and our research and development priorities.

Required Expertise
Enterprise Architecture Experience: 12+ years designing large-scale enterprise solutions with demonstrated success in client-facing roles. You should have a track record of leading complex transformations that deliver measurable business value while building lasting client relationships that drive organizational growth.
Cloud & Infrastructure Mastery: Deep expertise across major cloud platforms with experience designing hybrid and multi-cloud strategies that balance client requirements with platform optimization. Your cloud architecture decisions should reflect both immediate solution needs and long-term scalability considerations that benefit future engagements.
Distributed Systems & Integration: Extensive experience with microservices architectures, distributed systems design, and enterprise integration patterns. You'll need to solve complex integration challenges for clients while establishing reusable patterns and frameworks that accelerate future project delivery.
Data & AI Integration: Proven ability to architect modern data platforms and integrate AI/ML capabilities into enterprise applications. Your experience should span both traditional data warehouse modernization and cutting-edge AI implementations, with an understanding of how these capabilities can be packaged into repeatable service offerings.
Technology Leadership & Execution: Experience leading technical teams and driving successful implementation of architectural standards in both client and internal contexts. You should be comfortable presenting to C-level executives while also leading change management initiatives, establishing development processes, and ensuring successful adoption of new technologies across teams within our organization.

The Impact You'll Create
Your architectural decisions will directly influence enterprise client outcomes while simultaneously building Softobiz's reputation as the premier technology partner for enterprise transformation. Every solution you design becomes a case study that enhances our market position, and every internal process you improve strengthens our ability to deliver exceptional client value. You'll work with cutting-edge technologies on high-stakes projects while helping shape the future direction of both client enterprises and Softobiz itself. Your expertise will drive revenue growth through exceptional client delivery and operational excellence through continuous organizational improvement.

What We're Looking For
Strategic Thinking: Ability to see beyond immediate project requirements to identify opportunities for long-term value creation, both for clients and for Softobiz.
Technical Excellence: Deep expertise across modern technology stacks with a proven ability to make architectural decisions that scale and evolve with changing business needs.
Client Focus: Experience building trust with enterprise stakeholders and translating complex technical concepts into clear business value propositions.
Innovation Mindset: Passion for emerging technologies coupled with the judgment to evaluate their practical application in enterprise contexts.
Execution Excellence: Proven track record of not just designing strategies but successfully implementing them, with experience leading cross-functional teams through complex technology transformations and organizational change initiatives.

Qualifications
- Bachelor's degree in Computer Science or a related field; advanced degree preferred
- 12+ years of enterprise architecture experience with at least 5 years in client-facing roles
- Deep expertise in cloud platforms, distributed systems, microservices, and enterprise integration
- Experience with data architecture, AI/ML integration, and modern development practices
- Strong communication skills with experience presenting to executive audiences
- Industry certifications in relevant technology platforms

Why Join Softobiz
You'll have the opportunity to work with industry-leading clients while helping build the technology organization that will define the next generation of enterprise solutions. Your expertise will directly impact both immediate client success and long-term organizational growth, creating a unique career experience that combines the excitement of cutting-edge client work with the satisfaction of building something lasting. We offer competitive compensation, comprehensive benefits, flexible work arrangements, and significant opportunities for professional growth and industry recognition.

Posted 2 months ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: Cloud Data Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Solutions Architect - Lead, you will analyze, design, code, and test multiple components of application code. You will perform maintenance, enhancements, and/or development work, contributing to the overall success of the projects.

Roles & Responsibilities:
- Design and develop the overall architecture of our digital data platform using AWS services.
- Create and maintain cloud infrastructure designs and architectural diagrams.
- Collaborate with stakeholders to understand business requirements and translate them into scalable AWS-based solutions.
- Evaluate and recommend AWS technologies, services, and tools for the platform.
- Ensure the scalability, performance, security, and cost-effectiveness of the AWS-based platform.
- Lead and mentor the technical team in implementing architectural decisions and AWS best practices.
- Develop and maintain architectural documentation and standards for AWS implementations.
- Stay current with emerging AWS technologies, services, and industry trends.
- Optimize existing AWS infrastructure for performance and cost.
- Implement and manage disaster recovery and business continuity plans.

Professional & Technical Skills:
- Minimum 8 years of experience in IT architecture, with at least 5 years in a solutions architect role.
- Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).
- Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
- Experience with Infrastructure as Code (e.g., CloudFormation, Terraform).
- Exposure to Continuous Integration/Continuous Deployment (CI/CD) pipelines.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Proficiency in multiple programming languages and frameworks.
- AWS Certified Solutions Architect - Professional certification required.

Additional Information:
- The candidate should have a minimum of 5 years of experience in a solutions architect role.
- This position is based at our Hyderabad office.
- 15 years of full time education is required (Bachelor of Engineering in Electronics/Computer Science, or any related stream).

Qualification: 15 years full time education
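Infrastructure as Code, one of the skills listed above, often means rendering declarative templates rather than clicking through a console. A minimal sketch of a CloudFormation template built from Python (the bucket's logical name is hypothetical; the resource type and encryption/versioning properties follow CloudFormation's documented `AWS::S3::Bucket` schema):

```python
import json

# Sketch: a minimal CloudFormation template rendered from Python.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "AnalyticsDataBucket": {  # hypothetical logical name
            "Type": "AWS::S3::Bucket",
            "Properties": {
                # Encrypt objects at rest with the S3-managed key (SSE-S3).
                "BucketEncryption": {
                    "ServerSideEncryptionConfiguration": [
                        {"ServerSideEncryptionByDefault":
                            {"SSEAlgorithm": "AES256"}}
                    ]
                },
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
}
rendered = json.dumps(template, indent=2)
print(rendered[:40])
```

The rendered JSON is what a CI/CD pipeline would pass to a stack deploy step.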

Posted 2 months ago

Apply

3.0 - 8.0 years

13 - 17 Lacs

Pune

Work from Office

Alexa+ is our next-generation assistant powered by generative AI. Alexa+ is more conversational, smarter, personalized, and gets things done. Our goal is to make Alexa+ an instantly familiar personal assistant that is always ready to help or entertain on any device. At the core of this vision is Alexa AI Developer Tech, a close-knit team that's dedicated to providing software developers with the tools, primitives, and services they need to easily create engaging customer experiences that expand the wealth of information, products, and services available on Alexa+. You will join a growing organization working on top technology using generative AI and have an enormous opportunity to make an impact on the design, architecture, and implementation of products used every day, by people you know. We're working hard, having fun, and making history; come join us!

Responsibilities:
- Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business
- Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines on distributed data processing platforms
- Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation
- Keep up to date with big data technologies; evaluate and make decisions around the use of new or existing software products to design the data architecture
- Design, build, and own all the components of a high-volume data warehouse end to end
- Provide end-to-end data engineering support for project lifecycle execution (design, execution, and risk assessment)
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources
- Own the functional and nonfunctional scaling of software systems in your ownership area
- Implement big data solutions for distributed computing

About the team
Alexa AI Developer Tech is an organization within Alexa on a mission to empower developers to create delightful and engaging experiences by making Alexa more natural, accurate, conversational, and personalized.

Qualifications:
- 3+ years of data engineering experience
- 4+ years of SQL experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
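ETL from the non-relational stores mentioned above often starts by flattening nested documents into tabular rows a warehouse can ingest. A dependency-free sketch (the record shape is hypothetical):

```python
# Sketch: flatten nested document-store records (e.g., from a document or
# key-value database) into dotted column names for warehouse ingestion.
def flatten(doc, prefix=""):
    """Recursively flatten nested dicts into a single flat row."""
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{name}."))
        else:
            row[name] = value
    return row

doc = {"id": 7, "user": {"name": "asha", "geo": {"country": "IN"}}}
print(flatten(doc))
# {'id': 7, 'user.name': 'asha', 'user.geo.country': 'IN'}
```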

Posted 2 months ago

Apply

2.0 - 3.0 years

6 - 11 Lacs

Bengaluru

Work from Office

*Please note this job is not for 2070 Health.*

About Wysa
Wysa is the world's most advanced AI-based digital companion for behavioral health. We are the global tech leader in the mental health space. We are trusted by employers, payors, healthcare providers, and government agencies because we are able to provide a scalable and low-cost solution. Being an AI-based solution, Wysa overcomes the stigma and privacy concerns that often restrict people from seeking help for their mental wellbeing.

Wysa's Mission
Wysa is on a mission to help 50 million people with their mental wellbeing by the next decade. We are working with our partners all over the globe to bring high-quality digital mental wellbeing solutions to the ones in need. Currently we have reached 6.5 million people and have saved 450 lives.

Our Tech Landscape, From 10,000 ft Above
The Wysa AI digital companion is our core product, around which we have a large suite of tools, products, and programs. We use the MERN stack heavily, and Python for our ML and NLP, and we are constantly iterating on our use of generative AI across business and end-user products as well as internal devX. We also use a plethora of other technologies based on the use case, e.g. S3-Athena-Glue-QuickSight for our business insights dashboards. Exclusively on the cloud, our servers are primarily on AWS, fronted by Cloudflare.

A few numbers about scale in our various user segments:
- B2C: a total userbase of 6 million+, spread across 100+ countries
- B2B and Enterprise Clients: 70+ clients globally
- B2G (business to government), where Wysa is part of government initiatives or is offered as a service to government employees (currently active in India, the UK, and Singapore)

We build for data residency when we work with governments and organisations in regions like India, the EU, the US, and the UK, while at the same time fulfilling our SLAs and promise of 99.9% uptime worldwide!

We take pride in:
1. Privacy and security by design
2. Heavily optimising our cloud and infra costs
3. Constantly innovating and leading the industry in using GenAI safely
4. Constantly evolving the way we reach our users, including building for voice, outreach over WhatsApp, etc.

Qualifications
- A total of 6+ years of experience, with at least 2-3 years in a leadership role leading a team of 4+ developers
- You have a knack for solving problems involved in providing the service to users at scale while ensuring 1

Posted 2 months ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full time education is required.

Qualification: 15 years full time education
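Of the baseline algorithms this role lists, simple linear regression is the most compact to illustrate: with one feature, ordinary least squares has a closed-form slope and intercept. A stdlib-only sketch with invented data:

```python
# Simple linear regression (ordinary least squares, one feature).
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # data lies on y = 2x + 1
print(slope, intercept)  # 2.0 1.0
```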

Posted 2 months ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Collibra Data Governance
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Monitor and evaluate team performance, providing constructive feedback to ensure alignment with project goals.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Collibra Data Governance.
- Strong understanding of data governance frameworks and best practices.
- Experience with data integration tools and techniques.
- Familiarity with data modeling concepts and methodologies.
- Ability to analyze and interpret complex data sets to inform decision-making.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Collibra Data Governance.
- This position is based at our Bengaluru office.
- 15 years of full time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

4.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office

Job Summary: We are looking for a highly skilled AWS Data Architect to design and implement scalable, secure, and high-performing data architecture solutions on AWS. The ideal candidate will have hands-on experience building data lakes, data warehouses, and data pipelines, along with a solid understanding of data governance and cloud security best practices.

Roles and Responsibilities:
- Design and implement data architecture solutions on AWS using services such as S3, Redshift, Glue, Lake Formation, Athena, and Lambda.
- Develop scalable ETL/ELT workflows and data pipelines using AWS Glue, Apache Spark, or AWS Data Pipeline.
- Define and implement data governance, security, and compliance strategies, including IAM policies, encryption, and data cataloging.
- Create and manage data lakes and data warehouses that are scalable, cost-effective, and secure.
- Collaborate with data engineers, analysts, and business stakeholders to develop robust data models and reporting solutions.
- Evaluate and recommend tools, technologies, and best practices to optimize data architecture and ensure high-quality solutions.
- Ensure data quality, performance tuning, and optimization for large-scale data storage and processing.

Required Skills and Qualifications:
- Proven experience with AWS data services such as S3, Redshift, Glue, etc.
- Strong knowledge of data modeling, data warehousing, and big data architecture.
- Hands-on experience with ETL/ELT tools and data pipeline frameworks.
- Good understanding of data security and compliance in cloud environments.
- Excellent problem-solving skills and the ability to work collaboratively with cross-functional teams.
- Strong verbal and written communication skills.

Preferred Skills:
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
- Experience in performance tuning and optimizing large datasets.
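Data lakes on S3 are conventionally laid out with Hive-style partition prefixes (`year=/month=/day=`) so that Athena and Glue can prune partitions instead of scanning everything. A small sketch of generating such keys (the dataset and file names are hypothetical):

```python
from datetime import date

# Sketch: Hive-style partition paths for an S3 data lake layout.
def partition_key(dataset, day, filename):
    """Build a partitioned object key like dataset/year=YYYY/month=MM/day=DD/file."""
    return (f"{dataset}/year={day.year}"
            f"/month={day.month:02d}/day={day.day:02d}/{filename}")

key = partition_key("sales_orders", date(2024, 3, 7), "part-0000.parquet")
print(key)  # sales_orders/year=2024/month=03/day=07/part-0000.parquet
```

Queries that filter on year/month/day then read only the matching prefixes, which is where most of the cost and performance wins in this layout come from.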

Posted 2 months ago

Apply

6.0 - 12.0 years

13 - 17 Lacs

Chennai

Work from Office

Are you a visionary who thrives on designing future-ready data ecosystems? Let's build the next big thing together! We're working with top retail and healthcare leaders to transform how they harness data, and we're looking for a Data Architect to guide that journey. We are looking for an experienced Data Architect with deep knowledge of Databricks and cloud-native data architecture. This role will drive the design and implementation of scalable, high-performance data platforms to support advanced analytics, business intelligence, and data science initiatives within a retail or healthcare environment.

Key Responsibilities:
- Define and implement enterprise-level data architecture strategies using Databricks.
- Design end-to-end data ecosystems including ingestion, transformation, storage, and access layers.
- Lead data governance, data quality, and security initiatives across the organization.
- Work with stakeholders to align data architecture with business goals and compliance requirements.
- Guide the engineering team on best practices in data modeling, pipeline development, and system optimization.
- Champion the use of Delta Lake, Lakehouse architecture, and real-time analytics.

Required Qualifications:
- 8+ years of experience in data architecture or solution architecture roles.
- Strong expertise in Databricks, Spark, Delta Lake, and data warehousing concepts.
- Solid understanding of modern data platform tools (Snowflake, Azure Synapse, BigQuery, etc.).
- Experience with cloud architecture (Azure preferred), data governance, and MDM.
- Strong understanding of healthcare or retail data workflows and regulatory requirements.
- Excellent communication and stakeholder management skills.

Benefits: Health insurance, accident insurance. The salary will be determined based on several factors including, but not limited to, location, relevant education, qualifications, experience, technical skills, and business needs.
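The Delta Lake work mentioned above leans heavily on upserts, which Delta expresses as `MERGE INTO ... WHEN MATCHED ... WHEN NOT MATCHED`. A dependency-free sketch of those semantics (the table rows are invented; in Databricks this would be Spark SQL or the `DeltaTable.merge` API rather than plain Python):

```python
# Upsert (MERGE) semantics sketch: update matched keys, insert new ones.
def merge(target, updates, key="id"):
    """Apply updates to target rows by key; unmatched updates are inserted."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
print(merge(target, updates))
# [{'id': 1, 'status': 'open'}, {'id': 2, 'status': 'closed'}, {'id': 3, 'status': 'open'}]
```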
Additional Responsibilities:
- Participate in OP monthly team meetings and team-building efforts.
- Contribute to OP technical discussions, peer reviews, etc.
- Contribute content and collaborate via the OP-Wiki/Knowledge Base.
- Provide status reports to OP Account Management as requested.

About us: OP is a technology consulting and solutions company, offering advisory and managed services, innovative platforms, and staffing solutions across a wide range of fields including AI, cyber security, enterprise architecture, and beyond. Our most valuable asset is our people: dynamic, creative thinkers who are passionate about doing quality work. As a member of the OP team, you will have access to industry-leading consulting practices, strategies, and technologies, and innovative training and education. An ideal OP team member is a technology leader with a proven track record of technical excellence and a strong focus on process and methodology.

Posted 2 months ago

Apply

14.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Experience required: 10-15 years. Position: Full-time. Mode: Hybrid. The candidate must be based in Pune, as the location of work is Pashan, Pune. This is a highly technical role: roughly 40 percent managerial and 60 percent technical.

Roles and responsibilities:
- Collaborate with internal teams to produce software design and architecture
- Write clean, scalable code using .NET programming languages (.NET Core and .NET Framework)
- Prepare and maintain code for various .NET applications and resolve any defects in systems
- Revise, update, refactor, and debug code
- Improve existing software
- Develop documentation throughout the software development lifecycle
- Monitor everyday activities of the system, serve as an expert on applications, and provide technical support

Preference:
- Experience: 8+ years
- Excellent communication skills
- Ability for critical thinking and creativity
- A systematic and logical approach to problem-solving; teamworking skills
- Provide expert advice to project teams on the use of integration technology, data architecture, modelling, and system architecture, including integration best practices
- Communicate project status to various levels of management
- Manage an integration/architecture roadmap and project backlog in partnership with the R&D leadership team, prioritize initiatives in line with business goals, and drive design and deployment of integration solutions that enable scalability, high availability, and reuse
- Hands-on experience with .NET/Java
- AI/Azure OpenAI knowledge is a big plus

Requirements:
- Good expertise in the MS Entity Framework/Dapper
- Proven experience as a .NET Developer
- Familiarity with the .NET Framework, SQL Server, and design/architectural patterns (Model-View-Controller (MVC))
- Familiarity with the workings of an ASP.NET Core application
- Knowledge of at least one of the .NET languages (e.g., C#)
- Familiarity with architecture styles/APIs (REST, RPC)
- Experience with alerting mechanisms for APIs in case of any failures
- Understanding of Agile methodologies
- Good troubleshooting and communication skills
- Experience with concurrent development source control (Git)

Posted 2 months ago

Apply