8.0 - 13.0 years
25 - 37 Lacs
Bengaluru
Work from Office
100% Remote Snowflake / SQL Architect
• Architect and manage scalable data solutions using Snowflake and advanced SQL, optimizing performance for analytics and reporting.
• Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data ingestion and transformation.
• Develop best practices for data security, access control, and compliance within cloud-based data environments.
• Collaborate with cross-functional teams to understand business needs and translate them into robust data architectures.
• Evaluate and integrate third-party tools and technologies to enhance the Snowflake ecosystem and overall data strategy.
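The first two bullets describe a common warehouse pattern: ingest raw records into a staging table, then materialize an aggregated reporting table. A minimal sketch, using Python's built-in sqlite3 as a stand-in for Snowflake; the table and column names are invented for illustration:

```python
import sqlite3

# Ingest-and-transform sketch. sqlite3 stands in for Snowflake here;
# raw_orders / region_sales are illustrative names, not from the posting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "south", 120.0), (2, "south", 80.0), (3, "north", 200.0)],
)

# Transformation step: roll raw events up into a reporting table,
# the kind of aggregate a warehouse pipeline would materialize nightly.
conn.execute(
    """
    CREATE TABLE region_sales AS
    SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM raw_orders
    GROUP BY region
    """
)
rows = {
    region: (orders, revenue)
    for region, orders, revenue in conn.execute(
        "SELECT region, orders, revenue FROM region_sales"
    )
}
print(rows)  # e.g. {'north': (1, 200.0), 'south': (2, 200.0)}
```

In Snowflake itself the same rollup would typically be a `CREATE TABLE ... AS SELECT` or a materialized view; the SQL text is essentially identical.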
Posted 3 weeks ago
8.0 - 10.0 years
13 - 18 Lacs
Chennai
Work from Office
Core Qualifications
- 12+ years in software/data architecture with hands-on experience.
- Agentic AI & AWS Bedrock (Must-Have): Demonstrated hands-on design, deployment, and operational experience with Agentic AI solutions leveraging AWS Bedrock and AWS Bedrock Agents.
- Deep expertise in cloud-native architectures on AWS (compute, storage, networking, security).
- Proven track record defining technology stacks across microservices, event streaming, and modern data platforms (e.g., Snowflake, Databricks).
- Proficiency with CI/CD and IaC (Azure DevOps, Terraform).
- Strong knowledge of data modeling, API design (REST/GraphQL), and integration patterns (ETL/ELT, CDC, messaging).
- Excellent communication and stakeholder-management skills; able to translate complex tech into business value.
Preferred
- Media or broadcasting industry experience.
- Familiarity with Salesforce or other enterprise iPaaS solutions.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.
Mandatory Skills: Generative AI.
Posted 3 weeks ago
8.0 - 10.0 years
17 - 22 Lacs
Bengaluru
Work from Office
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation is done based on every change request
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Provide oversight of all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions to RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
c. Collect all structured and unstructured data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously encountered errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency, ease of use, and client experience by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor

Mandatory Skills: Dataiku. Experience: 8-10 Years.
Posted 3 weeks ago
8.0 - 10.0 years
17 - 22 Lacs
Bengaluru
Work from Office
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation is done based on every change request
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Provide oversight of all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions to RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
c. Collect all structured and unstructured data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously encountered errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency, ease of use, and client experience by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor

Mandatory Skills: Data Governance. Experience: 8-10 Years.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
Siemens Energy is looking for a talented Senior MLOps Engineer to join the Digital Core team and contribute to shaping the data architecture and strategy within the organization. In this role, you will collaborate closely with stakeholders to drive developments in the Machine Learning environment. Your responsibilities will include partnering with business stakeholders, defining backlog priorities, consulting on AI/ML solutions, supporting test automation, building CI/CD pipelines, and working on PoCs/MVPs using various hyperscaler offerings.

The Siemens Energy Data Analytics & AI team is at the forefront of driving the energy transition and addressing environmental challenges. As part of this team, you will have the opportunity to work on innovative projects that reimagine the future and contribute to a sustainable world. Your impact will involve onboarding new AI/ML use cases on AWS/Google Cloud Platform, defining MLOps architecture, deploying models, working with AI/ML services like AWS SageMaker and GCP AutoML, developing PoCs/MVPs, implementing CI/CD pipelines, writing Infrastructure as Code using AWS CDK scripts, providing consultancy on AI/ML solutions, and supporting test automation and code deployment.

To excel in this role, you should have a Bachelor's degree in Computer Science, Mathematics, Engineering, Physics, or a related field (a Master's degree is a plus), AWS or GCP certifications in ML/AI, around 10 years of hands-on experience in ML/AI development and operations, expertise in the ML lifecycle and MLOps, proficiency in Python coding and Linux administration, experience with CI/CD pipelines, and familiarity with JIRA, Confluence, and the Agile delivery model. Additionally, you should possess excellent interpersonal, communication, and collaborative skills, be result-driven, and be fluent in both spoken and written English.

As part of the Data Platforms and Services organization at Siemens Energy, you will contribute to the company's mission of becoming a data-driven organization and supporting customers in transitioning to a more sustainable world through innovative technologies. Siemens Energy is a global energy technology company committed to providing sustainable, reliable, and affordable energy solutions while protecting the climate. With a diverse team of over 92,000 employees across 90+ countries, Siemens Energy is focused on decarbonization, new technologies, and energy transformation. Siemens Energy values diversity and inclusion, celebrating the unique contributions of individuals from over 130 nationalities. The company is dedicated to providing equal opportunities and encourages applications from individuals with disabilities. Join Siemens Energy in energizing society and driving positive change in the energy sector.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
The AIML Architect - Dataflow, BigQuery plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionality to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align the technical vision with business goals. Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets.
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated with industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficient in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.
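The responsibilities above center on pipeline frameworks that guarantee data quality throughout the lifecycle. A minimal pure-Python sketch of that pattern (extract, validate, transform, load), standing in for a Dataflow/Beam job; the stage names and the quality rule (required keys, non-negative amounts) are invented for illustration:

```python
# Batch-pipeline sketch: extract -> validate -> transform -> load.
# Pure-Python stand-in for a Dataflow/Beam pipeline; in a real job each
# stage would be a PTransform reading from sources such as GCS or Pub/Sub.
REQUIRED_KEYS = {"user_id", "amount"}

def extract():
    # Hypothetical source records; two deliberately violate the quality rule.
    return [
        {"user_id": 1, "amount": 10.0},
        {"user_id": 2, "amount": -5.0},   # negative amount: rejected
        {"amount": 3.0},                  # missing user_id: rejected
        {"user_id": 3, "amount": 7.5},
    ]

def validate(records):
    """Split records into (good, rejected) so bad rows never reach the sink."""
    good, rejected = [], []
    for r in records:
        if REQUIRED_KEYS <= r.keys() and r["amount"] >= 0:
            good.append(r)
        else:
            rejected.append(r)
    return good, rejected

def transform(records):
    # Example enrichment: add an integer cents field for downstream rollups.
    return [{**r, "amount_cents": round(r["amount"] * 100)} for r in records]

def load(records, sink):
    sink.extend(records)

warehouse = []
good, rejected = validate(extract())
load(transform(good), warehouse)
print(len(warehouse), len(rejected))  # 2 2
```

Keeping validation as its own stage, with rejected rows routed to a dead-letter collection instead of silently dropped, is what makes quality issues observable in production pipelines.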
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
You have over 8 years of experience and are currently based in Balewadi, Pune. You possess a strong understanding of Data Architecture and are adept at leading data-driven projects. Your expertise lies in Data Modelling paradigms such as Kimball, Inmon, Data Marts, Data Vault, and Medallion. You have hands-on experience with cloud-based data strategies, with a preference for AWS. Designing data pipelines for ETL is second nature to you: you excel at ingestion, transformation, and ensuring data quality.

Proficiency in SQL is a must, particularly PostgreSQL development, query optimization, and index design. You are skilled at working with intermediate to complex SQL and PL/pgSQL for complex warehouse workflows. Your advanced SQL capabilities include window functions such as RANK and DENSE_RANK and applying statistical concepts through SQL. You have experience with PostgreSQL extensions like PostGIS and are well-versed in writing ETL pipelines that combine Python and SQL. Familiarity with data manipulation libraries in Python such as Pandas, Polars, and DuckDB is desirable. Additionally, you have experience in designing data visualizations using tools like Tableau and Power BI.

In terms of responsibilities, you actively participate in designing and developing features within the existing Data Warehouse. You provide leadership in establishing connections between engineering, product, and analytics/data science teams. Your role involves designing, implementing, and updating batch ETL pipelines, defining and implementing data architecture, and collaborating with engineers and data analysts to create reliable datasets. You work with data orchestration tools such as Apache Airflow, Dagster, and Prefect. You thrive in a fast-paced start-up environment and are passionate about your work. While a background in the telecom industry is a plus, it is not a requirement. Your affinity for automation and monitoring sets you apart in your role.
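The advanced SQL the posting calls out (RANK, DENSE_RANK) refers to standard window functions. A minimal illustration using Python's built-in sqlite3 rather than PostgreSQL (the query text is identical in both); the table and data are invented for the example:

```python
import sqlite3

# DENSE_RANK over per-region revenue: ties share a rank and no rank
# numbers are skipped. Requires SQLite >= 3.25 for window functions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (rep TEXT, region TEXT, revenue INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("ana", "north", 900),
        ("bo", "north", 900),   # ties with ana -> same rank
        ("cy", "north", 500),
        ("di", "south", 700),
    ],
)

ranked = conn.execute(
    """
    SELECT rep,
           region,
           DENSE_RANK() OVER (
               PARTITION BY region ORDER BY revenue DESC
           ) AS rnk
    FROM sales
    ORDER BY region, rnk, rep
    """
).fetchall()
print(ranked)
# [('ana', 'north', 1), ('bo', 'north', 1), ('cy', 'north', 2), ('di', 'south', 1)]
```

With RANK() instead of DENSE_RANK(), cy would get rank 3 rather than 2, because RANK leaves gaps after ties.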
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
Maharashtra
On-site
As an experienced professional with 12-14 years of experience, your primary role will involve developing a detailed project plan encompassing tasks, timelines, milestones, and dependencies. You will be responsible for solution architecture design and implementation, understanding the source, and outlining the ADF (Azure Data Factory) structure. Your expertise will be crucial in designing and scheduling packages using ADF. Facilitating collaboration and communication within the team is essential to ensure a smooth workflow. You will also focus on application performance optimization and monitor resource allocation to ensure tasks are adequately staffed. It will be part of your responsibility to create detailed technical specifications, business requirements, and unit test report documents.

Your role will require you to ensure that the project complies with best practices, coding standards, and technical requirements. Collaboration with technical leads to address technical issues and mitigate risks will be a key aspect of your job. Your primary skill set should revolve around Data Architecture, with additional expertise in Data Modeling, ETL, Azure Log Analytics, Analytics Architecture, BI & Visualization Architecture, Data Engineering, Cost Management, Databricks, Datadog, Apache Spark, Azure Data Lake, and Azure Data Factory. Your proficiency in these areas will be instrumental in successfully executing your responsibilities.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. Additionally, you will lead a high-performing team, foster a collaborative and innovative culture, and ensure data integrity, consistency, and availability across the organization.

Our existing MDM solution is based on Microsoft Data Lake Gen2, Snowflake as the DWH, and Power BI, managing data from most of our core applications. You will manage the existing solution and drive further development to handle additional data and capabilities, as well as supporting our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architect:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Take part in developing and implementing the MDM and analytics strategy aligned with the overall team and organizational goals.
- Collaborate with the Enterprise Architect to align on the overall strategy and application landscape, ensuring that MDM and data analytics fit into the overall ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives and support.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Actively develop business cases and proposals for IT investments and present them to senior management, executives, and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management and stakeholders.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Management advisory skills, such as strategic thinking, problem-solving, business acumen, stakeholder management, and change management.
- Strong knowledge of master data management concepts, data governance, data technology, data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
- Team player; result-oriented, structured, attentive to detail, with a drive for accuracy and a strong work ethic.

**Special Competencies required:**
- Proven leader with excellent structural skills, good at documenting as well as presenting.
- Strong executional skills to make things happen: not just generating ideas, but getting things done that deliver value for the entire organization.
- Proven experience working with analytics tools as well as data ingestion platforms such as Power BI, Azure Data Lake, Snowflake, etc.
- Experience working with an MDM solution, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.

Join us at the ASSA ABLOY Group, where our innovations make physical and virtual spaces safer, more secure, and easier to access. As an employer, we value results and empower our people to build their careers around their aspirations and our ambitions. We foster diverse, inclusive teams and welcome different perspectives and experiences.
Posted 3 weeks ago
15.0 - 25.0 years
20 - 27 Lacs
Noida, Chandigarh, Hyderabad
Work from Office
Total experience of 15 years, with at least 5 years of experience in solution architecture. Should have designed and implemented enterprise-level, high-performance, secure, microservices-based systems on .NET/Java/Python. Experienced in defining architecture frameworks (APIs, integrations, cloud services, data pipelines) aligned with business goals. Hands-on experience with cloud platforms such as Azure, AWS, or GCP, including containerization technologies like Docker and Kubernetes. Strong understanding of data architecture, ETL processes, and analytics platforms. Should be able to architect cloud-native solutions using AWS, Azure, or GCP, leveraging containers (Docker/Kubernetes) and infrastructure-as-code (Terraform, ARM). Lead and mentor cross-functional teams, fostering a culture of innovation and continuous improvement. Stay current with emerging tech (AI/ML, RPA, generative AI, vector search) and advise on adoption. Collaborate with clients to understand their business needs and translate them into technical solutions that drive value. Experience in Agile software development methodologies.

Soft Skills: Excellent problem-solving abilities, communication skills, and a proactive approach to stakeholder management.

Certification: Relevant certifications such as Azure Solutions Architect, AWS Certified Solutions Architect, or TOGAF.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As the Senior Director - Enterprise Head of Architecture at AstraZeneca, you will play a crucial role in shaping the architecture landscape across the Enterprise Capabilities & Solutions (ECS) domain. Your responsibilities will include collaborating with other Heads of Architecture to align ECS architectures with AstraZeneca's business and IT strategies. You will be instrumental in developing and implementing AZ standards, patterns, and roadmaps to ensure architectural conformity within the ECS landscape. Additionally, you will work closely with key business teams, IT Leadership, and other Heads of Architecture to drive AstraZeneca's Digital and Architectural vision in line with business strategies. Your role will involve providing Enterprise Design thinking and support across the Enterprise, application, and infrastructure domains of architecture. You will define the architecture strategy, direction, and standards specific to the ECS Segment, with a focus on Data & Analytics architecture capabilities and strategy. Leading a team of Architects, you will oversee the end-to-end delivery of architecture solutions, develop function-specific reference architectures, and manage relationships with Data & Analytics leadership to influence the adoption of enterprise architecture frameworks. As a strategic advisor and authority within the architecture community, you will refine architecture strategies, standards, and patterns as needed, ensuring continuous alignment with business priorities. You will lead the development of Architecture Roadmaps and Blueprints, contribute to multi-functional decision-making bodies, and act as a sign-off authority for solution architectures and roadmaps. Championing ECS EA initiatives, you will engage with external architecture authorities, drive the adoption of new technology, and ensure regulatory compliance while fostering team engagement. 
To excel in this role, you should possess a Bachelor's degree in Computer Science or a related field, along with extensive experience in ECS solutions and a blend of data architecture, analysis, and engineering skills. Knowledge of cloud-based containerization strategies, data modeling technologies, and industry architectural patterns is essential. Desirable skills include a post-graduate degree in MIS, experience in Agile data definition scrums, and familiarity with metadata cataloguing tools and Cloud Economics. If you are ready to make a difference and contribute to redefining the development of life-changing medicines while embracing digital technology and data solutions, join us at AstraZeneca in our unique and daring world. Apply now to be a part of our journey towards becoming a digital and data-led enterprise.
Posted 3 weeks ago
10.0 - 15.0 years
15 - 19 Lacs
Pune
Work from Office
At Solidatus, we're changing how organisations understand their data. We're an award-winning, venture-backed software company often called the Git for Metadata. Our platform helps businesses harvest, model, and visualise complex data lineage flows. Our unique lineage-first approach, combined with active AI development, provides organisations with unparalleled clarity and robust control over their data's journey and meaning. As a growing B2B SaaS business with a great, collaborative culture, join us as we expand globally and define the future of data understanding! Role Overview We are seeking an experienced Data Architect to lead the design and implementation of data lineage solutions that align with our clients' business objectives. This role involves collaborating with cross-functional teams to ensure the integrity, accuracy, and timeliness of their data lineage solution. You will be working directly with our clients, helping them get the maximum value from our product and ensuring they achieve their contractual goals. Key Responsibilities Design and implement robust data lineage solutions that support business intelligence, analytics, and data governance initiatives. Collaborate with stakeholders to understand data lineage requirements and translate them into technical and business solutions. Develop and maintain lineage data models, semantic metadata systems, and data dictionaries. Ensure data quality, security, and compliance with relevant regulations. Understand Solidatus implementation and data lineage modelling best practices and ensure they are followed at our clients. Stay abreast of emerging technologies and industry trends to continuously improve data lineage architecture practices. Qualifications Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience in data architecture, with a focus on large-scale data systems, at more than one company.
Proficiency in data modelling, database design, and data warehousing concepts. Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark). Strong understanding of data governance, data quality, and data security principles. Excellent communication and interpersonal skills, with the ability to work effectively in a collaborative environment. Why Join Solidatus? Be part of an innovative company shaping the future of data management. Collaborate with a dynamic and talented team in a supportive work environment. Opportunities for professional growth and career advancement. Flexible working arrangements, including hybrid work options. Competitive compensation and benefits package.
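The lineage-first approach described above can be illustrated with a generic sketch (not Solidatus's actual data model, which this posting does not describe): lineage as a directed graph of data assets, where tracing upstream answers "where did this field come from?". The asset names are invented for the example.

```python
from collections import defaultdict

class LineageGraph:
    """Minimal directed graph of data assets; edges point source -> target."""

    def __init__(self):
        self.upstream = defaultdict(set)  # target -> set of direct sources

    def add_edge(self, source, target):
        self.upstream[target].add(source)

    def trace_upstream(self, asset):
        """Return every transitive source feeding `asset`."""
        seen, stack = set(), [asset]
        while stack:
            for src in self.upstream[stack.pop()]:
                if src not in seen:
                    seen.add(src)
                    stack.append(src)
        return seen

g = LineageGraph()
g.add_edge("crm.customers", "staging.customers")
g.add_edge("staging.customers", "warehouse.dim_customer")
g.add_edge("erp.accounts", "staging.customers")
print(sorted(g.trace_upstream("warehouse.dim_customer")))
```

The same structure, traversed in the opposite direction, supports impact analysis ("what breaks downstream if this source changes?"), which is the other half of most lineage tooling.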
Posted 3 weeks ago
3.0 - 6.0 years
16 - 18 Lacs
Pune
Work from Office
In this role, you will be part of the product development team to manage and deliver new product functionality, modify existing product functionality, or improve it as required. The Developer will work with the Product Manager and Engineering Manager, with minimal technical guidance from the software development team lead, on the design, development, and testing of software programs for various cloud ecosystems. You will work within a multi-disciplined engineering team consisting of electronics engineers, mechanical engineers, firmware engineers, software engineers, programmers, and scientists focusing on applied research and new technology innovations to provide new and improved products and IoT solutions for our customers in the Building Management System domain. How you will do it Provide third-level support to branch technicians and engineers. Maintain released products and data pipelines. Liaise with other departments including Product Support, Technical Authors, and SQA. Design software code, technical specifications, and feasibility studies. Participate in analysis, coding, and unit testing. Identify, analyze, and resolve complex cloud IoT defects. Review and provide feedback on product functional specifications. Assist with compliance, approvals, and factory testing as needed. Participate in product development meetings, design reviews, and code reviews. Prepare documentation as per ISO QMS guidelines and participate in Quality Management System reviews. Make recommendations for changes to product development guidelines and standards. Comply with established development guidelines and standards. Develop an in-depth understanding of the development realm through interaction with other groups, communication with external experts and suppliers, and independent research. Work on estimation, design, analysis, coding, and unit testing. What we look for 3-6 years of relevant data pipeline design, development, and testing experience.
Product development experience preferred. Working knowledge of building automation and industrial automation systems will be an added advantage. Skills & Experience: 3+ years in Big Data development. Technologies: Proficiency with Snowflake, Postgres, Apache Spark, KSQL, open table formats, and Flink. Data Retention: Knowledge of hot and cold storage solutions. Building Management Systems: Experience in integrating data-driven insights. Collaboration: Ability to work with cross-functional teams. Data Governance: Strong understanding of data governance practices. Responsibilities: Designing data management frameworks - develop and implement data strategies, create data models, and manage data warehouses. Ensuring data security and compliance - implement access controls, encryption, and other security measures. Implementing data management processes - oversee data systems health, define KPIs, and recommend system enhancements. Building data models and strategies - construct data models and devise strategies for data management. Collaborating across teams - work with stakeholders to ensure data architecture meets organizational needs. Research and development - stay updated on data management trends and explore new tools.
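As a hypothetical illustration of the hot and cold storage knowledge mentioned above (the 30-day threshold and tier names are invented for the example, not taken from the posting), a retention policy can be reduced to a single routing decision per record:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: records newer than 30 days stay in hot
# storage; older records are routed to cold storage.
HOT_WINDOW = timedelta(days=30)

def storage_tier(record_ts, now=None):
    """Decide which storage tier a record belongs to, by age."""
    now = now or datetime.now(timezone.utc)
    return "hot" if now - record_ts <= HOT_WINDOW else "cold"

now = datetime(2024, 6, 30, tzinfo=timezone.utc)
print(storage_tier(datetime(2024, 6, 25, tzinfo=timezone.utc), now))  # hot
print(storage_tier(datetime(2024, 1, 1, tzinfo=timezone.utc), now))   # cold
```

In a real pipeline this decision would drive where the write lands (e.g., a fast table versus object storage), but the tiering rule itself stays this simple.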
Posted 3 weeks ago
2.0 - 7.0 years
6 - 11 Lacs
Noida, Mohali, Bengaluru
Work from Office
Database Engineer Mohali, Noida, Bangalore / India 2+ years experience 1 Position Job Information: Work Experience: 2+ years Industry: IT Services Job Type: FULL TIME Location: Mohali, Noida, Bangalore / India Role Overview: We are seeking a proactive and eager-to-learn NoSQL DBA and Data Warehouse Specialist with a focus on the Amazon Web Services (AWS) ecosystem. In this role, you will support the management of cloud-native databases and assist in building and maintaining the customer's data warehouse infrastructure on AWS. This position offers a hands-on opportunity to grow in the fields of cloud database administration and modern data architecture. Key Responsibilities: AWS NoSQL Database: Assist in maintaining AWS-managed NoSQL databases such as Amazon DynamoDB, Amazon ElastiCache (Redis/Memcached), and Amazon DocumentDB. Monitor database metrics using Amazon CloudWatch and basic performance tools. Help manage database backup, recovery, and security configurations using AWS Backup and IAM roles/policies. Collaborate with DevOps teams to support database deployments via AWS CloudFormation or Terraform. AWS Data Warehouse: Support the implementation and maintenance of data warehouse solutions using Redshift and Databricks. Help build and monitor ETL/ELT pipelines using AWS Glue, Amazon S3, Lambda, and Step Functions. Assist in data loading, cleaning, validation, and documentation. Work with BI and analytics teams to provide clean, query-ready datasets for reporting. Requirements: Technical Skills: Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience). Familiarity with AWS services, especially DynamoDB, S3, Lambda, and Redshift. Basic understanding of SQL and NoSQL database concepts. Exposure to scripting languages such as Python or Bash. Understanding of core AWS concepts: IAM, VPC, CloudWatch, and S3. Soft Skills: Strong attention to detail and a passion for learning cloud technologies.
Good problem-solving and troubleshooting skills. Ability to take direction, collaborate, and communicate effectively. Comfortable working in a fast-paced, team-oriented environment. Preferred Skills: AWS Certified Cloud Practitioner or working towards AWS Developer/Database certifications. Experience with AWS Glue or Step Functions. Internship or academic project using AWS services or cloud databases. Exposure to data visualization tools like QuickSight, Tableau, or Power BI. Interview Process: Internal Assessment, Technical Round 1, Technical Round 2.
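The "data loading, cleaning, validation" responsibility above might look like the following minimal sketch; the field names and rules are hypothetical examples, not part of the role description:

```python
def clean_record(raw):
    """Validate and normalise one raw item before loading to the warehouse.

    Returns the cleaned record, or None if it fails validation.
    Field names and rules are hypothetical.
    """
    required = ("id", "email")
    if any(not raw.get(k) for k in required):
        return None  # reject incomplete records before they reach the DW
    return {
        "id": str(raw["id"]).strip(),
        "email": raw["email"].strip().lower(),
        "amount": round(float(raw.get("amount", 0)), 2),
    }

print(clean_record({"id": " 42 ", "email": "A@B.COM ", "amount": "19.999"}))
print(clean_record({"email": "x@y.z"}))  # missing id -> None
```

In an AWS pipeline, a function of this shape typically runs inside a Lambda or Glue job between the raw S3 landing zone and the query-ready dataset.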
Posted 3 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Thiruvananthapuram
Work from Office
Collaborate with business stakeholders to gather and translate data requirements into analytical solutions. Analyze large and complex datasets to identify trends, patterns, and actionable insights. Design, develop, and maintain interactive dashboards and reports using Elasticsearch/Kibana or Power BI. Conduct ad-hoc analyses and deliver data-driven narratives to support business decision-making. Ensure data accuracy, consistency, and integrity through rigorous validation and quality checks. Write and optimize SQL queries, views, and data models for reporting and analysis. Present findings through compelling visualizations, presentations, and written summaries. Work closely with data engineers and architects to enhance data pipelines and infrastructure. Contribute to the development and standardization of KPIs, metrics, and data governance practices. Required Skills (Technical Competency): Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field. 5+ years of experience in a data analyst or business intelligence role. Proficiency in SQL and data visualization tools such as Power BI, Kibana, or similar. Proficiency in Python, Excel, and data storytelling. Understanding of data modelling, ETL concepts, and basic data architecture. Strong analytical thinking and problem-solving skills. Excellent communication and stakeholder management skills. Adherence to the Information Security Management policies and procedures. Desired Skills: Elasticsearch/Kibana, Power BI, AWS, Python, SQL, Data modelling, Data analysis, Data quality checks, Data validation, Data visualization, Stakeholder communication, Excel, Data storytelling, Team collaboration, Problem-solving, Analytical thinking, Presentation skills, ETL concepts.
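The validation and quality-check duties above could be sketched as a small completeness/range report; the column name and bounds here are illustrative assumptions, not from the posting:

```python
def quality_report(rows, column, lower, upper):
    """Summarise completeness and range validity for one numeric column."""
    total = len(rows)
    nulls = sum(1 for r in rows if r.get(column) is None)
    out_of_range = sum(
        1 for r in rows
        if r.get(column) is not None and not (lower <= r[column] <= upper)
    )
    return {
        "completeness": (total - nulls) / total if total else 0.0,
        "in_range": (total - nulls - out_of_range) / total if total else 0.0,
    }

rows = [{"age": 34}, {"age": None}, {"age": 150}, {"age": 28}]
print(quality_report(rows, "age", 0, 120))
```

A report like this is usually computed per load and surfaced on the same dashboards (Kibana, Power BI) the role already maintains, so data quality becomes one more monitored metric.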
Posted 3 weeks ago
2.0 - 5.0 years
3 - 7 Lacs
Tirodi, Mumbai
Work from Office
Job Description Overview - At Shaadi, we always put our users first. We start by looking at things from the user's perspective and end by evaluating how the solution has impacted the user. We are looking for people who are continuously adapting to new technologies and excited to work on products that influence millions of people every day. The Shaadi.com Android and iOS mobile applications are used by millions of people around the world and are some of India's best known and most loved applications, and we're looking for someone to lead the engineering teams that build these apps. Role - We are looking for a Software Engineer (iOS). This is a front-end role, but not limited to it. You will be learning a lot about core iOS development along with other mobile technologies too. Also, we believe in extreme ownership! And to be honest, everyone loves working with kind and smart people. We are building a kick-ass team with humble and empathetic talent. What you will do in this role Write performing code with End-2-End tests following TDD methodology. Ship projects continuously and on time. Build well-suited design patterns. Work on SwiftUI, Extensions & Widgets. Animations & motion effects to deliver greater UX. In-App Subscription services. Understand the specifications from product, design, and QA - draft a solution followed by a team discussion on feasibility, architecture, design, etc. before implementation. What you should have 2 to 5 years of development experience of consumer products with hands-on experience in designing, developing and testing applications. Experience in Swift, Auto Layouts, TDD and willingness to learn more. Well versed with Core Data, Architecture & Design Patterns, Data Structures and Algorithms, etc. Passion for finding and sharing best practices and driving discipline for superior code quality. Working knowledge of Xcode & code signing. BE (Comp/IT), ME (Comp/IT), MCA, M.Tech, B.Tech
Posted 3 weeks ago
8.0 - 15.0 years
14 - 19 Lacs
Pune
Work from Office
Ensure compliance with architectural principles and development standards. Ensure solution designs address performance, availability, security, and supportability challenges, as well as business functional requirements. Work with colleagues from partner teams globally to translate business and technical requirements into solutions. Ensure DevSecOps automation strategies for all solutions. Ensure successful delivery of solutions into the Production environment. Provide support for live IT services. Carry out activities that are large in scope, cross-functional, and technically difficult. Take an active role in the mentoring and development of more junior resources. Drive engineering excellence through non-functional aspects. Develop data integration interfaces, APIs, and microservices. Develop test automation suites for API testing. Design detailed solutions based on the tech stack detailed below. Work with tech stakeholders across multiple systems and regions. Requirements 8 to 15 years of strong experience in the technical stack: Java, microservices, APIs, and advanced SQL for data analytics. Experience with GCP BigQuery, Pub/Sub, and monitoring tools. Good to have experience in Airflow implementation. Proven experience in IT service delivery using Agile methodology along with an automation testing framework. Experience in the finance domain, preferably in Custody or Asset Servicing. Strong interpersonal capabilities and a team player. Excellent communication in both written and verbal English, conflict management, and problem-solving skills. Experience of architecture, change, and operational aspects of technology. Proven ability to work across regions whilst maintaining a global perspective. Strong understanding of technology and IT application, up to date with the latest technology trends and ideas in the wider market.
Exposure to Java / data is an advantage. Can understand, build, and present business cases and technical solutions/designs to senior stakeholders, business sponsors, or clients. Sector functional requirements: Custody and Asset Servicing. Significant experience in the Custody and/or Asset Servicing domain, preferably on BaNCS or similar vendor products. Preferably experience of Securities Services, Custody, or Securities Operations Technology in a bank. Proven implementation of client journeys and design thinking. Used to translating stakeholder aspirations into technical design. Familiarity with Financial Markets and related asset classes will be an advantage. Understanding and awareness of appropriate corporate and regulatory policies. Understanding and awareness of cutting-edge technologies, including AI/ML. Sector nice to have: Experience with large-scale data architecture across multiple or hybrid cloud platforms and 3rd-party applications. Hands-on experience with Kubernetes (ConfigMap / Secrets / HashiCorp Vault / Helm charts). Experience in Kafka (including Kafka Avro and the concept of partitioning). Experience in databases: Oracle, NoSQL (MongoDB, Postgres document model). Commercial acumen and good risk management and mitigation skills. Positive, proactive, can-do attitude. Managing change and/or technology in a global investment banking environment will be an advantage. Experience of integrating vendor platforms on complex business lines/functions will be an advantage.
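One concept behind the Kafka partitioning experience listed above can be sketched generically. Kafka's default partitioner hashes the message key (with murmur2) to pick a partition; the version below substitutes a stdlib hash purely for illustration, keeping the key property that equal keys always map to the same partition:

```python
import hashlib

def partition_for(key, num_partitions):
    """Map a message key to a partition. Simplified stand-in for Kafka's
    default key-hash partitioner (real Kafka uses murmur2, not MD5)."""
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Same key always lands on the same partition, preserving per-key ordering,
# e.g. all events for one account stay in sequence.
p1 = partition_for("account-123", 6)
p2 = partition_for("account-123", 6)
print(p1 == p2, 0 <= p1 < 6)  # True True
```

This per-key ordering guarantee is why the choice of message key matters so much in custody/settlement flows: keying by account or trade ID keeps each entity's events ordered without any global coordination.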
Posted 3 weeks ago
12.0 - 16.0 years
39 - 45 Lacs
Bengaluru
Work from Office
Seeking a Solution Architect (Data Engineering) to lead the design and implementation of scalable, robust, and secure data solutions. You will play a pivotal role in defining architecture standards, designing modern data platforms, and executing data strategies.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The primary responsibility of this role is to evolve the WCS IT landscape using a data-driven approach. You will be tasked with defining, designing, and driving the data architecture roadmap for WCS IT, which includes enterprise data architecture. Collaborating closely with Product Owners, Business & IT stakeholders across the WCS IT landscape will be essential to understand requirements and provide solutions tailored to meet the needs. You will also be responsible for creating DFDs and data lineage diagrams to comprehend the current state and work with stakeholders to align with the future roadmap while adhering to HSBC defined standards. Understanding system integrations within and outside WCS IT applications, API integration, and various data formats like JSON, XML, etc., will be a crucial part of the role. Analyzing business requirements related to data flows across applications/systems and collaborating with Product owners and POD teams for development aligned with data architecture and designs will also be a key aspect. Additionally, designing and developing tools for automation and scripting for data extraction, transformation, and loading as per business requirements will be part of your responsibilities. Collaborating with Program, Department, and Enterprise architects of HSBC to drive data architecture and deliver expected business outcomes by designing, developing, and implementing solutions is vital. Following the DevOps model in day-to-day delivery and fostering innovation within the team by encouraging the development of new ideas, PoCs, exploring new technologies, participating in hackathons, and attending forums and sessions will be encouraged. Ensuring adherence to best practices, guidelines, and data security standards in the BFS industry is crucial. The ideal candidate for this role must have a vast experience in data architecture, defining and driving enterprise data architecture. 
Proficiency in understanding microservices architecture, REST APIs, reading and parsing JSON, XML, and other data structures, as well as developing an understanding of data attributes and values between systems is required. Experience with data modeling, DFDs, data lineage tools like Visio, and reverse engineering tools is essential. Previous experience working in the Banking and Financial Services industry, specifically with an MNC bank of the size of HSBC, is mandatory. Moreover, the candidate must possess experience in analyzing large datasets, JSONs, XMLs, and other formats, and writing scripts for data extraction, transformation, and loading using existing tools or self-developed automation tools. Working experience with databases like PostgreSQL, Oracle, MongoDB, and others is necessary. Familiarity with Agile and DevSecOps methodologies is a must. An inclination to explore new technologies, innovate beyond project work, and excellent communication skills, both written and verbal, are prerequisites for this role. About the Company: Purview is a leading Digital Cloud & Data Engineering company headquartered in Edinburgh, United Kingdom, with a presence in 14 countries, including India, Poland, Germany, Finland, Netherlands, Ireland, USA, UAE, Oman, Singapore, Hong Kong, Malaysia, and Australia. The company has a strong presence in the UK, Europe, and APEC regions, providing services to Captive Clients such as HSBC, NatWest, Northern Trust, IDFC First Bank, Nordia Bank, among others. Purview also supports various top-tier IT organizations to deliver solutions and workforce/resources. Company Info: 3rd Floor, Sonthalia Mind Space, Near Westin Hotel, Gafoor Nagar, Hitechcity, Hyderabad. Phone: +91 40 48549120 / +91 8790177967. Gyleview House, 3 Redheughs Rigg, South Gyle, Edinburgh, EH12 9DQ. Phone: +44 7590230910. Email: careers@purviewservices.com. Login to Apply!
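The JSON parsing and data-attribute analysis mentioned above often starts with flattening documents into dotted attribute paths, a common first step when comparing attributes between systems. The traversal below is a generic sketch, not an HSBC or Purview tool, and the sample document is invented:

```python
import json

def attribute_paths(obj, prefix=""):
    """Flatten a nested JSON document into dotted attribute paths,
    using [] to mark array elements."""
    paths = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            paths.extend(attribute_paths(v, f"{prefix}.{k}" if prefix else k))
    elif isinstance(obj, list):
        for item in obj:
            paths.extend(attribute_paths(item, prefix + "[]"))
    else:
        paths.append(prefix)  # leaf value: record its full path
    return paths

doc = json.loads('{"customer": {"id": 1, "addresses": [{"city": "Pune"}]}}')
print(sorted(set(attribute_paths(doc))))
```

Diffing the path sets produced for two systems' payloads gives a quick, schema-free view of which attributes exist on each side, which feeds directly into DFDs and lineage diagrams.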
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a Lead Data Engineer at EY, you will play a crucial role in leading large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers who collaborate with the visualization team on data quality and troubleshooting needs. Your key responsibilities will include implementing data processes for the data warehouse and internal systems, leading a team of junior and senior data engineers in executing data processes, managing data architecture, designing ETL processes, and cleaning, aggregating, and organizing data from various sources and transferring it to data warehouses. You will be responsible for leading the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets, create, maintain, and support the data platform and infrastructure, and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling. To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or a related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development.
You should possess expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architecture and frameworks like data mesh, data fabric, and data product design. Other essential skillsets include proficiency in data engineering programming languages such as Python, distributed data technologies like PySpark, cloud platforms and tools like Kubernetes and AWS services, relational SQL databases, DevOps, continuous integration, and more. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities. Desired skillsets for this role include a Master's degree in Engineering, Computer Science, Data Science, or a related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally. Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
lucknow, uttar pradesh
On-site
About Agoda Agoda is an online travel booking platform that offers accommodations, flights, and more to travelers worldwide. With a global network of 4.7M hotels and holiday properties, as well as flights, activities, and more, we are dedicated to connecting travelers with seamless travel experiences. As part of Booking Holdings and based in Asia, our team of 7,100+ employees from 95+ nationalities across 27 markets creates a work environment that thrives on diversity, creativity, and collaboration. At Agoda, we foster a culture of innovation through experimentation and ownership, allowing our customers to explore and enjoy the world. Our Purpose: Bridging the World Through Travel We believe that travel enriches lives by providing opportunities to learn, experience, and appreciate the beauty of our world. By bringing people and cultures closer together, travel promotes empathy, understanding, and happiness. The Data Team at Agoda The Data department at Agoda is responsible for overseeing all data-related requirements within the company. Our primary objective is to enhance the utilization of data through innovative approaches and the implementation of robust resources such as operational and analytical databases, queue systems, BI tools, and data science technology. We recruit talented individuals from diverse backgrounds globally to tackle this challenge, providing them with the necessary knowledge and tools for personal growth and success while upholding our company's values of diversity and experimentation. The Data team at Agoda plays a crucial role in supporting business users, product managers, engineers, and others in their decision-making processes. We are committed to improving the search experience for our customers by delivering faster results and ensuring protection against fraudulent activities. The abundance of data available to us presents both a challenge and a reward, driving our passion for excellence within the Data department. 
The Opportunity As a senior data pipeline engineer at Agoda, you will be working on distributed systems that span multiple data centers, thousands of servers, and process hundreds of billions of messages daily. Ensuring data quality, integrity, and accuracy is fundamental to our operations. You will be involved in designing scalable systems to handle the increasing volume of data, including auditing and monitoring functionalities. This role provides you with the opportunity to lead projects with a small team, enhancing your ownership and leadership skills. You will tackle complex problems related to managing and interpreting large datasets, such as schema registry, real-time data ingestion, cross-data center replication, data enrichment, storage, and analytics. In This Role, You'll Get to - Build, administer, and scale data pipelines processing hundreds of billions of messages daily across multiple data centers - Develop and enhance existing frameworks used by teams throughout Agoda to contribute messages to the data pipeline - Manage data ingestion into various systems (Hadoop, ElasticSearch, other Distributed Systems) - Create tools to monitor high data accuracy SLAs for the data pipeline - Explore new technologies to improve data quality, processes, and flow - Develop high-quality software through design reviews, code reviews, and test-driven development What You'll Need To Succeed - Bachelor's degree in Computer Science, Information Systems, Computer Engineering, or a related field - 8+ years of industry experience, preferably in a tech company - Strong knowledge of data architecture principles - Experience in debugging production issues - Proficient in coding and building purpose-driven, scalable, well-tested, and maintainable systems - Detail-oriented with a focus on considering all outcomes of decisions - Excellent communication skills in technical English, both verbally and in writing - Proficiency in multiple programming languages (e.g., Golang, Java, Scala, Python, C#) - Good understanding of Kafka and experience as a Kafka Administrator - Experience with data ingestion from Kafka into Hadoop, ElasticSearch, and other Distributed Systems - Strong systems administration skills in Linux - Previous involvement in or contribution to Open Source Projects
Equal Opportunity Employer Agoda is an equal opportunity employer. We value diversity and welcome applications from individuals with a variety of backgrounds and experiences. We will retain your application for future vacancies and allow you to request the removal of your details if desired. For more information, please refer to our privacy policy. Note: Agoda does not accept third-party resumes. Kindly refrain from sending resumes to our jobs alias, Agoda employees, or any other organizational location. Agoda will not be liable for any fees associated with unsolicited resumes.
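A data-accuracy SLA monitor of the kind described above could, in its simplest form, reconcile message counts between a pipeline's source and sink; the 99.9% threshold below is an invented example, not Agoda's actual SLA:

```python
def accuracy_sla(source_count, sink_count, sla=0.999):
    """Compare message counts between pipeline source and sink and flag
    an SLA breach. The threshold is an illustrative figure."""
    if source_count == 0:
        return {"accuracy": 1.0, "breach": False}  # nothing to lose
    accuracy = min(sink_count, source_count) / source_count
    return {"accuracy": accuracy, "breach": accuracy < sla}

print(accuracy_sla(1_000_000, 999_950))  # tiny loss, within SLA
print(accuracy_sla(1_000_000, 990_000))  # 1% loss, breach
```

Production monitors extend this idea with windowing (counts per topic per time bucket) and alerting, but count reconciliation between hop boundaries remains the core check.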
Posted 3 weeks ago
15.0 - 21.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include:
- Designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP.
- Leading the definition of data architecture standards, guidelines, and best practices.
- Architecting scalable data solutions such as data lakes, data warehouses, and real-time streaming platforms.
- Collaborating with data engineers, analysts, and data scientists to ensure optimal solutions are delivered based on data requirements.
- Overseeing data modeling activities encompassing conceptual, logical, and physical data models.
- Ensuring data security, privacy, and compliance with relevant regulations such as GDPR and HIPAA.
- Defining and implementing data governance strategies alongside stakeholders, and evaluating data-related tools and technologies.

To excel in this position, you should possess:
- At least 15 years of experience in data architecture, data engineering, or database development.
- Strong experience architecting data solutions on major cloud platforms such as AWS, Azure, or GCP.
- Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms and tools such as Snowflake, Databricks, and Apache Spark.
- Familiarity with programming languages such as Python, SQL, or Java, and with real-time data processing frameworks such as Kafka, Kinesis, or Azure Event Hubs.
- Experience implementing data governance, data cataloging, and data quality frameworks.
- Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus.
- Excellent problem-solving, communication, and stakeholder management skills.
- A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications such as Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment. Experience competitive compensation and benefits while contributing to experience-led transformation for our clients in various industries.
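As an illustration of the ETL/ELT pipeline skills this posting references, here is a minimal, hypothetical Python sketch of a transform step that aggregates raw events before loading. All names and data are invented for illustration; a real pipeline for this role would read from a source such as Kafka or S3 and load into a warehouse such as Snowflake.

```python
# Hypothetical sketch of the "T" in an ELT pipeline: aggregating raw order
# events into per-region totals ready to load into a reporting table.
# OrderEvent, transform, and the sample data are all invented for this example.
from dataclasses import dataclass


@dataclass
class OrderEvent:
    order_id: str
    region: str
    amount: float


def transform(events):
    """Aggregate raw order events into per-region revenue totals."""
    totals = {}
    for e in events:
        totals[e.region] = totals.get(e.region, 0.0) + e.amount
    return totals


raw = [
    OrderEvent("o1", "EMEA", 120.0),
    OrderEvent("o2", "APAC", 80.0),
    OrderEvent("o3", "EMEA", 50.0),
]
print(transform(raw))  # per-region aggregates, e.g. keyed by "EMEA" and "APAC"
```

In a production setting the same shape of logic would typically run inside an orchestrated task (Airflow, dbt, or Spark), with the load step writing the aggregates to the target warehouse.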
Posted 3 weeks ago
0.0 - 4.0 years
0 Lacs
Haryana
On-site
Within the Financial Services (FSO) division of Ernst & Young, you will have the unique opportunity to be part of a professional services organization dedicated exclusively to the financial services marketplace. Joining our multi-disciplinary teams from around the world, you will play a crucial role in delivering a global perspective. Aligned with key industry groups such as asset management, banking and capital markets, insurance, and private equity, we offer integrated advisory, assurance, tax, and transaction services.

Through diverse experiences, world-class learning opportunities, and individually tailored coaching, you will undergo continuous professional development. Our focus is on developing exceptional leaders who collaborate effectively to fulfill our commitments to all stakeholders, thereby contributing significantly to building a better working world for our people, clients, and communities. Excited to be a part of this journey? This is just the beginning, as the exceptional EY experience will stay with you for a lifetime.

As a future FSO Technology Consultant at EY, you will be part of a team that helps clients navigate complex industry challenges and leverage technology to enhance business operations. Your role will involve addressing business and strategic challenges such as business and solution architecture, digital transformation, project management, and design of digital operating models. Additionally, you will work on technical matters including data science, advanced analytics, IoT, data governance, blockchain, artificial intelligence, and robotic process automation. Joining our team means working on critical projects within the financial services landscape, with opportunities to transition between teams as both you and our dynamic business continue to grow and evolve. Your contributions will be instrumental in propelling EY to new heights.

We are currently seeking individuals for the following positions:
- Cybersecurity
- Digital
- Platform
- Data & Analytics

To qualify for a role in our team, you must have:
- A Bachelor's or Master's degree in (Business) Engineering, Computer Science, Information Systems Management, Mathematics, (Applied) Economics, or a related field, with an interest in cutting-edge technologies.
- Strong analytical skills.
- Knowledge of project management methodologies, including agile, traditional, and hybrid approaches.
- Proficiency in English at an advanced level.
- Experience in team leadership.
- Exceptional oral and written communication abilities.

If you believe you meet the above criteria, we encourage you to apply at your earliest convenience. The exceptional EY experience awaits you, ready for you to shape and build upon.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Lead Software Engineer at JPMorgan Chase within Asset and Wealth Management, you play a crucial role in an agile team dedicated to enhancing, building, and delivering cutting-edge technology products in a secure, stable, and scalable manner. Your primary responsibility lies in developing innovative technology solutions across various technical domains to support the firm's business objectives effectively.

Your key responsibilities include creating, managing, and updating accurate Architecture Current State, Target State, and Target State Roadmaps for your application portfolio. You are expected to leverage your expertise as a business domain expert to align technical capabilities with the business strategy, ensuring the realization of desired business outcomes. Additionally, you will collaborate with product owners and application teams to establish and maintain business process flows for the portfolio. You will also take ownership of data domains, data products, and data models in coordination with product owners, data owners, and application teams.

Furthermore, you will actively participate in data and domain architecture governance bodies, evaluate new technologies, and provide valuable feedback. Your role involves devising creative data architecture solutions, conducting design and development activities, and troubleshooting technical issues with a forward-thinking mindset. You will identify opportunities for process automation to enhance the operational stability of software applications and systems. Moreover, you will lead evaluation sessions with external vendors, startups, and internal teams to assess data architectural designs and their applicability within the existing systems and information architecture. You will also spearhead data architecture communities of practice to promote the adoption of modern data architecture technologies.

To excel in this role, you must possess formal training or certification in software engineering concepts along with at least 5 years of practical experience. Ideal candidates will have prior experience in Wealth Management technology, encompassing Wealth Planning & Advice, Investing, Lending, and Banking, with proficiency across various asset classes such as Fixed Income, Equities, and Alternatives. A degree in Computer Science, Engineering, or a related field is preferred. Your skillset should include a strong command of software development methodologies, architecture frameworks, design patterns, testing practices, and operational stability. Effective leadership, communication, and problem-solving capabilities are essential, as is the ability to establish robust engineering communities and guilds. Demonstrated experience in influencing cross-functional teams to deliver modern architecture solutions is highly valued.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
As a Lead Platform Engineer at our esteemed organization, you will play a pivotal role in designing and constructing cloud-based distributed systems that tackle intricate business dilemmas for some of the largest companies globally. Leveraging your profound expertise in software engineering, cloud engineering, and DevOps, you will be instrumental in crafting technology stacks and platform components that empower cross-functional AI Engineering teams to develop robust, observable, and scalable solutions. Being part of a diverse and globally dispersed engineering team, you will engage in the complete engineering lifecycle, encompassing the design, development, optimization, and deployment of solutions and infrastructure on a scale that caters to the needs of the world's leading corporations.

Your core responsibilities will revolve around:
- Crafting cloud solution and distributed systems architecture for full-stack AI software and data solutions
- Implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions, including CI/CD, data integrations, APIs, web and mobile apps, and AI solutions
- Defining and executing scalable, observable, manageable, and self-healing cloud-based solutions across AWS, Google Cloud, and Azure
- Collaborating with diverse teams, including product managers, data scientists, and fellow engineers, to define and implement analytics and AI features that align with business requirements and user needs
- Harnessing Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability
- Developing and maintaining APIs and microservices to expose analytics functionality to both internal and external consumers, adhering to best practices for API design and documentation
- Implementing robust security protocols to safeguard sensitive data and uphold compliance with data privacy regulations and organizational policies
- Continuously monitoring and troubleshooting application performance, identifying and resolving issues that impact system reliability, latency, and user experience
- Participating in code reviews and contributing to the establishment and enforcement of coding standards and best practices to ensure high-quality, maintainable code
- Staying abreast of emerging trends and technologies in cloud computing, data analytics, and software engineering, and proactively identifying opportunities to enhance the capabilities of the analytics platform
- Collaborating closely with and influencing business consulting staff and leaders as part of multi-disciplinary teams to assess opportunities and develop analytics solutions for Bain clients across various sectors
- Influencing, educating, and directly supporting the analytics application engineering capabilities of our clients

To be successful in this role, you should possess:
- A Master's degree in Computer Science, Engineering, or a related technical field
- 6+ years of experience, with at least 3 years at the Staff level or equivalent
- Proven expertise as a cloud engineer and software engineer in product engineering or professional services organizations
- Experience designing and delivering cloud-based distributed solutions; GCP, AWS, or Azure certifications are advantageous
- Deep familiarity with the nuances of the software development lifecycle
- Proficiency in one or more configuration management tools such as Ansible, Salt, Puppet, or Chef
- Proficiency in one or more monitoring and analytics platforms such as Grafana, Prometheus, Splunk, Sumo Logic, New Relic, Datadog, CloudWatch, or Nagios/Icinga
- Experience with CI/CD deployment pipelines (e.g., GitHub Actions, Jenkins, Travis CI, GitLab CI, CircleCI)
- Experience building backend APIs, services, and/or integrations using Python
- Practical experience with Kubernetes through services such as GKE, EKS, or AKS is beneficial
- Ability to collaborate effectively with internal and client teams and stakeholders
- Proficiency in Git for versioning and collaboration
- Exposure to LLMs, prompt engineering, and LangChain is a plus
- Experience with workflow orchestration tools such as dbt, Beam, Airflow, Luigi, Metaflow, or Kubeflow
- Experience implementing large-scale structured or unstructured databases, orchestration, and container technologies such as Docker or Kubernetes
- Strong interpersonal and communication skills, enabling you to explain and discuss complex engineering technicalities with colleagues and clients from different disciplines at their level of understanding
- Curiosity, proactivity, and critical thinking
- Sound knowledge of computer science fundamentals in data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the implications of computer architecture on software performance
- Strong understanding of designing API interfaces
- Knowledge of data architecture, database schema design, and database scalability
- Familiarity with Agile development methodologies

Bain & Company is a global consultancy dedicated to assisting the world's most ambitious change-makers in shaping the future, operating across 65 cities in 40 countries. Collaborating closely with our clients, we work as one team with a shared objective of achieving exceptional results, outperforming competitors, and redefining industries. Our commitment to investing more than $1 billion in pro bono services over the next decade reflects our dedication to supporting organizations addressing pressing challenges in education, racial equity, social justice, economic development, and the environment. With a platinum rating from EcoVadis, a prominent platform for environmental, social, and ethical performance ratings for global supply chains, we stand in the top 1% of all companies. Since our inception in 1973, we have gauged our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.
Posted 3 weeks ago