
1769 Data Architecture Jobs - Page 3

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 10.0 years

6 - 10 Lacs

Chennai

Remote

We are looking for a highly skilled Senior SQL Developer with strong ETL development experience and a solid background in data analysis. The ideal candidate will play a key role in designing and optimizing data pipelines, developing robust SQL queries, and transforming complex data sets into meaningful business insights. This position requires a combination of technical expertise, problem-solving skills, and a strategic mindset to support data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and optimize complex SQL queries, stored procedures, functions, and views for data extraction and reporting.
- Develop and maintain scalable ETL pipelines using tools such as Informatica, Talend, or custom scripts (Python, etc.).
- Collaborate with data architects, business analysts, and stakeholders to understand business requirements and deliver reliable data solutions.
- Analyze large datasets to uncover trends, identify anomalies, and support advanced analytics and reporting initiatives.
- Ensure data quality and integrity by performing thorough data validation and error handling.
- Monitor and optimize the performance of SQL queries and ETL workflows.
- Participate in database design, modeling, and data warehouse architecture improvements.
- Document data flows, data models, and technical specifications.
- Mentor junior developers and contribute to code reviews and best practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience in SQL development and ETL processes.
- Proficiency in writing complex T-SQL (or PL/SQL) queries and performance tuning.
- Hands-on experience with ETL tools such as Informatica, Talend, or similar.
- Strong experience working with relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL).
- Analytical mindset with experience in translating business requirements into data solutions.
- Experience with data warehousing concepts and dimensional data modeling.
- Proficiency in data visualization and reporting tools such as Power BI or Tableau.
- Solid understanding of data governance, security, and compliance standards.

Preferred:
- Experience with cloud-based data platforms (Azure Data Factory, AWS Glue, Google Cloud Dataflow).
- Knowledge of scripting languages like Python or shell scripting.
- Experience with Agile or DevOps methodologies.
- Strong understanding of business domains such as finance, healthcare, or e-commerce (if industry-specific).

Work Environment:
- Remote work flexibility.
- Cross-functional team collaboration with data engineers, BI analysts, and business teams.

Opportunities to work on enterprise-level data projects and emerging technologies. Please share your resume to srividyap@hexalytics.com.
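The "data validation and error handling" responsibility described above can be illustrated with a minimal sketch in Python, using the standard library's sqlite3 in place of the named ETL tools (Informatica/Talend); the table names, columns, and validation rule are purely illustrative:

```python
import sqlite3

# Illustrative extract-transform-load flow with a validation step.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 100.0), (2, None), (3, 250.5)])

# Extract
rows = conn.execute("SELECT id, amount FROM raw_orders").fetchall()

# Transform + validate: reject rows that fail a NOT NULL check,
# so bad records never reach the reporting table.
clean = [(rid, amt) for rid, amt in rows if amt is not None]

# Load
conn.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # 350.5
```

In a production pipeline the rejected rows would typically be routed to an error table for review rather than silently dropped.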

Posted 4 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune, Gurugram

Work from Office

What you'll do:
- Lead the technical team (Developers, Senior Engineers, etc.) to solve business problems, and lead end-to-end technical delivery in collaboration with the Project Manager.
- Take ownership to ensure the proposed design/architecture and deliverables meet client expectations and solve the business problem with a high degree of quality.
- Translate complex client expectations and business problems into technical requirements.
- Own end-to-end responsibility to lead projects across all phases, from discovery/POC through build, SIT, and UAT.
- Lead one or more projects at a time, based on the role and time commitment required for the project.
- Partner with the Senior Leadership team and assist in project management responsibilities, i.e. project planning, staffing management, people growth, etc.
- Work in tandem with global counterparts in planning and accomplishing planned activities, identification of risks, and mitigation strategy.
- Build relationships with client stakeholders and lead presentations related to project deliverables, design brainstorming/discussions, status updates, innovation/improvements, etc.
- Collaborate with other ZS internal expertise teams (Architects, Validation/Testing, etc.) to ensure a best-in-class technology solution.
- Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team.
- Liaise with staffing partners and HR business partners for team building/planning.
- Assist Senior Leadership in building a POV on new technology or problem solving, and innovate to build firm intellectual capital.
- Lead project deliverables such as business case development, solution vision and design, user requirements, solution mockups, prototypes, technical architecture, test cases, deployment plans, operations strategy and planning, etc.
- Actively lead unstructured problem solving to design and build complex solutions, tuned to meet expected performance and functional requirements.
- Lead appropriate documentation of systems design, procedures, SOPs, etc.
- Build cloud applications using serverless technologies, such as custom web applications, ETL pipelines, real-time/stream analytics applications, etc.
- Leverage expertise and experience in both traditional and modern data architecture and processing concepts, including relational databases.

What you'll bring:
- Bachelor's/Master's degree with specialization in Computer Science, MIS, IT, or other computer-related disciplines.
- 5+ years of relevant consulting-industry experience working on medium-to-large-scale technology solution delivery engagements.
- 5+ years of hands-on experience designing and implementing data processing/data management solutions.
- Strong expertise in creating High-Level and Detailed Design documents.
- Good handle on working with distributed computing and cloud services platforms, including (but not limited to) AWS, Azure, and GCP.
- Experience working in an Agile delivery framework, with the ability to mentor and coach the team to follow Agile best practices.
- Expertise in one of the programming languages like Python, Scala, etc., and the ability to review code created by developers.
- Expertise in commonly used AWS services (or equivalent services in Azure) is preferred: EMR, Glue, EC2, Glue ETL, Managed Airflow, S3, Lake Formation, SageMaker Studio, Athena, Redshift, RDS, AWS Neptune.
- Experience building project plans/sprint plans for the technical team, estimating project timelines and effort, distributing work to junior team members, and tracking project progress.
- Ability to lead project teams in driving end-to-end activities to meet set milestones, and to provide the necessary mentorship and guidance for the team's growth.
- Comfort with all the SDLC documentation required to support technical delivery, and the ability to work with relevant teams to build these SDLC documents.

Additional skills:
- Capable of managing a virtual global team for the timely delivery of multiple projects.
- Experienced in analyzing and troubleshooting interactions between databases, operating systems, and applications.
- Willing to travel to global offices as required to collaborate with clients and internal project teams.

Posted 4 days ago

Apply

10.0 - 14.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Join our digital revolution in NatWest Digital X. In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter. Our people work differently depending on their jobs and needs, from hybrid working to flexible hours; we have plenty of options that help our people to thrive. This role is based in India, and as such all normal working days must be carried out in India.

Job Description
Join us as a Payments Data Principal Engineer. As a key member of our Payments Technology senior leadership team, you'll oversee the safe and high-quality delivery of changes across diverse payment technology environments, with a strong emphasis on data engineering and Payments data products. Your focus will extend to the delivery of changes from a people, process, and technology perspective, with the aim of embedding a new customer- and product-led delivery model that promotes engineering excellence, including robust data architecture and modelling practices. This role offers you the opportunity to enhance your career profile and forge long-lasting relationships with a variety of stakeholders, including senior management, through effective data-driven decision-making. We're offering this role at director level.

What you'll do
As a Senior Principal Engineer, you'll be accountable for the efficiency, stability, enhancement, responsiveness, performance, security, support, life-cycling, and maintenance of Payments Technology's systems, applications, utilities, tools, and services, with a particular focus on the integrity and usability of data assets. Day-to-day, you'll govern the implementation of and adherence to standards, principles, and policies, leading various disciplines. You'll collaborate closely with your peers to ensure the utilisation of optimal technical solutions within Payments and across the bank, particularly in relation to data management and analytics. With a fine blend of hands-on and hands-off engineering, you'll coordinate, support, and guide your teams to design and engineer software, scripts, and tools that are customer-centric, high-performing, secure, and robust. And with a sharp focus on opportunities for automation, you'll create efficiency where it doesn't already exist by exploring and anticipating the impact of strategic challenges, and developing the proposed strategy, architecture, and roadmap to drive them forward.

You'll also be responsible for:
- Developing and executing a cohesive strategy that aligns product, design, and business priorities, and converting these into technological solutions
- Empowering and enabling your teams to grow by facilitating discussions, sharing insights, driving collaboration, and guiding the direction of work
- Working with software engineers to prototype innovative ideas, and engaging with our architects to validate and leverage them across the bank
- Developing design patterns, advocating our design principles and methodologies, and signposting the latest trends, technologies, and tools
- Enhancing our software engineering capability and inspiring our community of engineers to fulfil their potential by facilitating internal mobility, shaping career paths, and coaching talent

The skills you'll need
You'll bring a wealth of practical experience and technical knowledge to this role. From knowledge of our applications, interfaces, services, and platforms, to hands-on experience developing and implementing deployment patterns, application tooling, and legacy and industry-leading technologies, you'll have the knowledge, skills, and insights to support and guide your teams to engineer innovative, value-adding solutions. And by harnessing your ability to engage and rally people around a tech cause, you'll enable success at both a team and executive level and make a long-lasting impact through your work.

You'll demonstrate:
- Excellent leadership and management skills
- Significant experience of leading software development teams, executing strategies, and implementing programming best practice
- Expertise in multiple high-level programming languages
- Great interpersonal and relationship-building skills
- Good communication skills

Posted 4 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Join us as a Data & Analytics Analyst. This is an opportunity to take on a purpose-led role in a cutting-edge Data & Analytics team. You'll be consulting with our stakeholders to understand their needs and identify suitable data and analytics solutions to meet them, along with business challenges, in line with our purpose. You'll bring advanced analytics to life through visualisation to tell powerful stories and influence important decisions for key stakeholders, giving you excellent recognition for your work. We're offering this role at associate vice president level.

What you'll do
As a Data & Analytics Analyst, you'll be driving the use of advanced analytics in your team to develop business solutions which increase the understanding of our business, including its customers, processes, channels, and products. You'll be working closely with business stakeholders to define detailed, often complex and ambiguous business problems or opportunities which can be supported through advanced analytics, making sure that new and existing processes are designed to be efficient, simple, and automated where possible.

As well as this, you'll be:
- Leading and coaching your colleagues to plan and deliver strategic project and scrum outcomes
- Planning and delivering data and analytics resource, expertise, and solutions which bring commercial and customer value to business challenges
- Communicating data and analytics opportunities and bringing them to life in a way that business stakeholders can understand and engage with
- Adopting and embedding new tools, technologies, and methodologies to carry out advanced analytics
- Developing strong stakeholder relationships to bring together advanced analytics, data science, and data engineering work that is easily understandable and links back clearly to our business needs

The skills you'll need
We're looking for someone with a passion for data and analytics, together with knowledge of data architecture, key tooling, and relevant coding languages. Along with advanced analytics knowledge, you'll bring an ability to simplify data into clear data visualisations and compelling insight using appropriate systems and tooling.

You'll also demonstrate:
- Strong knowledge of data management practices and principles
- Experience of translating data and insights for key stakeholders
- Good knowledge of data engineering, data science, and decisioning disciplines
- Data analysis skills to identify control gaps and measure data quality against standards
- Automation and dashboarding of data quality metrics
- Practical knowledge of implementing data quality tools
- Working experience with SQL and data cloud environments
- Good understanding of risk and controls frameworks

Candidates must possess 8-10 years of experience.
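The "measuring data quality against standards" skill mentioned above can be sketched minimally in Python; the records, the completeness metric, and the 90% threshold are all illustrative assumptions, not this employer's actual standard:

```python
# Minimal data-quality metric sketch: measure field completeness
# against an assumed standard and flag control gaps.
records = [
    {"account_id": "A1", "email": "a@example.com"},
    {"account_id": "A2", "email": None},
    {"account_id": None, "email": "c@example.com"},
]

def completeness(rows, field):
    """Share of rows where `field` is populated (a basic DQ metric)."""
    return sum(r[field] is not None for r in rows) / len(rows)

metrics = {f: completeness(records, f) for f in ("account_id", "email")}

# Flag any field that falls below an assumed 90% completeness standard.
gaps = [f for f, score in metrics.items() if score < 0.9]
```

In practice such metrics would be computed in SQL against the warehouse and fed into a dashboard, with further dimensions (validity, uniqueness, timeliness) measured the same way.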

Posted 4 days ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Purpose of this Role
You are responsible for defining commercially aware and technically astute solutions that both align to and inform architectural direction, while balancing the typical constraints evident in project delivery. Your role is embedded within the Cigna International Markets Architecture function, which works collaboratively with senior stakeholders to define strategic direction and, thereafter, ensure that intent is reflected in business solutions. You will be comfortable leading and defining effective business solutions within complex project environments, demonstrating the maturity to build strong working relationships across Business, IT, and 3rd-party stakeholders.

Main Duties / Responsibilities
- Perform key enterprise-wide Data Architecture responsibilities within International Markets, focusing on our on-premise and cloud solution deployments.
- Proactively engage across Business, IT, and 3rd-party stakeholders to ensure that the business investment delivers cost-effective and appropriate data-driven solutions for the organization.
- Assist sponsors in the creation of rounded and compelling business cases for change.
- Work with Solution Architects to drive the definition of the data solution design, mapping business and technical requirements to define data assets that meet both business and operational expectations.
- Own and manage data models and data design artefacts, and provide guidance and consultancy on best practice and standards for customer-focused data delivery and data management practices.
- Be an advocate for data-driven design within an agile delivery framework.
- Actively participate in the full project lifecycle, from early shaping of high-level estimates and delivery plans through to active governance of the solution as it is developed and built in later phases.
- Capture and manage risks, issues, and assumptions identified through the lifecycle, articulating the financial and other impacts associated with these concerns.
- Take a lead role in the selection of 3rd-party solutions, developing successful partner relationships where required.
- Maintain an active awareness of emerging trends and developments in data design, architecture, and enterprise technology that could impact or benefit our business and our customers.
- Provide high-level mentoring of design and development teams to embed data architecture concepts, principles, and best practices.

Skills & Experience
- 10 years of IT experience, including 5 years in a Data Architecture or Data Design role, is required.
- Experience leading data design and delivering significant assets to an organization, e.g. a Data Warehouse, Data Lake, or Customer 360 Data Platform.
- Demonstrable experience in some of the following data capabilities: data modelling, database design (operational and/or analytical use cases), data migration, data quality management, metadata management, domain-driven design, and data integration, with a preference for ETL/ELT and data streaming experience.
- Preferred toolsets and platforms: AWS, SQL Server, Qlik toolsets, Collibra.
- A track record of working successfully in a globally dispersed team would be beneficial.
- Breadth of experience and technical acumen across application, infrastructure, security, service management, business process, and architecture capabilities.
- Highly collaborative, with a desire to work with a broad range of stakeholders to achieve agreement on solutions that drive benefits for our customers and businesses.
- Commercial awareness incorporating financial planning and budgeting.

About The Cigna Group
Cigna Healthcare, a division of The Cigna Group, is an advocate for better health through every stage of life. We guide our customers through the health care system, empowering them with the information and insight they need to make the best choices for improving their health and vitality. Join us in driving growth and improving lives.

Posted 4 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Responsibilities: Design and implement data architecture using Snowflake, Python, Salesforce, and Informatica. Ensure data security and compliance standards are met. Collaborate with cross-functional teams on project delivery.

Posted 4 days ago

Apply

9.0 - 13.0 years

9 - 15 Lacs

Chennai

Work from Office

Petrofac is a leading international service provider to the energy industry, with a diverse client portfolio including many of the world's leading energy companies. We design, build, manage, and maintain infrastructure for our clients. We recruit, reward, and develop our people based on merit, regardless of race, nationality, religion, gender, age, sexual orientation, marital status, or disability. We value our people and treat everyone who works for or with Petrofac fairly and without discrimination. The world is re-thinking its energy supply and energy security needs and planning for a phased transition to alternative energy sources. We are here to help our clients meet these evolving energy needs. This is an exciting time to join us on this journey. Are you ready to bring the right energy to Petrofac and help us deliver a better future for everyone?

JOB TITLE: Data Engineer

KEY RESPONSIBILITIES:
- Architect and define data flows for big data/data lake use cases.
- Apply excellent knowledge of the full life cycle of data management principles, such as data governance, architecture, modelling, storage, security, master data, and quality.
- Act as a coach and provide consultancy services and advice to data engineers by offering technical guidance and ensuring architecture principles, design standards, and operational requirements are met.
- Participate in the Technical Design Authority forums.
- Collaborate with analytics and business stakeholders to improve data models that feed BI tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Work with a team of data engineers to deliver tasks and achieve weekly and monthly goals, and guide the team to follow best practices and improve its deliverables.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Estimate cluster and core sizes, and monitor and troubleshoot the Databricks cluster and analysis server to produce optimal capacity for computing data ingestion.
- Deliver master data cleansing and improvement efforts, including automated and cost-effective solutions for processing, cleansing, and verifying the integrity of data used for analysis.
- Secure the big data environment, including encryption, tunnelling, access control, and secure isolation.
- Guide and build highly efficient OLAP cubes using data modelling techniques to cater to all the required business cases and mitigate the limitations of Power BI in the analysis service.
- Deploy and maintain highly efficient CI/CD DevOps pipelines across multiple environments, such as dev, staging, and production.
- Strictly follow a scrum-based Agile approach to development, working from allocated stories.
- Apply comprehensive knowledge of extracting, transforming, and loading data from various sources like Oracle, Hadoop HDFS, flat files, JSON, Avro, Parquet, and ORC.
- Experience defining, implementing, and maintaining a global data platform.
- Experience building robust and impactful data visualisation solutions and gaining adoption.
- Extensive work experience onboarding various data sources using real-time, batch, or scheduled loads; sources may be in the cloud, on premise, SQL databases, NoSQL databases, or API-based.
- Expertise in extracting data through JSON, OData, REST APIs, web services, and XML.
- Expertise in data ingestion platforms such as Apache Sqoop, Apache Flume, Amazon Kinesis, Fluentd, Logstash, etc.
- Hands-on experience using Databricks, Pig, Scala, Hive, Azure Data Factory, Python, and R.
- Operational experience with big data technologies and engines, including Presto, Spark, Hive, and Hadoop environments.
- Experience with various databases, including Azure SQL DB, Oracle, MySQL, Cosmos DB, and MongoDB.
- Experience supporting and working with cross-functional teams in a dynamic environment.

ESSENTIAL QUALIFICATION & SKILLS:
- Bachelor's degree (master's preferred) in Computer Science, Engineering, or another technology-related field.
- 10+ years of experience with data analytics platforms and hands-on experience with ETL and ELT transformations, with strong SQL programming knowledge.
- 5+ years of hands-on experience in big data engineering, distributed storage, and processing massive data into a data lake using Scala or Python.
- Proficient knowledge of the Hadoop and Spark ecosystems: HDFS, Hive, Sqoop, Oozie, Spark core, and streaming.
- Experience with programming languages such as Scala, Java, Python, and shell scripting.
- Proven experience pulling data through REST APIs, OData, XML, and web services.
- Experience with Azure product offerings and the Azure data platform.
- Experience in data modelling (data marts, snowflake/star schemas, normalization, SCD2).
- Ability to architect and define data flows and build highly efficient, scalable data pipelines.
- Ability to work in tandem with the Enterprise and Domain Architects to understand the business goals and vision, and to contribute to the Enterprise Roadmaps.
- Strong troubleshooting and problem-solving skills for any issues stopping business progress.
- Ability to coordinate with multiple business stakeholders to understand requirements and deliver.
- Ability to conduct a continuous audit of data management system performance, refine it whenever required, and immediately report any breach or loophole to stakeholders.
- Ability to allocate tasks to team members, track status, and report on activities to management.
- Understanding of the physical and logical plans of execution, with the ability to optimize the performance of data pipelines.
- Extensive background in data mining and statistical analysis.
- Ability to understand various data structures and common methods of data transformation.
- Ability to work with ETL tools, with strong knowledge of ETL concepts.
- Strong focus on delivering outcomes.
- Data management: modelling, normalisation, cleaning, and maintenance.
- Understanding of data architectures and data warehousing principles, with the ability to participate in the design and development of conventional data warehouse solutions.
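The SCD2 (slowly changing dimension, type 2) modelling called out in the qualifications above can be illustrated with a minimal Python sketch; the dimension record, its fields, and the dates are hypothetical:

```python
from datetime import date

# Minimal SCD Type 2 sketch: when a tracked attribute changes, expire the
# current record and append a new version, preserving history.
dim_customer = [
    {"id": 1, "city": "Chennai", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def scd2_update(dim, key, new_city, as_of):
    for row in dim:
        if row["id"] == key and row["is_current"]:
            if row["city"] == new_city:
                return  # attribute unchanged: nothing to version
            row["valid_to"] = as_of    # close the old version
            row["is_current"] = False
    # Open the new current version
    dim.append({"id": key, "city": new_city, "valid_from": as_of,
                "valid_to": None, "is_current": True})

scd2_update(dim_customer, 1, "Bengaluru", date(2024, 6, 1))
current = [r for r in dim_customer if r["is_current"]]
```

In a warehouse the same logic is usually expressed as a MERGE against the dimension table, often with a surrogate key per version so fact tables can join to the state that was current at load time.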

Posted 4 days ago

Apply

3.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Technical Architect will have to be execution-focused, supporting the full implementation lifecycle, from scoping to deployment in an evolving ecosystem consisting of clients and partners. You will be responsible for successfully solutioning the enterprise application E2E, designing and building the Salesforce Industry-specific Life Sciences, Health industry solutions. As a Technical Architect, you will become a deep product expert with Industry Salesforce Life Sciences Cloud (including Health Cloud) applications, and work closely with our sales and delivery teams to ensure customer success. You will lead the technical design and implementation of Salesforce solutions, ensuring compliance with industry regulations (HIPAA, GxP, etc.) and alignment with business goals. The ideal candidate combines deep Salesforce expertise with a strong domain understanding of pharmaceuticals, biotech, medtech, or healthcare providers. You will maintain an ongoing comprehensive understanding of the cloud-computing ecosystem. Your responsibilities will include serving as the technical lead for Salesforce Life Sciences Cloud implementations, including Health Cloud and other related Salesforce products. You will architect and design scalable, secure, and high-performance Salesforce solutions that align with industry best practices and compliance standards (e.g., HIPAA, PHI, GxP). You will define and oversee data models, security architecture, API integrations (FHIR/HL7), and performance optimization strategies. Leading a team of developers, you will drive optimized solutions across multi-cloud implementations, ensuring seamless integration between Health Cloud, Service Cloud, and Experience Cloud and Omnistudio framework. You will lead functional and technical workshops, demonstrating leadership skills in designing, delivering, testing, and deploying. 
Expertise in User Journey preparations, User Story reviews, Data Modeling, Apex Design Patterns, LWC, and other modern UI techniques will be essential. As a trusted advisor to the client, you will drive conversations with their Enterprise Architects and business partners that shape the architectural vision and establish a program architectural roadmap. Guiding customers, partners, and implementation teams on how best to implement digital transformation with the Salesforce platform using Salesforce Industries will be a key responsibility. Establishing trust with customers' leadership, promoting and/or implementing standard processes with SFI and Salesforce, and building out sophisticated business processes using native Salesforce Industries technology and the toolkit of the platform and integration tools are also critical tasks. You will research, recommend, and implement AppExchange applications and Salesforce upgrades to help meet business needs. Successfully creating custom enterprise applications using Salesforce.com and integrating Salesforce.com with other enterprise systems will be part of your role. Working closely with Delivery Managers, Solution Architects, and directly with clients to architect technology solutions to meet client needs is also expected. Proactively highlighting and leading risk areas in the solution, and committing to seeing an issue through to completion, will be essential for this position.

Qualifications:
- 10+ years of experience in developing technology solutions.
- 3+ years of experience handling client-facing projects in positions of increasing responsibility, in the context of systems development and related business consulting.
- Expertise in one or more of Salesforce Health Cloud, Sales/Service/Experience Cloud, and Vlocity OmniStudio.
- Proven experience architecting enterprise-level Salesforce solutions in the life sciences or healthcare industry, or with Veeva products.
- Strong understanding of compliance and regulatory requirements, including HIPAA, GxP, PHI, HL7, and FHIR.
- Experience in the healthcare domain (preferred)
- Integration Architecture (must have)
- Platform Security (must have)
- Identity and Access Management / Integration Security
- Sharing and Visibility (must have)
- Data Architecture and Management (must have)
- Architectural Design Patterns (must have)
- Apex Design Patterns (must have)
- Salesforce/Apex, Triggers, Lightning Flows, Lightning, LWC, and experience with modern web and mobile technologies (HTML, CSS, JavaScript, Web Components, others)
- Salesforce Certification preferred (Admin, Developer, Sales and Service Clouds, Application Architect, System Architect)
- OmniStudio Developer/Consultant
- Health Cloud Certification (good to have)

Posted 4 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

About the Team As a part of the DoorDash organization, you will be joining a data-driven team that values timely, accurate, and reliable data to make informed business and product decisions. Data serves as the foundation of DoorDash's success, and the Data Engineering team is responsible for building database solutions tailored to various use cases such as reporting, product analytics, marketing optimization, and financial reporting. By implementing robust data structures and data warehouse architecture, this team plays a crucial role in facilitating decision-making processes at DoorDash. Additionally, the team focuses on enhancing the developer experience by developing tools that support the organization's high-velocity demands. About the Role DoorDash is seeking a dedicated Data Engineering Manager to lead the development of enterprise-scale data solutions. In this role, you will serve as a technical expert on all aspects of data architecture, empowering data engineers, data scientists, and DoorDash partners. Your responsibilities will include fostering a culture of engineering excellence, enabling engineers to deliver reliable and flexible solutions at scale. Furthermore, you will be instrumental in building and nurturing a high-performing team, driving innovation and success in a dynamic and fast-paced environment. In this role, you will: - Lead and manage a team of data engineers, focusing on hiring, building, growing, and nurturing impactful business-focused data teams. - Drive the technical and strategic vision for embedded pods and foundational enablers to meet current and future scalability and interoperability needs. - Strive for continuous improvement of data architecture and development processes. - Balance quick wins with long-term strategy and engineering excellence, breaking down large systems into user-friendly data assets and reusable components. - Collaborate cross-functionally with stakeholders, external partners, and peer data leaders. 
- Utilize effective planning and execution tools to ensure short-term and long-term team and stakeholder success. - Prioritize reliability and quality as essential components of data solutions. Qualifications: - Bachelor's, Master's, or Ph.D. in Computer Science or equivalent field. - Over 10 years of experience in data engineering, data platform, or related domains. - Minimum of 2 years of hands-on management experience. - Strong communication and leadership skills, with a track record of hiring and growing teams in a fast-paced environment. - Proficiency in programming languages such as Python, Kotlin, and SQL. - Prior experience with technologies like Snowflake, Databricks, Spark, Trino, and Pinot. - Familiarity with the AWS ecosystem and large-scale batch/real-time ETL orchestration using tools like Airflow, Kafka, and Spark Streaming. - Knowledge of data lake file formats including Delta Lake, Apache Iceberg, Glue Catalog, and S3. - Proficiency in system design and experience with AI solutions in the data space. At DoorDash, we are dedicated to fostering a diverse and inclusive community within our company and beyond. We believe that innovation thrives in an environment where individuals from diverse backgrounds, experiences, and perspectives come together. We are committed to providing equal opportunities for all and creating an inclusive workplace where everyone can excel and contribute to our collective success.

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Lead Cloud Engineer at our organization, you will be responsible for designing and building cloud-based distributed systems to address complex business challenges for some of the world's largest companies. Leveraging your expertise in software engineering, cloud engineering, and DevOps, you will craft technology stacks and platform components that empower cross-functional AI Engineering teams to develop robust, observable, and scalable solutions. Working as part of a diverse and globally distributed engineering team, you will actively engage in the complete engineering life cycle, encompassing the design, development, optimization, and deployment of solutions and infrastructure at a scale that matches the world's leading companies.

Your core responsibilities will include:
- Architecting cloud solutions and distributed systems for full-stack AI software and data solutions
- Implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions, covering areas such as CI/CD, data integrations, APIs, web and mobile apps, and AI solutions
- Defining and implementing scalable, observable, manageable, and self-healing cloud-based solutions across AWS, Google Cloud, and Azure
- Collaborating with diverse teams, including product managers, data scientists, and other engineers, to deliver analytics and AI features that align with business requirements and user needs
- Utilizing Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in the cloud, ensuring optimal performance and availability
- Developing and maintaining APIs and microservices to expose analytics functionality to internal and external consumers while adhering to best practices for API design and documentation
- Implementing robust security measures to safeguard sensitive data and ensure compliance with data privacy regulations and organizational policies
- Monitoring and troubleshooting application performance continuously to identify and resolve issues affecting system reliability, latency, and user experience
- Participating in code reviews and contributing to the establishment and enforcement of coding standards and best practices to uphold the quality and maintainability of the codebase
- Staying abreast of emerging trends and technologies in cloud computing, data analytics, and software engineering to identify opportunities for enhancing the analytics platform's capabilities
- Collaborating closely with business consulting staff and leaders to assess opportunities and develop analytics solutions for clients across various sectors

To excel in this role, you should possess the following qualifications:
- A Master's degree in Computer Science, Engineering, or a related technical field
- At least 6 years of experience, with a minimum of 3 years at the Staff level or equivalent
- Proven experience as a cloud engineer and software engineer in product engineering or professional services organizations
- Experience in designing and delivering cloud-based distributed solutions, with certifications in GCP, AWS, or Azure considered advantageous
- Proficiency in building infrastructure as code using tools such as Terraform (preferred), CloudFormation, Pulumi, AWS CDK, or CDKTF
- Familiarity with software development lifecycle nuances
- Experience with configuration management tools like Ansible, Salt, Puppet, or Chef
- Proficiency in monitoring and analytics platforms such as Grafana, Prometheus, Splunk, SumoLogic, New Relic, Datadog, CloudWatch, or Nagios/Icinga
- Expertise in CI/CD deployment pipelines (e.g., GitHub Actions, Jenkins, Travis CI, GitLab CI, CircleCI)
- Hands-on experience in building backend APIs, services, and integrations using Python
- Practical experience with Kubernetes through services like GKE, EKS, or AKS considered a plus
- Ability to collaborate effectively with internal and client teams and stakeholders
- Proficiency in using Git for versioning and collaboration
- Exposure to LLMs, prompt engineering, and LangChain considered advantageous
- Experience with workflow orchestration tools such as dbt, Beam, Airflow, Luigi, Metaflow, Kubeflow, or others
- Proficiency in implementing large-scale structured or unstructured databases, orchestration, and container technologies like Docker or Kubernetes
- Strong interpersonal and communication skills to articulate and discuss complex engineering concepts with colleagues and clients from diverse disciplines
- Curiosity, proactivity, and critical thinking in problem-solving
- Solid foundation in computer science principles related to data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the impact of computer architecture on software performance
- Knowledge of designing API interfaces and data architecture, database schema design, and database scalability
- Familiarity with Agile development methodologies

If you are seeking a dynamic and challenging opportunity to contribute to cutting-edge projects and collaborate with a diverse team of experts, we invite you to join us at Bain & Company. As a global consultancy dedicated to partnering with change makers worldwide, we are committed to achieving extraordinary results, outperforming the competition, and reshaping industries. With a focus on delivering tailored, integrated solutions and leveraging a network of digital innovators, we strive to drive superior outcomes that endure. Our ongoing investment in pro bono services underscores our dedication to supporting organizations addressing pressing issues in education, racial equity, social justice, economic development, and the environment. Recognized with a platinum rating from EcoVadis, we are positioned in the top 1% of all companies for our environmental, social, and ethical performance. Since our inception in 1973, we measure our success by the success of our clients and maintain the highest level of client advocacy in the industry.

Posted 4 days ago

Apply

12.0 - 16.0 years

0 Lacs

delhi

On-site

As a Data Architect in our organization, you will play a crucial role in defining the data architecture for key domains within the Data Products Portfolio. Your responsibilities will include evaluating data-related tools and technologies, recommending implementation patterns, and standard methodologies to ensure our data ecosystem remains modern. Collaborating with Enterprise Data Architects, you will establish and adhere to enterprise standards while conducting PoCs to ensure their implementation. Your expertise will be instrumental in providing technical guidance and mentorship to Data Engineers and Data Analysts, developing and maintaining processes, standards, policies, guidelines, and governance to ensure consistency across the company. You will create and maintain conceptual/logical data models, work with business and IT teams to understand data requirements, and maintain a data dictionary with table and column definitions. Additionally, you will review data models with technical and business audiences and lead the design/build of new models to deliver financial results efficiently to senior management.

This role is primarily technical, requiring you to function as an individual contributor (80%) while also demonstrating leadership capabilities (20%). Your key responsibilities include designing, documenting, and training the team on overall processes and process flows for data architecture, resolving technical challenges in critical situations, developing relationships with external stakeholders, reviewing work from other tech team members, and implementing data architecture and data security policies aligned with governance objectives and regulatory requirements.

**Essential Education**
- A Bachelor's degree in information science, data management, computer science, or a related field is preferred.

**Experience & Qualifications**
- Bachelor's degree or equivalent combination of education and experience.
- 12+ years of IT experience with a major focus on data warehouse/database-related projects.
- Expertise in cloud databases like Snowflake/Redshift, data catalogs, MDM, etc.
- Proficiency in SQL, database procedures, data modelling (conceptual, logical, and physical), and documenting architecture-related work.
- Hands-on experience in data storage, ETL/ELT, data analytics tools and technologies, data warehousing design/development, and BI/analytical systems.
- Experience with cloud big data technologies such as AWS, Azure, GCP, and Snowflake.
- Experience with Python is preferable.
- Strong hands-on experience with data and analytics data architecture, solution design, and engineering.
- Experience working with Agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams.
- Strong communication and presentation skills for presenting architecture, features, and solution recommendations.

You will work closely with global functional portfolio technical leaders, product owners, functional area teams, Global Data portfolio Management & teams, and consulting and internal Data Tribe teams across the organization.
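The conceptual-to-physical modelling progression this role calls for ends in DDL like the following: a minimal star schema with one fact table keyed to two dimensions. The sales domain and all table/column names here are illustrative assumptions, shown in SQLite purely so the sketch is self-contained:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Physical model: surrogate-keyed dimensions, a fact table of additive measures.
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT NOT NULL,
    year      INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    category    TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    units       INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
""")
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240115, "2024-01-15", 2024)])
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(20240115, 1, 3, 30.0), (20240115, 2, 1, 15.0)])

# A typical analytical query: slice fact measures by a dimension attribute.
by_category = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchall()
```

The dimensional shape is what makes reporting queries like the last one simple joins rather than ad hoc denormalization.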

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You are Kenvue, a company dedicated to the power of everyday care and rooted in a rich heritage and scientific expertise. With iconic brands like NEUTROGENA, AVEENO, TYLENOL, LISTERINE, JOHNSON'S, and BAND-AID, you are committed to delivering the best products to customers globally. As a Kenvuer, you are part of a diverse team of 22,000 individuals focused on insights, innovation, and making a positive impact on millions of lives daily.

As a Senior Data Modeler at Kenvue Data Platforms, based in Bengaluru, you will collaborate with various teams including business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) to drive innovative data products for end users. Your role involves developing solution architectures, defining data models, and ensuring the acquisition, ingestion processes, and reporting requirements are met efficiently.

Key Responsibilities:
- Provide expertise in data architecture and modeling to build next-generation product capabilities that drive business growth.
- Collaborate with Business Analytics leaders to translate business needs into optimal architecture designs.
- Design scalable and reusable data models adhering to FAIR principles for different functional areas.
- Work closely with data engineers, solution architects, and stakeholders to optimize data models.
- Create and maintain metadata rules, data dictionaries, and lineage details for data models.

Qualifications:
- Undergraduate degree in Technology, Computer Science, or related fields; advanced degree preferred.
- Strong interpersonal and communication skills to effectively collaborate with various stakeholders.
- 3+ years of experience in data architecture & modeling in Consumer/Healthcare Goods companies.
- 5+ years of progressive experience in Data & Analytics initiatives.
- Hands-on experience in cloud architecture (Azure, GCP, AWS) and cloud-based databases.
- Expertise in SQL, Erwin / ER Studio, data modeling techniques, and methodologies.
- Familiarity with NoSQL and graph databases, and data catalogs.
- Experience in Agile methodology (Scrum/Kanban) within a DevSecOps model.
- Proven track record of contributing to high-profile projects with changing requirements.

Join Kenvue in shaping the future and making a difference in the world of data and analytics. Proud to be an equal opportunity employer, Kenvue values diversity and inclusion in its workforce.

Location: Bangalore, India
Job Function: Digital Product Development
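The data dictionary maintenance this role mentions is often bootstrapped by introspecting the physical schema. The sketch below does that against SQLite's catalog; the `customer` table is an illustrative assumption, and a real modeling tool (Erwin, ER Studio, or a data catalog) would enrich these rows with business definitions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    email       TEXT NOT NULL,
    region      TEXT
)""")

def data_dictionary(conn):
    # Introspect every user table and emit (table, column, type, nullable)
    # tuples: the raw skeleton of a data dictionary.
    entries = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
        for _cid, name, col_type, notnull, *_ in conn.execute(
                f"PRAGMA table_info({table})"):
            entries.append((table, name, col_type, not notnull))
    return entries

dictionary = data_dictionary(conn)
```

From here, each entry would be paired with a human-written column definition and ownership metadata before publication.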

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

Spiro, a global leader in the clean energy sector, is dedicated to redefining efficiency, quality, and innovation at every level. In line with our commitment to expansion and innovation, we are seeking passionate and results-driven professionals to join us on this transformative journey. As we continue to grow, we are looking for experienced Software experts to lead and champion process excellence across multiple countries and functions. This presents an exciting opportunity to drive change, contribute to a culture of continuous improvement, and play a significant role in Spiro's growth and success. As part of our team, you will have access to a range of benefits, including opportunities for continuous learning and professional growth. You will be immersed in a dynamic and collaborative environment that fosters creativity and innovation. Moreover, you will have the chance to lead technological advancements across our global operations. If you are ready to be part of a forward-thinking company and help us elevate Spiro to new heights, we invite you to apply now! Currently, we are looking to fill the position of Data Engineering & Ontology Head. The Data Engineering Head (or Data Engineering Lead/Manager) will be responsible for overseeing a team of data engineers who focus on building, maintaining, and optimizing the infrastructure required for collecting, storing, and analyzing data. Key responsibilities include managing data pipelines, defining data architecture, optimizing data systems, collaborating with stakeholders, and overseeing the implementation of relevant tools like ETL tools and big data technologies. On the other hand, the Data Ontology Head (or Data Ontology Manager) will concentrate on organizing and structuring data through a formal system that defines data relationships and categories. This role involves developing data models to ensure a consistent understanding of data across the organization. 
Key responsibilities include creating data ontologies, ensuring data governance, managing metadata, collaborating with stakeholders, and establishing data standards and best practices. To excel in these roles, candidates should demonstrate proficiency in big data tools (such as Hadoop, Spark, Kafka), cloud platforms (AWS, Azure, GCP), and programming languages (Python, Java, Scala, SQL). Leadership skills are essential, including experience in managing teams, setting technical strategy, and driving agile processes. A minimum of 12 years of total experience, with at least 8 years of relevant experience, is required. Additionally, knowledge of data lakes, warehouses, and pipeline orchestration tools, and the ability to communicate technical concepts to non-technical stakeholders, are crucial for success in these positions.

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

The Consumer Services Architect role involves supporting the consumer Domain EA in defining the global north star architecture for Marketing & Direct To Consumer domains. This includes establishing capabilities, technology standards, and ensuring alignment with enterprise standards. The role also entails defining architecture for consumer programs, providing guidance to sector EAs for sector-specific initiatives, reducing IT costs and complexity, and ensuring the long-term viability of solutions. Responsibilities include supporting Domain EAs in defining north star reference architectures and technology strategies for the domain. The role involves co-leading the evaluation and selection of solutions/platforms to meet global/sector needs, evangelizing global reference architectures and tech standards, shaping technology strategy for Sales Transformation & Commercial initiatives, and delivering consistent solution architecture for key transformation initiatives. Additionally, the role requires understanding new demand to improve Commercial Services and guiding/supporting sector Commercial architects in defining solution architecture for sector-specific projects. Qualifications for this role include a Bachelor's Degree in Information Technology, Computer Science, MIS, or a similar field (Masters preferred). Candidates should have over 10 years of software industry experience, with at least 5 years of solution architecture experience. In-depth knowledge and practical experience of primary Commercial business processes, B2B sales & order processes, customer loyalty & engagement, and Customer Relationship Management processes are essential. Candidates should also be familiar with data & analytics, integration architecture patterns, key vendor software packages, and have the ability to comprehend business strategies and translate them into technology strategies. 
Strong thought leadership, executive storytelling skills, and the ability to produce high-quality deliverables are also important for this role.

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

bhubaneswar

On-site

As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your role will involve collaborating with the team to ensure project progress and providing solutions that align with business needs and application specifications. You are expected to be a subject matter expert (SME) and lead the team in implementing innovative solutions.

Key Responsibilities:
- Collaborating with and managing the team to perform effectively
- Making team decisions and contributing to key decisions across multiple teams
- Providing solutions to problems within your team and across various teams
- Conducting regular team meetings to ensure project progress
- Staying updated on industry trends and technologies

Professional & Technical Skills Required:
- Proficiency in Stibo Product Master Data Management
- Strong understanding of data modeling and data architecture
- Experience in data integration and data migration
- Hands-on experience in application development and customization
- Knowledge of data governance and data quality management

A minimum of 7.5 years of experience in the field is required, along with 15 years of full-time education. This position is based in Bhubaneswar.

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

kochi, kerala

On-site

As a skilled professional in ETL testing and data warehousing, your primary responsibility will be to design and execute test plans, test cases, and test scripts for ETL processes. You will be tasked with performing data validation and verification to ensure data integrity and accuracy. It will also be your duty to identify, document, and track defects and issues in the ETL processes, collaborating closely with data engineers and developers to troubleshoot and resolve data-related issues. Your role will also involve participating in requirement analysis and providing valuable feedback on data quality and testing requirements. Additionally, you will be expected to generate and maintain test documentation and reports to ensure comprehensive and accurate records.

To excel in this position, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field. You should have 4-6 years of experience specifically in ETL testing and data warehousing, with a strong knowledge of ETL tools and processes. Proficiency in SQL and database management systems is essential, along with familiarity with data modeling and data architecture concepts. If you are passionate about ensuring the quality and accuracy of data through meticulous testing processes, and possess the relevant qualifications and experience, we encourage you to apply for this challenging and rewarding opportunity.
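The data validation and verification work described above typically reduces to a handful of SQL checks run after each load. The sketch below shows three common ones (row-count reconciliation, null keys in the target, and source keys missing from the target); the customer tables and their contents are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Illustrative source and target of an ETL load; schema and rows are assumptions.
conn.executescript("""
CREATE TABLE src_customer (id INTEGER, email TEXT);
CREATE TABLE tgt_customer (id INTEGER, email TEXT);
INSERT INTO src_customer VALUES (1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
INSERT INTO tgt_customer VALUES (1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
""")

def validate(conn, src, tgt, key="id"):
    # Post-load checks; each failed check becomes a defect to document and track.
    issues = []
    src_n = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_n = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    if src_n != tgt_n:
        issues.append(f"row count mismatch: {src_n} vs {tgt_n}")
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {tgt} WHERE {key} IS NULL").fetchone()[0]
    if nulls:
        issues.append(f"{nulls} null key(s) in {tgt}")
    missing = conn.execute(
        f"SELECT COUNT(*) FROM {src} s LEFT JOIN {tgt} t USING ({key}) "
        f"WHERE t.{key} IS NULL").fetchone()[0]
    if missing:
        issues.append(f"{missing} key(s) missing from {tgt}")
    return issues

issues = validate(conn, "src_customer", "tgt_customer")
```

An empty `issues` list means the load reconciled; in practice such checks are parameterized per table and wired into the pipeline so defects surface automatically.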

Posted 4 days ago

Apply

12.0 - 20.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills and for allowing us the opportunity to tell you about TechnoGen; we believe your experience and expertise are relevant to the current openings with our clients.

About TechnoGen: TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a minority- and women-owned small business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.

LinkedIn: https://www.linkedin.com/company/technogeninc/about/

Job Title: Data Architect & Delivery Lead
Required Experience: 5+ years
Location: Hyderabad

JD summary:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- 12+ years of experience in data engineering, solution architecture, or enterprise data management roles.
- 5+ years of experience leading delivery of complex data integration or analytics solutions in a cloud environment (GCP, AWS, or Azure).
- 5+ years of hands-on experience in data modeling, database design, and architecture (dimensional, normalized, and canonical modeling).
- Expertise in cloud-native data architecture, particularly with Google Cloud Platform (BigQuery, Cloud Storage, Dataflow, Cloud Composer).
- Deep understanding of ETL/ELT frameworks, batch and streaming architectures, and orchestration tools (Airflow/Cloud Composer).
- Proven ability to design dimensional and canonical data models for enterprise analytics and reporting use cases.
- Hands-on experience with SQL, Python, and modern data engineering toolsets.
- Solid grasp of data governance, security, quality, and metadata management principles.
- Strong understanding of MDM solutions and data stewardship principles.
- Experience managing cross-functional delivery teams in Agile or hybrid delivery environments.
- Excellent communication and stakeholder management skills.
- Experience in Retail, Media, or Consumer Products industries is a plus.
- Working knowledge of machine learning/AI, LLMs, or agentic AI data system design is a plus.

Best Regards,
Syam.M | Sr. IT Recruiter
syambabu.m@technogenindia.com
www.technogenindia.com | Follow us on LinkedIn

Posted 4 days ago

Apply

7.0 - 11.0 years

0 Lacs

maharashtra

On-site

The Chief Information & Technology Officer (CITO) role in Lagos, Nigeria involves transforming the organization's technology, data strategy, and information security programs to align with strategic goals. You will lead a team of skilled professionals and foster relationships with internal business units and industry connections. Your responsibilities will include driving technology, data, and information security innovation, maintaining a strong vision, and overseeing daily operations to ensure alignment with financial goals.

Key Responsibilities:
- Design and drive technology, data, and information security innovation to enhance customer and employee experiences.
- Lead revenue generation, customer experience, digital operations, and innovation through technology.
- Develop and maintain a strategic roadmap for technology that supports the organization's goals.
- Identify appropriate technology platforms for product delivery and service improvement.
- Champion disruptive technologies for financial benefits and enhanced experiences.
- Optimize cloud technology and legacy systems.
- Identify innovative technology opportunities and solutions.
- Develop guidelines for technological innovation and cost-efficient solutions.
- Manage the enterprise-wide information security program.
- Stay updated on technology standards, industry trends, and emerging technologies.
- Lead disaster recovery and business continuity planning.
- Foster a culture of collaboration within the technology group.
- Build a competent team that embraces change and respects industry standards.
- Establish a standardized data architecture for organizational insights.
- Communicate technology strategy to key stakeholders.
- Ensure technology standards and best practices are maintained.

Experience & Education:
- Minimum of 7 years of experience in transforming customer technology/service experience.
- Experience in leading high-performing technology teams, preferably in financial services.
- Recent experience in strategic technology planning, data architecture, and information security.
- Prior experience in cyber security services, software, or systems.
- Bachelor's degree in Computer Science, IT Management, or a related field.

Skills & Qualifications:
- Exceptional strategic vision and analytical skills.
- Strong initiative and productivity.
- Collaborative interpersonal skills.
- Self-driven, results-oriented, and motivated.
- Hands-on leadership with team empowerment.
- Ability to present complex ideas clearly.
- Strong communication skills.
- Comfort with disruptive technologies.
- Customer service mindset.
- Business acumen to interpret strategies into technology needs.

To apply, send CVs to careers@worknigeria.com with the job title as the email subject.

Posted 4 days ago

Apply

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Senior Product Owner on a contract basis in a hybrid work environment in Chennai, your primary responsibility will be to serve as an agile product owner, ensuring that our software development projects are aligned with customer needs and deliver maximum value while adhering to agile scrum methodologies. You will drive analysis, validation, and verification to determine the necessary data to support business needs, including its creation, reading, updating, and deletion, along with the associated quality criteria. Your role will involve leading the coordination of efforts with staff, vendors, and customers to understand business requirements and design data architecture, solutions, and processes. You will also support the definition of roadmaps and portfolios of change that reflect business strategy and performance objectives. Additionally, you will lead the development of processes, including models such as conceptual, logical, and physical, and deliver customized reports and recommendations to support ongoing business decisions and customer reporting requirements. To excel in this role, you should have a Bachelor's Degree in a relevant discipline or area, along with a minimum of 7 years of work experience as a business analyst or project manager. Possessing Scrum Product Owner certification is preferred. You must demonstrate sound judgment, attention to detail, accuracy, and follow-through on actions, while also adapting flexibly to an ever-changing work environment. Effective communication of complex ideas in a clear and concise manner, both verbally and in writing, across functional and technical departments is essential. Being able to work on multiple tasks simultaneously, handling conflicting demands, prioritizing workloads, and effectively delegating tasks while maintaining high-quality standards is crucial for success in this role. 
Expertise in quickly grasping the functions and capabilities of new technologies, as well as strong stakeholder management skills to facilitate change delivery in a busy working environment with competing priorities, are also key requirements. Your high emotional intelligence and solid interpersonal and relationship-building skills will be instrumental in establishing strong relationships with teams across the business.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

The company Coupa, known for making companies operate smarter and grow faster through its leading AI-driven platform, is seeking an experienced BI Lead to join their team in Pune, India. As a BI Lead, you will play a crucial role in bridging the gap between business requirements and data analytics, ensuring the fulfillment of analytics requests using Tableau. The ideal candidate for this role will be advanced in Tableau, possess strong skills in building complex dashboards, and excel in administration tasks. Additionally, you will be expected to build strong relationships with business stakeholders and collaborate effectively with data modelers and data architects.

At Coupa, the core values are centered around customer success, focusing on results, and striving for excellence. The company is committed to ensuring customer success through innovation, delivering results with a bias for action, and maintaining a collaborative environment infused with professionalism, integrity, passion, and accountability.

Please note that Coupa does not accept inquiries or resumes from recruiters. By submitting your application, you acknowledge that Coupa collects and processes your personal data as per their Privacy Policy for managing recruitment activities. If you are successful in your application, your personal data will be used for employment purposes, and if not successful, you may be notified of future job opportunities. More details about data processing and retention can be found in Coupa's Privacy Policy.

Posted 4 days ago

Apply

0.0 - 6.0 years

14 - 19 Lacs

Hyderabad

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking Technical Team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. Drive significant business impact through your capabilities and contributions, and apply deep technical expertise and problem-solving methodologies to tackle a diverse array of challenges that span multiple technologies and applications.

Job responsibilities: Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Develops secure, high-quality production code, and reviews and debugs code written by others. Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems. Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture. Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies. Adds to the team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills: Formal training or certification on software engineering concepts and 5+ years of applied experience. Strong, demonstrated hands-on Python/Java enterprise web development across all tiers of the application (middleware, integration, and database), with proven experience in design patterns. Experience in design and architecture. Experience in AWS (EKS, EC2, S3, EventBridge, Step Functions, SNS/SQS, Lambda) is a must. Experience designing and developing scalable, high-performance applications using AWS-native event-driven services, including API Gateway. Experience with AWS cloud monitoring tools such as Datadog and CloudWatch is needed. Deep hands-on experience in Django, Flask, and object-oriented design and development. Experience with databases such as Amazon RDS, caching and performance tuning, REST APIs, and messaging (Kafka). Hands-on experience with development and test automation tools/frameworks (e.g., BDD and Cucumber). Experience with best practices for data pipeline design, data architecture, and processing of structured and unstructured data. Ability to plan, prioritize, and follow through on work and meet deadlines in a fast-paced environment, while clearly articulating both technical and non-technical issues to stakeholders and partners such as DevOps, architects, QA testers, and product owners.

Preferred qualifications, capabilities, and skills: Experience in microservices. Experience in the financial domain is preferred. Exposure to artificial intelligence, machine learning, and mobile. Exposure to agile methodologies such as CI/CD, application resiliency, and security. Hands-on practical experience in system design, application development, testing, and operational stability.
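The AWS-native event-driven pattern called out in this posting (EventBridge/SNS/SQS triggering Lambda) usually reduces to a handler that unpacks a batch of SQS records and reports failed message IDs back so only those messages are redelivered. A minimal sketch, assuming a hypothetical order payload (field names are illustrative, not from the posting):

```python
import json

def process_order(order_id, amount):
    # Stand-in for real business logic (e.g., writing to RDS or Kafka).
    if amount < 0:
        raise ValueError("negative amount")

def handler(event, context=None):
    """Lambda-style handler for an SQS-triggered batch.

    Each SQS record carries a JSON body; the IDs of records that fail
    are collected in the partial-batch-response shape so the queue
    redelivers only those messages, not the whole batch.
    """
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])  # JSONDecodeError is a ValueError
            process_order(body["order_id"], body["amount"])
        except (KeyError, ValueError):
            failures.append({"itemIdentifier": record.get("messageId")})
    return {"batchItemFailures": failures}
```

Returning `batchItemFailures` (rather than raising on the first bad record) is what lets a single poison message be retried or dead-lettered without reprocessing its healthy batch-mates.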

Posted 4 days ago

Apply

8.0 - 17.0 years

45 - 55 Lacs

Mumbai

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Data Platform Engineering Lead at JPMorgan Chase within Asset and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities: Lead the design, development, and implementation of scalable data pipelines and ETL batches using Python/PySpark on AWS. Execute standard software solutions, design, development, and technical troubleshooting. Use infrastructure as code to build applications that orchestrate and monitor data pipelines, create and manage on-demand compute resources on the cloud programmatically, and create frameworks to ingest and distribute data at scale. Manage and mentor a team of data engineers, providing guidance and support to ensure successful product delivery and support. Collaborate proactively with stakeholders, users, and technology teams to understand business and technical requirements and translate them into technical solutions. Optimize and maintain data infrastructure on the cloud platform, ensuring scalability, reliability, and performance. Implement data governance and best practices to ensure data quality and compliance with organizational standards. Monitor and troubleshoot applications and data pipelines, identifying and resolving issues in a timely manner. Stay up to date with emerging technologies and industry trends to drive innovation and continuous improvement. Add to the team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills: Formal training or certification on software engineering concepts and 5+ years of applied experience. Experience in software development and data engineering, with demonstrable hands-on experience in Python and PySpark. Proven experience with cloud platforms such as AWS, Azure, or Google Cloud. Good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts. Experience with, or good knowledge of, cloud-native ETL platforms such as Snowflake and/or Databricks. Experience with big data technologies and services such as AWS EMR, Redshift, Lambda, and S3. Proven experience with efficient cloud DevOps practices and CI/CD tools such as Jenkins/GitLab for data engineering platforms. Good knowledge of SQL and NoSQL databases, including performance tuning and optimization. Experience with declarative infrastructure provisioning tools such as Terraform, Ansible, or CloudFormation. Strong analytical skills to troubleshoot issues and optimize data processes, working independently and collaboratively. Experience in leading and managing a team/pod of engineers, with a proven track record of successful project delivery.

Preferred qualifications, capabilities, and skills: Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks is a plus. Familiarity with data visualization tools and data integration patterns.
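The pipeline duties this posting describes (ingest, transform, distribute at scale) follow the standard extract-transform-load shape. A minimal pure-Python sketch of that flow, standing in for a PySpark job (record shape and field names are illustrative assumptions, not from the posting):

```python
from typing import Iterable

def extract(rows: Iterable[dict]) -> list[dict]:
    # A real job would read from S3, Kafka, or a JDBC source;
    # here the rows are simply passed in.
    return list(rows)

def transform(rows: list[dict]) -> list[dict]:
    # Normalize: drop rows missing a region, uppercase the region code.
    return [
        {**r, "region": r["region"].upper()}
        for r in rows
        if r.get("region")
    ]

def load(rows: list[dict], sink: list) -> int:
    # Stand-in for a warehouse write; returns the row count written.
    sink.extend(rows)
    return len(rows)

def run_pipeline(source: Iterable[dict], sink: list) -> int:
    return load(transform(extract(source)), sink)
```

Keeping each stage a pure function over rows is the same discipline that makes a PySpark job testable: the transform logic can be unit-tested on plain data before it ever touches a cluster.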

Posted 4 days ago

Apply

6.0 - 12.0 years

14 - 18 Lacs

Chennai

Work from Office

Title: IT Product Owner

KBR is a global provider of differentiated, professional services and technologies delivered across a wide government, defense and industrial base. Drawing from its rich 100-year history and culture of innovation and mission focus, KBR creates sustainable value by combining engineering, technical and scientific expertise with its full life cycle capabilities to help our clients meet their most pressing challenges today and into the future. KBR employs approximately 29,000 people worldwide (including our joint ventures), with customers in more than 80 countries, and operations in 40 countries, across two synergistic global businesses. KBR is looking for an IT Software Development Product Owner to work in our Leatherhead office. DUTIES AND RESPONSIBILITIES: This role acts as a liaison between IT development groups and business units for the development and implementation of new systems and enhancement of current systems. Evaluates new applications, system requirements, developments in the field of expertise, and evolving business needs to recommend appropriate solutions and alternatives. Under general direction, uses specialized knowledge or skills to solve complex and unique problems, or direct the daily activities of a business, technical support, or functional support team. Establishes priorities for the completion of assigned tasks. Uses judgment to interpret internal and external issues and develop best practices. May direct resources, prioritize tasks, and provide guidance to less experienced team members. Relies upon experience, interpersonal skills, and broad knowledge of the field to ensure task completion in compliance with policies, procedures, and business strategy.
Serve as an agile product owner, ensuring our software development projects align with customer needs and deliver maximum value while adhering to agile scrum methodologies. Drive analysis, validation, and verification to determine what data is required to support business needs; where it is created, read, updated, and deleted; and the quality criteria associated with it. Lead the coordination of effort with staff, vendors, and customers to understand the business requirements and design the data architecture, solutions, and processes. Support the definition of road maps and portfolios of change that reflect business strategy and performance objectives. Lead the development of processes, including models (conceptual, logical, and physical). Deliver customized reports and recommendations to support ongoing business decisions and customer reporting requirements. Establish strong relationships with teams across the business. Required Education and Experience: Bachelor's degree in a relevant discipline or area with a relevant number of years' work experience as a Business Analyst or Project Manager. Scrum Product Owner certification preferred. Sound judgement, attention to detail, accuracy, and follow-through on actions, including the flexibility to perform in an ever-changing work environment. The ability to communicate complex ideas in a clear, concise manner (verbal and written) across functional and technical departments. The capacity to work on multiple tasks at any one time, handling conflicting demands, prioritizing workload, and delegating effectively while maintaining high quality standards. Ability to work across swim lanes and deliver results. Expertise in rapidly comprehending the functions and capabilities of new technologies. Stakeholder management skills to facilitate change delivery in a busy working environment with competing day-to-day priorities. High emotional intelligence and solid interpersonal and relationship-building skills.
KBR Company Information: When you become part of the KBR team, your opportunities are endless. Through collaboration with our customers, we're defining tomorrow's challenges, then providing the solutions and services to overcome those challenges, always maintaining our commitment to total safety and reliability. At KBR, we partner with government and industry clients to provide purposeful and comprehensive solutions with an emphasis on efficiency and safety. With a full portfolio of services, proprietary technologies and expertise, our employees are ready to handle projects and missions throughout their entire lifecycle, from planning and design to sustainability and maintenance. Whether at the bottom of the ocean or in outer space, our clients trust us to deliver the impossible on a daily basis. Working at KBR means being rewarded for your contributions. In addition to competitive benefits and professional development, our people are empowered to use all their potential, creating meaningful change for themselves and our clients. We attract the best minds in the world because our expertise thrives on creativity, resourcefulness and collaboration. That is how we supply our clients with cutting-edge solutions and services. As the needs of the world change, we're ready to respond and guide the way forward with strategic, sustainable, and technological advancements grounded in more than a century of practical application and execution.

Posted 4 days ago

Apply

10.0 - 17.0 years

12 - 17 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

POSITION OVERVIEW: We are seeking an experienced and highly skilled Data Engineer with deep expertise in Microsoft Fabric, MS-SQL, data warehouse architecture design, and SAP data integration. The ideal candidate will be responsible for designing, building, and optimizing data pipelines and architectures to support our enterprise data strategy. The candidate will work closely with cross-functional teams to ingest, transform, and make data (from SAP and other systems) available in our Microsoft Azure environment, enabling robust analytics and business intelligence. KEY ROLES & RESPONSIBILITIES: Spearhead the design, development, deployment, testing, and management of strategic data architecture, leveraging cutting-edge technology stacks in cloud, on-prem, and hybrid environments. Design and implement an end-to-end data architecture within Microsoft Fabric / SQL, including Azure Synapse Analytics (incl. data warehousing); this would also encompass a Data Mesh architecture. Develop and manage robust data pipelines to extract, load, and transform data from SAP systems (e.g., ECC, S/4HANA, BW). Perform data modeling and schema design for enterprise data warehouses in Microsoft Fabric. Ensure data quality, security, and compliance standards are met throughout the data lifecycle. Enforce data security measures, strategies, protocols, and technologies, ensuring adherence to security and compliance requirements. Collaborate with BI, analytics, and business teams to understand data requirements and deliver trusted datasets. Monitor and optimize the performance of data processes and infrastructure. Document technical solutions and develop reusable frameworks and tools for data ingestion and transformation.
Establish and maintain robust knowledge management structures, encompassing data architecture, data policies, platform usage policies, development rules, and more, ensuring adherence to best practices, regulatory compliance, and optimization across all data processes. Implement microservices, APIs, and event-driven architecture to enable agility and scalability. Create and maintain architectural documentation, diagrams, policies, standards, conventions, rules, and frameworks to enable effective knowledge sharing and handover. Monitor and optimize the performance, scalability, and reliability of the data architecture and pipelines. Track data consumption and usage patterns through automated, alert-driven tracking to ensure that infrastructure investment is effectively leveraged. KEY COMPETENCIES: Microsoft Certified: Fabric Analytics Engineer Associate, or an equivalent certificate for MS SQL. Prior experience working in cloud environments (Azure preferred). Understanding of SAP data structures and SAP integration tools such as SAP Data Services, SAP Landscape Transformation (SLT), or RFC/BAPI connectors. Experience with DevOps practices and version control (e.g., Git). Deep understanding of SAP architecture, data models, security principles, and platform best practices. Strong analytical skills with the ability to translate business needs into technical solutions. Experience with project coordination, vendor management, and Agile or hybrid project delivery methodologies. Excellent communication, stakeholder management, and documentation skills. Strong understanding of data warehouse architecture and dimensional modeling. Excellent problem-solving and communication skills. QUALIFICATIONS / EXPERIENCE / SKILLS: Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. Certifications such as SQL Administrator or Advanced Administrator are preferred. Expertise in data transformation using SQL, PySpark, and/or other ETL tools. Strong knowledge of data governance, security, and lineage in enterprise environments. Advanced knowledge of SQL, database procedures/packages, and dimensional modeling. Proficiency in Python and/or Data Analysis Expressions (DAX) (preferred, not mandatory). Familiarity with Power BI for downstream reporting (preferred, not mandatory). Experience: 10 years of experience as a Data Engineer or in a similar role. Skills: Hands-on experience with Microsoft SQL (MS-SQL) and Microsoft Fabric, including Synapse (data warehousing, notebooks, Spark). Experience integrating and extracting data from SAP systems, such as SAP ECC or S/4HANA, SAP BW, and SAP Core Data Services (CDS) Views or OData Services. Knowledge of data protection laws across countries (preferred, not mandatory).
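The dimensional modeling this posting asks for centers on fact tables that reference conformed dimensions through surrogate keys rather than source-system natural keys. A minimal sketch of the key-assignment and lookup step of a fact load (table and column names are hypothetical, chosen only for illustration):

```python
def build_dim(records, natural_key):
    """Assign a surrogate key to each unique dimension member."""
    dim = {}
    for rec in records:
        nk = rec[natural_key]
        if nk not in dim:
            dim[nk] = {"sk": len(dim) + 1, **rec}
    return dim

def load_fact(events, dim, natural_key):
    """Swap each event's natural key for the dimension's surrogate key."""
    facts = []
    for ev in events:
        member = dim.get(ev[natural_key])
        if member is None:
            continue  # unmatched rows would normally land in an error table
        facts.append({"customer_sk": member["sk"], "amount": ev["amount"]})
    return facts
```

Decoupling facts from natural keys is what lets a warehouse absorb source-system re-keying (an SAP migration, say) without rewriting history.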

Posted 4 days ago

Apply

5.0 - 10.0 years

35 - 40 Lacs

Bengaluru

Work from Office

As a Senior Data Engineer, you will proactively design and implement data solutions that support our business needs while adhering to data protection and privacy standards. You will also manage the technical delivery of the project, lead the overall development effort, and ensure timely, quality delivery. Responsibilities: Data Acquisition: Proactively design and implement processes for acquiring data from both internal systems and external data providers. Understand the various data types involved in the data lifecycle, including raw, curated, and lake data, to ensure effective data integration. SQL Development: Develop advanced SQL queries within database frameworks to produce semantic data layers that facilitate accurate reporting. This includes optimizing queries for performance and ensuring data quality. Linux Command Line: Utilize Linux command-line tools and functions, such as bash shell scripts, cron jobs, grep, and awk, to perform data processing tasks efficiently. This involves automating workflows and managing data pipelines. Data Protection: Ensure compliance with data protection and privacy requirements, including regulations like GDPR. This includes implementing best practices for data handling and maintaining the confidentiality of sensitive information. Documentation: Create and maintain clear documentation of designs and workflows using tools like Confluence and Visio. This ensures that stakeholders can easily communicate and understand technical specifications. API Integration and Data Formats: Collaborate with RESTful APIs and AWS services (such as S3, Glue, and Lambda) to facilitate seamless data integration and automation. Demonstrate proficiency in parsing and working with various data formats, including CSV and Parquet, to support diverse data processing needs. Key Requirements: 5+ years of experience as a Data Engineer, focusing on ETL development.
3+ years of experience in SQL and writing complex queries for data retrieval and manipulation. 3+ years of experience with the Linux command line and bash scripting. Familiarity with data modelling in analytical databases. Strong understanding of backend data structures, with experience collaborating with data engineers (Teradata, Databricks, AWS S3 Parquet/CSV). Experience with RESTful APIs and AWS services like S3, Glue, and Lambda. Experience using Confluence for tracking documentation. Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels. Ability to work independently and manage multiple tasks and priorities in a dynamic environment. Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Good to Have: Experience with Spark. Understanding of data visualization tools, particularly Tableau. Knowledge of data clean room techniques and integration methodologies.
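The CSV-parsing requirement in this posting is the kind of task Python's standard `csv` module handles directly. A minimal sketch of a grouped aggregation over CSV text (the column names are hypothetical examples, not from the posting):

```python
import csv
import io

def summarize_csv(text, group_col, value_col):
    """Sum value_col grouped by group_col over CSV text.

    DictReader uses the header row as keys, so columns are
    addressed by name rather than position.
    """
    totals = {}
    for row in csv.DictReader(io.StringIO(text)):
        key = row[group_col]
        totals[key] = totals.get(key, 0.0) + float(row[value_col])
    return totals
```

In a pipeline, the same function would consume a file handle from S3 or local disk instead of a `StringIO`; for Parquet, a library such as pyarrow would replace the `csv` module, but the grouping logic is unchanged.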

Posted 5 days ago

Apply