
1806 Data Architecture Jobs - Page 44

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Sr Data Engineer in the Digital & Data team, you will work hands-on to deliver and maintain the pipelines the business functions need to derive value from their data. You will bring data from a varied landscape of source systems into our cloud-based analytics stack and implement the necessary cleaning and pre-processing steps in close collaboration with our business customers (a short sketch of such a step follows this listing). You will also work closely with our teams to ensure that all data assets are governed according to the FAIR principles. To keep the engineering team scalable, you and your peers will create reusable components, libraries, and infrastructure that accelerate the delivery of future use cases. You will be part of a team dedicated to delivering state-of-the-art solutions for enabling data analytics use cases across the Healthcare sector of a leading global Science & Technology company. As such, you will have the unique opportunity to gain insight into our diverse business functions, allowing you to expand your skills in various technical, scientific, and business domains. Working in a project-based way covering a multitude of data domains and technology stacks, you will significantly develop your skills and experience as a Data Engineer.

Who you are:
• BE/M.Sc./PhD in Computer Science or a related field and 8+ years of work experience in a relevant capacity
• Experience working with cloud and big data environments such as Hadoop, AWS, GCP, and Azure
• Experience enforcing security controls and best practices to protect sensitive data within AWS data pipelines, including encryption, access controls, and auditing mechanisms
• Agile mindset, a spirit of initiative, and the desire to work hands-on together with your team
• Interest in solving challenging technical problems and developing the future data architecture that will enable the implementation of innovative data analytics use cases
• Experience leading small to medium-sized teams
• Experience creating architectures for ETL processes for both batch and streaming ingestion
• Knowledge of designing and validating software stacks for GxP-relevant contexts, as well as working with PII data
• Familiarity with the data domains covering the Pharma value chain (e.g. research, clinical, regulatory, manufacturing, supply chain, and commercial)
• Strong, hands-on experience working with Python, PySpark, and R codebases; proficiency in additional programming languages (e.g. C/C++, Rust, TypeScript, Java) is expected
• Experience working with Apache Spark and the Hadoop ecosystem, including heterogeneous compute environments and multi-platform setups
• Basic knowledge of Statistics and Machine Learning algorithms is favorable

This is the respective role description: The ability to easily find, access, and analyze data across an organization is key for every modern business to efficiently make decisions, optimize processes, and create new business models. The Data Architect plays a key role in unlocking this potential by defining and implementing a harmonized data architecture for Healthcare.
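For context, a minimal PySpark sketch of the kind of ingestion/cleaning step this role describes; the bucket paths, column names, and business key are hypothetical:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_clean").getOrCreate()

# Read raw source extracts from a landing zone
raw = spark.read.option("header", True).csv("s3://landing-zone/sales/*.csv")

clean = (
    raw.dropDuplicates(["order_id"])                        # dedupe on the business key
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize timestamps
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").isNotNull())                 # drop rows missing a mandatory field
)

# Publish as partitioned Parquet for the analytics stack
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/sales/")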

Posted 1 month ago

Apply

15.0 - 20.0 years

25 - 32 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Role & responsibilities: 15+ years' experience in handling projects ab initio. He/she must have strong technical experience with Microsoft technologies: .Net, MS-SQL Server, TFS, Windows Server, BizTalk, etc. The candidate should have strength in technology, domain, and application development, and possess the leadership qualities to lead a team of at least 40-50 professionals.

Responsibility Areas:
• Provide a leadership role in the areas of advanced data techniques, including data quality, data governance, data modeling, data access, data integration, data visualization, data discovery, database design, and implementation.
• Lead the overall strategy and roadmap for data architecture. Partner with the project organization, solution architecture, and engineering to ensure best use of standards for the key data use cases/patterns and tech standards.
• Analyze the Information Technology landscape to identify gaps and recommend improvements.
• Create and maintain the Enterprise Data Model at the conceptual, logical, and physical level. Act as steward of Enterprise Metadata Architecture & Standards and Data Lifecycle Management, including data quality, data conversion, and data security technologies.
• Define and achieve the strategy roadmap for the enterprise data, including data modeling, implementation, and data management for our enterprise data warehouse and advanced data analytics systems.
• Develop and document enterprise data standards and provide technical oversight on projects to ensure compliance through the adoption and promotion of industry standards/best-practice guiding principles aligned with Gartner, TOGAF, Forrester, and the like.
• Create architectural technology and business roadmaps that result in stronger business/IT alignment and drive adoption and usage of technology across the enterprise. Align the portfolio of projects to the roadmaps and reference architecture.
• Define and enforce architecture principles, standards, metrics, and policies.
• Provide leadership in the architecture, design, and build of complex applications and perform architectural design reviews.
• Manage the development of transition plans for moving from the current to the future state environment across the application portfolio.
• Collaborate with both IT and business to influence decisions in technology investments.
• Evaluate data models and physical databases for variances and discrepancies. Validate business data objects for accuracy and completeness.
• Analyze data-related system integration challenges and propose appropriate solutions.
• Support System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces. Support modifications to existing software to improve efficiency and performance.

Professional Qualification: B.Tech/BE/MCA/M.Tech/ME/PhD in Computer Science/Information Technology (IT) or related fields, or equivalent, with a consistently good academic record.

Preferred Professional Qualification/Certification: PMP or equivalent, CGEIT, ITIL (Foundation), PM tools, Microsoft certifications (MS-SQL, BizTalk, .Net).

Interested candidates, share your resume at parul@mounttalent.com / parul.s@mounttalent.com.

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Chennai

Work from Office

Years of Experience: 7+ years

Purpose:
• The candidate is responsible for designing, creating, deploying, and maintaining an organization's data architecture.
• To ensure that the organization's data assets are managed effectively and efficiently, and that they are used to support the organization's goals and objectives.
• Responsible for ensuring that the organization's data is secure, and that appropriate data governance policies and procedures are in place to protect the organization's data assets.

Key Responsibilities: Responsibilities will include but will not be restricted to:
• Designing and implementing a data architecture that supports the organization's business goals and objectives.
• Developing data models, defining data standards and guidelines, and establishing processes for data integration, migration, and management.
• Creating and maintaining data dictionaries: a comprehensive set of data definitions and metadata that provide context and understanding of the organization's data assets (a sketch follows this listing).
• Ensuring that data is accurate, consistent, and reliable across the organization, including establishing data quality metrics and monitoring data quality on an ongoing basis.
• Ensuring the organization's data is secure and that appropriate data governance policies and procedures are in place to protect the organization's data assets.
• Working closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the organization's data architecture is integrated and aligned with other IT systems and applications.
• Staying up to date with new technologies and trends in data management and architecture, and evaluating their potential impact on the organization's data architecture.
• Communicating with stakeholders across the organization to understand their data needs and ensure that the organization's data architecture is aligned with its strategic goals and objectives.

Technical requirements:
• Bachelor's or master's degree in Computer Science or a related field. Certificates in Database Management will be preferred.
• Expertise in data modeling and design, including conceptual, logical, and physical data models, and the ability to translate business requirements into data models.
• Proficiency in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes.
• Expertise in ETL processes, including data extraction, transformation, and loading, and the ability to design and implement data integration processes.
• Experience with data analysis and reporting tools and techniques, and the ability to design and implement data analysis and reporting processes.
• Familiarity with industry-standard data architecture frameworks, such as TOGAF and Zachman, and the ability to apply them to the organization's data architecture.
• Familiarity with cloud computing technologies, including public and private clouds, and the ability to design and implement data architectures that leverage cloud computing.

Qualitative Requirements:
• Able to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
• Strong analytical and problem-solving skills.
• Able to inspire and motivate the team to achieve organizational goals.

The following skills are good to have but not necessary: Databricks, Snowflake, Redshift, Data Mesh, Medallion, Lambda
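As a hedged illustration of the data dictionary responsibility above, this sketch bootstraps column-level metadata by introspecting a relational schema with SQLAlchemy; the connection URL and schema name are hypothetical:

from sqlalchemy import create_engine, inspect

engine = create_engine("postgresql://user:pass@host/warehouse")  # hypothetical DSN
inspector = inspect(engine)

dictionary = []
for table in inspector.get_table_names(schema="public"):
    for col in inspector.get_columns(table, schema="public"):
        dictionary.append({
            "table": table,
            "column": col["name"],
            "type": str(col["type"]),
            "nullable": col["nullable"],
            "comment": col.get("comment"),  # seed for the business definition field
        })

The resulting records can be loaded into a metadata repository and enriched with business definitions by data stewards.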

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Title: Senior Data Engineer

About The Role: As a Senior Data Engineer, you will play a key role in designing and implementing data solutions @Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities:
1. Data Architecture and Design
a. Design and develop scalable, high-performance data architecture and data models.
b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use cases.
b. Implement ETL processes to integrate data from various sources into data storage systems.
c. Optimise data pipelines for performance, scalability, and reliability:
i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
b. Design and enforce data management policies and standards.
c. Develop and maintain documentation, data dictionaries, and metadata repositories.
d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Models Deployment & Management (a plus)
a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
c. Optimise model performance and latency for real-time inference in consumer applications.
d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
a. Lead data engineering projects, providing technical guidance and expertise to team members, and conduct code reviews to ensure adherence to coding standards and best practices.
b. Mentor and coach junior data engineers, fostering their professional growth and development.
c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
d. Stay abreast of emerging technologies, trends, and best practices in data engineering, share knowledge within the team, and participate in the evaluation and selection of data engineering tools and technologies.

Qualifications:
1. 3-5 years' experience with a Bachelor's degree in Computer Science, Engineering, Technology, or a related field.
2. Good understanding of streaming technologies like Kafka and Spark Streaming (see the sketch below).
3. Experience with Enterprise Business Intelligence Platform/Data platform sizing, tuning, optimization, and system landscape integration in large-scale, enterprise deployments.
4. Proficiency in at least one programming language, preferably Java, Scala, or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming, with in-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs.
7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse), and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in multi-petabyte DW environments.
10. Experience engineering large-scale systems in a product environment.
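A minimal sketch of the Kafka-to-Spark-Streaming pattern named in the qualifications, assuming a hypothetical broker, topic, and event schema (the spark-sql-kafka connector package must be on the classpath):

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("txn_stream").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
        .option("subscribe", "transactions")                # hypothetical topic
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
)

query = (
    events.writeStream.format("parquet")
        .option("path", "/data/txn")
        .option("checkpointLocation", "/chk/txn")           # enables exactly-once recovery
        .trigger(processingTime="1 minute")
        .start()
)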

Posted 1 month ago

Apply

12.0 - 15.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Join us as a Solution Designer. Take on a varied role where you'll own the end-to-end high-level business design for a project, programme, or initiative. You'll be working with a range of stakeholders to identify investment priorities, define opportunities, and shape journeys to meet our strategic goals. This is a chance to shape the future of our business and gain great exposure across the bank in the process. We're offering this role at vice president level.

What you'll do: As a Solution Designer, you'll engage with relevant stakeholders as a single point of contact for design aspects. You'll be representing the design function at governance forums and working with enterprise architects to make sure standards and principles are adhered to. You'll also analyse requirements into coherent end-to-end designs, taking the business architecture into account. Other duties include:
Translating requirements into a series of transition-state designs and an executable roadmap
Partnering with technology and data teams to develop a data product roadmap that supports customer and reference data outcomes
Documenting the relevant design in accordance with standard methods
Designing systems and processes supporting data quality issue management across customer and reference data, optimising for data quality remediation where possible

The skills you'll need: You'll already have a background in solution design and a minimum of ten years' experience using industry-standard models and tools. Alongside good communication skills, you'll also need the ability to lead and collaborate with both internal and external teams. We'll also want to see:
Knowledge of cloud data practices and data architecture
A broad understanding of data lakehouse solutions, such as SageMaker, for implementing effective data management practices
Creative skills to design solutions that support the bank-wide simplification programme for customer and reference data

Hours: 45
Job Posting Closing Date: 01/07/2025

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office

We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key Responsibilities:
Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks Notebooks, Jobs, and Workflows (see the sketch below).
Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability.
Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data.
Implement distributed data processing solutions using Apache Spark/PySpark for large-scale data transformation.
Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured.
Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem.
Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure.
Conduct code reviews, define coding standards, and promote engineering excellence across the team.
Mentor and guide junior data engineers, fostering a culture of technical growth and innovation.

Requirements:
8+ years of experience in data engineering with proven leadership in managing data projects and teams.
Expertise in Python, SQL, Spark (PySpark),
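A hedged sketch of a Glue-style PySpark job of the kind referenced above; it only runs inside the AWS Glue job runtime, and the catalog database, table, and S3 path are hypothetical:

from awsglue.context import GlueContext
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
glue = GlueContext(sc)
spark = glue.spark_session

# Read from the Glue Data Catalog (hypothetical database/table)
src = glue.create_dynamic_frame.from_catalog(database="sales_db", table_name="raw_orders")

# Transform with plain Spark and write back to the lake, partitioned for downstream queries
df = src.toDF().filter("order_status = 'COMPLETE'")
df.write.mode("append").partitionBy("ingest_date").parquet("s3://lake/curated/orders/")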

Posted 1 month ago

Apply

12.0 - 17.0 years

15 - 20 Lacs

Gurugram, Bengaluru

Work from Office

Join us as a Solution Architect. This is an opportunity for an experienced Solution Architect to help us define the high-level technical architecture and design for your assigned scope, providing solutions that deliver great business outcomes and meet our longer-term strategy. You'll define and communicate a shared technical and architectural vision of end-to-end designs that may span multiple platforms and domains. Take on this exciting new challenge and hone your technical capabilities while advancing your career and building your network across the bank. We're offering this role at vice president level.

What you'll do: We'll look to you to influence and promote collaboration across platform and domain teams on solution delivery. Partnering with platform and domain teams, you'll elaborate the solution and its interfaces, validating technology assumptions, evaluating implementation alternatives, and creating the continuous delivery pipeline. You'll also provide analysis of options and deliver end-to-end solution designs using the relevant building blocks, as well as producing designs for features that allow frequent incremental delivery of customer value. On top of this, you'll be:
Owning the technical design issues and driving resolution through iteration of the technical solution design
Participating in activities to shape requirements, validating designs and prototypes to deliver change that aligns with the target architecture
Promoting adaptive design practices to drive collaboration of feature teams around a common technical vision using continuous feedback
Making recommendations on potential impacts to existing and prospective customers of the latest technology and customer trends

The skills you'll need: As a Solution Architect, you'll bring expert knowledge of application architecture, and of business data or infrastructure architecture, with working knowledge of industry architecture frameworks such as TOGAF or ArchiMate. You'll also need an understanding of Agile and contemporary methodologies, with at least 12 years of experience working in Agile teams. On top of this, you'll bring:
Experience of data engineering and designing solutions that involve complex data supply chains and platforms
A background in delivering solutions that securely span a complex infrastructure domain
Experience in AWS and diagramming using tools such as Draw.io and MS Visio
Knowledge and understanding of the key concepts in data management and data architecture
The ability to communicate complex technical concepts clearly to peers and leadership-level colleagues

Hours: 45
Job Posting Closing Date: 01/07/2025

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description: We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities: Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system. Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models. Ensure alignment of data models with Avaloq's object model and industry best practices. Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG). Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms). Provide expert input on data governance, metadata management, and model documentation. Contribute to change requests, upgrades, and data migration projects involving Avaloq. Collaborate with cross-functional teams to drive data consistency, reusability, and scalability. Review and validate existing data models, identifying gaps and optimisation opportunities. Ensure data models meet performance, security, and privacy requirements.

Skills (Must have): Proven experience (5+ years) in data modelling or data architecture, preferably within financial services. 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model. Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar). Proficient in SQL and data manipulation in Avaloq environments. Knowledge of banking products, client lifecycle data, and regulatory data requirements. Familiarity with data governance, data quality, and master data management concepts. Experience working in Agile or hybrid project delivery environments.

Nice to have: Exposure to Avaloq Scripting or parameterisation is a strong plus. Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms. Understanding of data privacy regulations (GDPR, FINMA, etc.). Certification in Avaloq or relevant financial data management domains is advantageous.

Other Languages: English: C1 Advanced
Location: Pune, Bangalore, Hyderabad, Chennai, Noida

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Should have 4+ years of MDM development and implementation experience: Reltio MDM development or any other MDM development (preferably Reltio), plus SQL. Must have core-skills experience with Informatica MDM: hands-on experience with Informatica MDM hub configurations, data modelling, data mappings, and data validation. Well-versed in best-practice-driven design and development, including match rule tuning, with a strong ability to understand, document, and communicate technical architectures, standards, and toolsets. Knowledge of creating and setting up security for applications. Providing data architecture solutions, interpreting business requirements, and converting them into technical requirements. Defining use cases and testing scenarios. Collaborating with source systems data stewards, system owners, and technical personnel for data governance.

Our Commitment to Diversity & Inclusion:
Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance.

Posted 1 month ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Pune

Work from Office

Job Title: Support Specialist - Eagle Platform (Portfolio Management)
Location: Riyadh, Saudi Arabia
Type: Full-time / Contract
Industry: Banking / Investment Management / FinTech
Experience Required: 5+ years

We are seeking a highly skilled Support Specialist with hands-on experience working on BNY Mellon's Eagle Investment Systems, particularly the Eagle STAR, PACE, and ACCESS modules used for portfolio accounting, data management, and performance reporting. The ideal candidate will have supported the platform in banking or asset management environments, preferably with experience at Bank of America, BNY Mellon, or institutions using Eagle for middle- and back-office operations.

Key Responsibilities:
Provide day-to-day technical and functional support for the Eagle Platform, including the STAR, PACE, and Performance modules
Troubleshoot and resolve user issues related to portfolio accounting, performance calculation, and reporting
Act as a liaison between business users and technical teams for change requests, data corrections, and custom reports
Monitor batch jobs, data feeds (security, pricing, transaction data), and system interfaces
Work closely with front-office, middle-office, and operations teams to ensure accurate data processing and reporting
Manage SLA-driven incident resolution and maintain support documentation
Support data migrations, upgrades, and new release rollouts of Eagle components
Engage in root cause analysis and implement preventive measures

Required Skills and Experience:
5+ years of experience in financial systems support, with a strong focus on Eagle Investment Systems
Strong knowledge of portfolio management processes, NAV calculations, and financial instruments (equities, fixed income, derivatives)
Prior work experience at Bank of America, BNY Mellon, or with asset managers using Eagle is highly preferred
Proficiency in SQL and ETL tools, and an understanding of data architecture in financial environments
Familiarity with upstream/downstream systems such as Bloomberg, Aladdin, or CRD is a plus
Strong analytical skills and attention to detail
Excellent communication skills in English (Arabic is a plus)

Preferred Qualifications:
Bachelor's degree in Computer Science, Finance, or a related field
ITIL Foundation or similar certification in service management
Prior experience working in a banking or asset management firm in the GCC is a bonus

Posted 1 month ago

Apply

12.0 - 17.0 years

10 - 13 Lacs

Pune

Work from Office

You will provide effective leadership by influence in a manner that is consistent with the Roche values and leadership commitments. You will seek to inspire and influence teams to create transformative solutions, ensuring that Roche products are recognised as being the best in the industry and maintain our #1 ranking in the future. Reporting to the Technology Architecture Subchapter Lead in the Architecture, Technology and Standards Office, you will primarily partner with the rest of Engineering across Roche to deliver customer-centric solutions.

You will be responsible for:
Technically leading a large-scale product portfolio or a specific clinical domain, defining the technology strategy to sustainably deliver a product line
Acting as a subject matter expert for your domain/area, mentoring and guiding the team
Developing and delivering new designs, including identifying and assessing technology options (build, buy, partner)
Owning the common architecture roadmap and onboarding, common assets oversight, Toolkit, and Integration APIs
Influencing and engaging internal customers across Roche and driving fast and consistent adoption of the reference architecture
Publicizing and developing a collaboration model for reference architecture adoption at all levels of the organization
Standardization, tools, and app development environments
Proactively taking on improvement initiatives and leading process improvements (Agile, QMS alignment)
Scaling strategically to take on multiple projects, aligning across the Roche portfolio, building and driving a technology roadmap, and managing cross-functional, internal (Roche), and external stakeholders

Your profile:
BS degree or equivalent in a directly related discipline (CS, Eng, etc.)
12+ years of previous software development/architecture experience
Have successfully built, deployed, and supported an enterprise-scale (web) application in the cloud, in a leadership role
Deep healthcare experience in at least one healthcare domain, with breadth across several
Experience in the role of an architect, leading a technical team, designing large software systems, and operating at the engineering management team level
Hands-on software development experience
Quick learner with the ability to understand complex workflows and develop and validate innovative solutions to solve difficult problems
Good communicator, able to talk with stakeholders and customers to explain technology, with the proven ability to take insights from customers and translate them into technical deliverables
Proven ability to establish and articulate a vision, set goals, develop and execute strategies, and track and measure results
Proven experience leading software teams through collaborative technological innovation in an agile environment, or continuous improvement efforts that have yielded tangible results and/or positive impact for patients or business stakeholders
Highly developed people-influencing skills; demonstrated success in establishing a high-performing environment, with an excellent reputation for attracting the best talent and the commitment to developing and inspiring them
Proven ability to create and sustain strong collaborative relationships and networks with diverse stakeholders across a complex global organization
Familiarity with technological trends and their relevance to the healthcare industry
A passionate and decisive business leader, demonstrating courage, vision, and drive to achieve results at the forefront of innovative technological change

Location: You will be based in Pune, India. At the Company's discretion, an exception to the location requirement could be made under extraordinary circumstances. As this is a global role, international business travel will be required depending on the business location of the successful candidate and ongoing project activities.

Roche is strongly committed to a diverse and inclusive workplace. We strive to build teams that represent a range of backgrounds, perspectives, and skills. Embracing diversity enables us to create a great place to work and to innovate for patients. Roche is an equal opportunity employer.

Posted 1 month ago

Apply

3.0 - 7.0 years

6 - 15 Lacs

Pune, Chennai

Hybrid

Company Description: Volante is on the leading edge of financial services technology. If you are interested in joining an innovative, fast-moving team that leverages the very best in cloud technology, our team may be right for you. By joining the product team at Volante, you will have an opportunity to shape the future of payments technology, with a focus on payment intelligence. We are a financial technology business that provides a market-leading, cloud-native payments processing platform to banks and financial institutions globally.

Education Criteria:
• B.E, MSc, M.E/MS in Computer Science or a similar major; relevant certification courses from a reputed organization
• Experience of 3+ years as a Data Engineer

Responsibilities:
• Design and develop scalable solutions and payment analytics, unlocking operational and business insight
• Own data modeling, building ETL pipelines, and enabling data-driven metrics
• Build and optimize data models for our application needs
• Design & develop data pipelines and workflows that integrate data sources (structured and unstructured data) across the payment landscape
• Assess the customer's data infrastructure landscape (payment ancillary systems including Sanctions, Fraud, AML) across cloud environments like AWS and Azure as well as on-prem, for deployment design
• Lead the enterprise application data architecture design, framework, and services, and identify and enable the services for the SaaS environment in Azure and AWS
• Implement customizations and data processing required to transform customer datasets for processing in our analytics framework/BI models
• Monitor data processing and machine learning workflows to ensure customer data is successfully processed by our BI models, debugging and resolving any issues faced along the way
• Optimize queries, warehouse, and data lake costs
• Review and provide feedback on the Data Architecture Design Document/HLD for our SaaS application
• Collaborate across teams to successfully integrate all aspects of the Volante PaaS solution
• Mentor the development team

Skills:
• 3+ years of data engineering experience: data collection, preprocessing, ETL processes, and analytics
• Proficiency in data engineering architecture, metadata management, analytics, reporting, and database administration
• Strong in SQL/NoSQL, Python, JSON, data warehousing/data lakes, orchestration, and analytical tools
• ETL or pipeline design and implementation for large data volumes
• Experience with data technologies and frameworks like Databricks, Synapse, Kafka, Spark, Elasticsearch
• Knowledge of SCD, CDC, and core data warehousing to develop cost-effective, secure data collection, storage, and distribution for a SaaS application (see the sketch after this listing)
• Experience in application deployment in AWS or Azure with containers and Kubernetes
• Strong problem-solving skills and a passion for building data at scale

Desirable Engineering Skills:
• Knowledge of data visualization tools like Tableau
• ETL orchestration tools like Airflow and visualization tools like Grafana
• Prior experience in the Banking or Payments domain

Location: India (Pune or Chennai)
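As referenced in the Skills list, a hedged sketch of an SCD Type 2 change-detection step in PySpark; the dimension path, snapshot path, and tracked column are hypothetical, and new-customer inserts are omitted for brevity:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_customer").getOrCreate()

current = spark.read.parquet("/dim/customer")       # versioned dimension: is_current, valid_from, valid_to
incoming = spark.read.parquet("/staging/customer")  # today's full snapshot

# Rows whose tracked attribute changed versus the current version
changed = (
    incoming.alias("n")
    .join(current.filter(F.col("is_current")).alias("c"), "customer_id")
    .filter(F.col("n.address") != F.col("c.address"))
)

# Close out the old version...
expired = (
    current.filter(F.col("is_current"))
    .join(changed.select("customer_id"), "customer_id", "left_semi")
    .withColumn("is_current", F.lit(False))
    .withColumn("valid_to", F.current_date())
)

# ...and open a new current version carrying the changed attributes
opened = (
    changed.select("n.*")
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
)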

Posted 1 month ago

Apply

9.0 - 13.0 years

32 - 40 Lacs

Ahmedabad

Remote

About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.

Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India

Key Responsibilities:
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with business objectives and the technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Required Skills:
Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, FiveTran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 - 1 Lacs

Pune

Work from Office

As Lead Data Engineer , you'll design and manage scalable ETL pipelines and clean, structured data flows for real-time retail analytics. You'll work closely with ML engineers and business teams to deliver high-quality, ML-ready datasets. Responsibilities: Develop and optimize large-scale ETL pipelines Design schema-aware data flows and dashboard-ready datasets Manage data pipelines on AWS (S3, Glue, Redshift) Work with transactional and retail data for real-time insights
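A hedged sketch of the S3-to-Redshift loading step such pipelines typically end with; the cluster endpoint, credentials, table, and IAM role ARN are hypothetical:

import psycopg2

conn = psycopg2.connect(host="redshift-cluster.example.com", port=5439,
                        dbname="analytics", user="etl_user", password="...")
with conn, conn.cursor() as cur:
    # Bulk-load curated Parquet from S3 into a dashboard-ready fact table
    cur.execute("""
        COPY retail.orders_fact
        FROM 's3://curated-zone/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS PARQUET
    """)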

Posted 1 month ago

Apply

6.0 - 7.0 years

22 - 25 Lacs

Pune

Remote

Engage stakeholders, architect data products, guide dev teams, build and own ETL pipelines (PULSE, Snowflake DV2), ensure data quality/governance, and provide agile leadership. Passport mandatory. Contact: 9063478484 / v.aparna@tekgenieservices.com

Posted 1 month ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Ahmedabad, Gurugram

Work from Office

Hi, wishes from GSN! Pleasure connecting with you.

About the job: This is a GCP Data Engineer opportunity with a leading bootstrapped product company, a valued client of GSN HR.

Job Title: GCP Data Engineer
Experience: 4+ Years
Work Location: Ahmedabad
Work Mode: WFO - 5 days in office
Work Timing: General
CTC Range: 20 LPA to 25 LPA

Job Summary: We are seeking a GCP Data Engineer professional to join a high-impact QA team working on mission-critical banking systems.

Key Responsibilities:
Proficiency in Python for data processing and scripting.
Strong SQL knowledge and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
Understanding of data modelling, data warehousing, and data architecture.
Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Proficiency in working with GCP, especially BigQuery and GCS (see the sketch below).
Version control skills using Git.

If interested, click Apply now for an IMMEDIATE response.

Best,
DIVYA
GSN HR | divya@gsnhr.net | 9994042152 | Google review: https://g.co/kgs/UAsF9W
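For context on the BigQuery/GCS skills listed above, a minimal sketch using the official Python client; the project, dataset, table, and bucket names are hypothetical:

from google.cloud import bigquery

client = bigquery.Client(project="bank-data-dev")  # hypothetical project

# Load curated Parquet files from GCS into a BigQuery table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.load_table_from_uri(
    "gs://landing-bucket/transactions/*.parquet",
    "bank-data-dev.risk.transactions",
    job_config=job_config,
).result()  # block until the load job finishes

# Sanity-check the load with a quick query
rows = client.query("SELECT COUNT(*) AS n FROM `bank-data-dev.risk.transactions`").result()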

Posted 1 month ago

Apply

5.0 - 10.0 years

19 - 30 Lacs

Hyderabad

Work from Office

For Data Engineer: 3-5 years of experience; number of openings: 2.
For Sr. Data Engineer: 6-10 years of experience; number of openings: 2.

About Us: Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, which leads to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep, big four consulting experience in business transformation and efficient processes, Logic Pursuits is a game-changer in any operations strategy.

Job Description: We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities:
Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files (see the sketch after this listing).
Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.
Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Required Qualifications:

Data Engineering Skills:
3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
Capable of building and maintaining ELT pipelines using dbt and Snowflake with guidance on architecture and best practices.
Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/dimensional).
Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
Experience using Git for version control and exposure to CI/CD workflows in team environments.
Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
Working knowledge of Python for basic automation and data manipulation tasks.
Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices such as GDPR.

Data Quality & Documentation:
Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).
Awareness of standard data validation and monitoring techniques for reliable pipeline development.

Soft Skills & Collaboration:
Strong problem-solving skills and the ability to debug SQL and transformation logic effectively.
Able to document work clearly and communicate technical solutions to a cross-functional team.
Experience working in Agile settings, participating in sprints, and handling shifting priorities.
Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.
High attention to detail, a proactive attitude, and adaptability in dynamic project environments.

Nice to Have:
Experience working in client-facing or consulting roles.
Exposure to AI/ML data pipelines or tools like feature stores and MLflow.
Familiarity with enterprise-grade data quality tools.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Additional Information - Why Join Us?
Opportunity to work on diverse and challenging projects in a consulting environment.
Collaborative work culture that values innovation and curiosity.
Access to cutting-edge technologies and a focus on professional development.
Competitive compensation and benefits package.
Be part of a dynamic team delivering impactful data solutions.

Required Qualification: Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
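A hedged illustration of the Snowflake side of the ELT stack described above (dbt generates and runs SQL of this shape on your behalf); the account, credentials, warehouse, and table names are hypothetical:

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # hypothetical account locator
    user="ETL_USER", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Upsert from the staging layer into a mart-layer dimension
cur.execute("""
    MERGE INTO MART.DIM_CUSTOMER t
    USING STAGING.STG_CUSTOMER s ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN INSERT (customer_id, email) VALUES (s.customer_id, s.email)
""")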

Posted 1 month ago

Apply

12.0 - 20.0 years

35 - 50 Lacs

Bengaluru

Hybrid

Data Architect with cloud expertise: Data Architecture, Data Integration & Data Engineering. ETL/ELT: Talend, Informatica, Apache NiFi. Big Data: Hadoop, Spark. Cloud platforms: AWS, Azure, GCP; warehouses: Redshift, BigQuery. Languages: Python, SQL, Scala. Compliance: GDPR, CCPA.

Posted 1 month ago

Apply

10.0 - 17.0 years

50 - 75 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role: Presales Senior Cloud Data Architect (with Data Warehousing Experience)
Employment Type: Full-Time

Professional Summary: Onix is seeking an experienced Presales Senior Cloud Data Architect with a strong background in data warehousing and cloud platforms to play a pivotal role in the presales lifecycle and solution design process. This position is key to architecting scalable, secure, and cost-efficient data solutions that align with client business objectives. The ideal candidate will have deep expertise in data architecture, modeling, and cloud data platforms such as AWS and GCP, combined with the ability to lead and influence during the presales engagement phase.

Scope / Level of Decision Making: This is an exempt position operating under limited supervision, with a high degree of autonomy in presales technical solutioning, client engagement, and proposal development. Complex decisions are escalated to the manager as necessary.

Primary Responsibilities:

Presales & Solutioning Responsibilities:
Engage early in the sales cycle to understand client requirements, gather technical objectives, and identify challenges and opportunities.
Partner with sales executives to develop presales strategies, define technical win themes, and align proposed solutions with client needs.
Lead the technical discovery process, including stakeholder interviews, requirement elicitation, gap analysis, and risk identification.
Design comprehensive cloud data architecture solutions, ensuring alignment with business goals and technical requirements.
Develop Proofs of Concept (PoCs), technical demos, and architecture diagrams to validate proposed solutions and build client confidence.
Prepare and deliver technical presentations, RFP responses, and detailed proposals for client stakeholders, including C-level executives.
Collaborate with internal teams (sales, product, delivery) to scope solutions, define SOWs, and transition engagements to the implementation team.
Drive technical workshops and architecture review sessions with clients to ensure stakeholder alignment.

Cloud Data Architecture Responsibilities:
Deliver scalable and secure end-to-end cloud data solutions across AWS, GCP, and hybrid environments.
Design and implement data warehouse architectures, data lakes, ETL/ELT pipelines, and real-time data streaming solutions.
Provide technical leadership and guidance across multiple client engagements and industries.
Leverage AI/ML capabilities to support data intelligence, automation, and decision-making frameworks.
Apply cost optimization strategies, cloud-native tools, and best practices for performance tuning and governance.

Qualifications:

Required Skills & Experience:
8+ years of experience in data architecture, data modeling, and data management.
Strong expertise in cloud-based data platforms (AWS/GCP), including data warehousing and big data tools.
Proficiency in SQL, Python, and at least one additional programming language (Java, C++, Scala, etc.).
Knowledge of ETL/ELT pipelines, CI/CD, and automated delivery systems.
Familiarity with NoSQL and SQL databases (e.g., PostgreSQL, MongoDB).
Excellent presentation, communication, and interpersonal skills, especially in client-facing environments.
Proven success working with C-level executives and key stakeholders.
Experience with data governance, compliance, and security in cloud environments.
Strong problem-solving and analytical skills.
Ability to manage multiple initiatives and meet tight deadlines in a fast-paced setting.

Education: Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience required).

Travel Expectation: Up to 15% for client engagements and technical workshops.

Posted 1 month ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Pune

Work from Office

Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office.

Qualifications: B.E./B.Tech in Computer Science, IT, or a related discipline; MCS/MCA or equivalent preferred.

Key Responsibilities:
Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub
Define standards and best practices for data ingestion, transformation, and storage
Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines
Lead Snowflake environment setup, configuration, performance tuning, and optimization
Integrate Azure Data Services with Snowflake to support diverse business use cases
Implement governance, metadata management, and security policies
Mentor junior developers and data engineers on cloud data technologies and best practices

Experience and Skills Required:
5-9 years of overall experience in data architecture or data engineering roles
Strong, hands-on expertise in Snowflake, including design, development, and performance tuning
Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.)
Understanding of cloud data integration techniques and ELT/ETL frameworks
Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory
Proven ability to handle structured, semi-structured, and unstructured data
Strong analytical, problem-solving, and communication skills

Nice to Have:
Certifications in Snowflake and/or Microsoft Azure
Experience with CI/CD tools like GitHub for code versioning and deployment
Familiarity with real-time or near-real-time data ingestion

Why Join Diacto Technologies:
Work with a cutting-edge tech stack and cloud-native architectures
Be part of a data-driven culture with opportunities for continuous learning
Collaborate with industry experts and build transformative data solutions

Posted 1 month ago

Apply

5.0 - 9.0 years

14 - 17 Lacs

Pune

Work from Office

Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications: B.E./B.Tech in Computer Science, IT, or a related discipline; MCS/MCA or equivalent preferred.

Key Responsibilities: Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions. Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery. Define and implement best practices for data modeling, integration, governance, and security. Collaborate with engineering and analytics teams to ensure data solutions meet business needs. Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control. Troubleshoot data issues and ensure system performance, reliability, and scalability. Guide and mentor junior data engineers and developers.

Posted 1 month ago

Apply

5.0 - 8.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Total Experience: 5-8 years, with 4+ years of relevant experience.

Skills: Proficiency on the Databricks platform; strong hands-on experience with PySpark, SQL, and Python; any cloud (Azure, AWS, GCP).

Certifications (any of the following): Databricks Certified Associate Developer for Apache Spark 3.0 (preferred), Databricks Certified Data Engineer Associate, Databricks Certified Data Engineer Professional.

Mandatory Skill Sets: Databricks, PySpark, SQL, Python, any cloud (Azure, AWS, GCP).

Preferred Skill Sets: Related certification, such as Databricks Certified Associate Developer for Apache Spark 3.0 (preferred), Databricks Certified Data Engineer Associate, or Databricks Certified Data Engineer Professional.

Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering.

Required Skills: Microsoft Azure. Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Mumbai

Work from Office

Looking for a savvy Data Engineer to join team of Modeling / Architect experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as we'll as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company s data architecture to support our next generation of products and data initiatives.This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week. In this role, you will assist in maintaining the MDLZ DataHub Google BigQuery data pipelines and corresponding platforms (on-prem and cloud), working closely with global teams on DataOps initiatives. The D4GV platform spans across three key GCP instances: NALA, MEU, and AMEA, supporting the global rollout of o9 across all Mondel z BUs over the next three years 5+ years of overall industry experience and minimum of 2-4 years of experience building and deploying large scale data processing pipelines in a production environment Focus on excellence: Has practical experience of Data-Driven Approaches, Is familiar with the application of Data Security strategy, Is familiar with we'll know data engineering tools and platforms Technical depth and breadth : Able to build and operate Data Pipelines, Build and operate Data Storage, Has worked on big data architecture within Distributed Systems. Is familiar with Infrastructure definition and automation in this context. Is aware of adjacent technologies to the ones they have worked on. Can speak to the alternative tech choices to that made on their projects. Implementation and automation of Internal data extraction from SAP BW / HANA Implementation and automation of External data extraction from openly available internet data sources via APIs Data cleaning, curation and enrichment by using Alteryx, SQL, Python, R, PySpark, SparkR Data ingestion and management in Hadoop / Hive Preparing consolidated DataMart for use by Data Scientists and managing SQL Databases Exposing data via Alteryx, SQL Database for consumption in Tableau Data documentation maintenance/update Collaboration and workflow using a version control system (eg, Git Hub) Learning ability : Is self-reflective, Has a hunger to improve, Has a keen interest to drive their own learning. Applies theoretical knowledge to practice Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. 
Flexible Working Hours: This role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems.

Skills and Experience
Deep knowledge of manipulating, processing, and extracting value from datasets; support the day-to-day operations of these GCP-based data pipelines, ensuring data governance, reliability, and performance optimization.
Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred (a brief BigQuery loading sketch follows this posting).
5+ years of experience in data engineering, business intelligence, data science, or a related field.
Proficiency with programming languages: SQL, Python, R, Spark, PySpark, and SparkR for data processing.
Strong project management skills and the ability to plan and prioritize work in a fast-paced environment.
Experience with MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, and Tableau.
Ability to think creatively; highly driven and self-motivated.
Knowledge of SAP BW for HANA (Extractors, Transformations, Modeling aDSOs, Queries, OpenHubs).
No relocation support available.
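As a concrete illustration of the BigQuery pipeline maintenance described above, a routine load might look like the sketch below, using the google-cloud-bigquery client; the project, dataset, table, and file names are placeholders, not details from the posting.

# Hedged sketch: loading a cleaned extract into BigQuery.
# Project, dataset, table, and file names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

with open("daily_extract.csv", "rb") as f:
    load_job = client.load_table_from_file(
        f, "my-gcp-project.sales.daily_extract", job_config=job_config
    )
load_job.result()  # block until the load completes

table = client.get_table("my-gcp-project.sales.daily_extract")
print(f"Loaded {table.num_rows} rows.")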

Posted 1 month ago

Apply

14.0 - 20.0 years

35 - 45 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Data Architect

Responsibilities:
• Design and implement enterprise data models, ensuring data integrity, consistency, and scalability.
• Analyse business needs and translate them into technical requirements for data storage, processing, and access.
• In-memory cache: optimizes query performance by storing frequently accessed data in memory.
• Query engine: processes and executes complex data queries efficiently.
• Business Rules Engine (BRE): enforces data access control and compliance with business rules (a minimal sketch of how these three layers interact follows the qualifications below).
• Select and implement appropriate data management technologies, including databases and data warehouses.
• Collaborate with data engineers, developers, and analysts to ensure seamless integration of data across various systems.
• Monitor and optimize data infrastructure performance, identifying and resolving bottlenecks.
• Stay up to date on emerging data technologies and trends, recommending and implementing solutions.
• Document data architecture and processes for clear communication and knowledge sharing, including the integration.

Qualifications:
• Proven experience in designing and implementing enterprise data models.
• Expertise in SQL and relational databases (e.g., Oracle, MySQL, PostgreSQL).
• Experience with cloud-based data platforms (e.g., AWS, Azure, GCP) is mandatory.
• Working experience with ETL tools and data ingestion leveraging real-time solutions (e.g., Kafka, streaming) is required.
• Strong understanding of data warehousing concepts and technologies.
• Familiarity with data governance principles and best practices.
• Excellent communication, collaboration, and problem-solving skills.
• Ability to work independently and as part of a team.
• Strong analytical and critical thinking skills.
• Experience with data visualization and UI development is a plus.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
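To make the cache / query engine / rules engine layering above concrete, here is a minimal Python sketch of how the three components might interact; every name, rule, and table here is an illustrative assumption, not part of the role description.

# Illustrative sketch: an in-memory cache in front of a query engine,
# gated by a simple business-rules check. All names are hypothetical.
from functools import lru_cache

ALLOWED_ROLES = {"analyst", "architect"}  # toy business rule

def check_access(role: str, table: str) -> None:
    """Business Rules Engine stand-in: reject callers without read access."""
    if role not in ALLOWED_ROLES:
        raise PermissionError(f"role {role!r} may not read {table!r}")

@lru_cache(maxsize=1024)
def run_query(table: str, filter_value: str) -> tuple:
    """Query engine stand-in; lru_cache plays the in-memory cache."""
    # A real engine would push this down to the warehouse; we fake a result.
    return (table, filter_value, "rows...")

def query(role: str, table: str, filter_value: str) -> tuple:
    check_access(role, table)               # 1. rules engine gates the request
    return run_query(table, filter_value)   # 2. cache hit or engine execution

print(query("analyst", "orders", "2024-01-01"))  # executes and caches
print(query("analyst", "orders", "2024-01-01"))  # served from the cache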

Posted 1 month ago

Apply

10.0 - 15.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Position: Integration Architect

Job Description:

What you will be doing: The Enterprise Architect contributes to enterprise architecture by ensuring that common architecture decisions are implemented consistently across business and IT in order to support the business and IT strategy. They will also research, analyze, design, propose, and deliver IT architecture solutions that are optimal for the business and IT strategies in one or more domains. The Enterprise Architect provides integrated systems analysis and recommends appropriate hardware, software, and communication links required to support IT goals and strategy. The Enterprise Architect is responsible for analyzing and, where appropriate, incorporating industry trends, and for being familiar with enterprise standards and methodology. The role also requires familiarity with the enterprise infrastructure and applications, and specialization in a particular domain.

Contribute to the definition of conceptual and logical architecture specifications (e.g., data architecture, application architecture, technical architecture) for the enterprise. Interact with business strategists, comprehend design impact to systems, and develop measurable business cases that support the architecture. Serve as an architecture conduit for external requests and queries, and provide education on enterprise architecture directions and goals. Manage organizational impacts of architecture. Work with other architects on enterprise architectural efforts, utilizing cross-functional knowledge (strategy, change management, and business process management). Analyze IT industry and market trends to determine their potential impact on the enterprise, and identify and analyze enterprise business drivers to derive enterprise business, information, technical, and solution architecture requirements. Implement the overall architecture approach for all layers of a solution. Ensure that enterprise architecture standards, policies, and procedures are enacted uniformly across application development projects and programs.

What we are looking for:

Experience / Education: Typically requires a minimum of 10 years of related experience with a 4-year degree; or 8 years and an advanced degree; or equivalent experience.

Arrow Electronics, Inc. (NYSE: ARW) is an award-winning Fortune 133 company and one of Fortune Magazine's Most Admired Companies. Arrow guides innovation forward for over 220,000 leading technology manufacturers and service providers. With 2023 sales of USD $33.11 billion, Arrow develops technology solutions that improve business and daily life. Our broad portfolio, which spans the entire technology landscape, helps customers create, make, and manage forward-thinking products that make the benefits of technology accessible to as many people as possible. Learn more at www.arrow.com.

Our strategic direction of guiding innovation forward is expressed as Five Years Out, a way of thinking about the tangible future to bridge the gap between what's possible and the practical technologies to make it happen. Learn more at https://www.fiveyearsout.com/.

Location: IN-KA-Bangalore, India (SKAV Seethalakshmi) GESC

Time Type: Full time

Job Category: Information Technology

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
