
1779 Data Architecture Jobs - Page 25

JobPe aggregates results for easy application access, but you apply directly on the job portal.

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

You are a Data Solution Lead with over 10 years of experience in Data Governance, Data Modeling, Data Architecture, and Data Lineage, particularly within the BFSI sector. Your primary responsibilities include collaborating with business stakeholders to gather and analyze data requirements, designing enterprise data models, ensuring seamless data integration, and implementing data governance policies, metadata management, and data lineage tracking. You will also develop data catalogs, business glossaries, and data dictionaries, and improve data quality, compliance, and automation in data processes.

To excel in this role, you must have expertise in Data Governance, Data Modeling, and Architecture, along with strong SQL and data migration experience. Knowledge of the BFSI domain is preferred. Excellent stakeholder management and communication skills are crucial for effective collaboration, and the ability to automate data processes will enhance efficiency within the organization.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You should have experience understanding and translating data, analytics requirements, and functional needs into technical requirements while collaborating with global customers. Your responsibilities will include designing cloud-native data architectures to support scalable, real-time, and batch processing, and building and maintaining data pipelines for large-scale data management in alignment with data strategy and processing standards. You will define strategies for data modeling, data integration, and metadata management, and implement data governance frameworks encompassing data quality, lineage, and cataloging.

Strong experience in database, data warehouse, and data lake design and architecture is required, along with proficiency in leveraging cloud platforms such as AWS, Azure, or GCP for data storage, compute, and analytics services. Experience in database programming using various SQL flavors is essential, and familiarity with the Big Data ecosystem, whether on-premises (Hortonworks/MapR) or in the cloud, is required. Hands-on experience with tools such as Spark Streaming, Kafka, Databricks, and Snowflake is necessary, and proficiency in an orchestration tool like Airflow or Oozie for scheduling pipelines is preferred.

You will collaborate with cross-functional teams, including business analysts, data engineers, and DevOps teams, evaluate emerging cloud technologies, and suggest enhancements to the data architecture. You should be adept at working in an Agile/Scrum development process and at optimizing data systems for cost efficiency, performance, and scalability.
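For candidates brushing up, the pipeline-building and data-quality duties described above reduce to the familiar extract-transform-load pattern. A minimal, self-contained sketch in Python — the table, column, and record names here are illustrative assumptions, not taken from the listing:

```python
import sqlite3

def run_batch_pipeline(records):
    """Tiny ETL sketch: extract raw rows, validate and transform them,
    then load the surviving rows into a SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
    # Transform + data-quality gate: normalize emails and drop rows that
    # fail a simple rule (email must be non-empty).
    clean = [(r["id"], r["email"].strip().lower()) for r in records if r.get("email")]
    conn.executemany("INSERT INTO customers VALUES (?, ?)", clean)
    conn.commit()
    return conn

conn = run_batch_pipeline([
    {"id": 1, "email": " A@X.COM "},
    {"id": 2, "email": ""},          # fails the quality rule, filtered out
    {"id": 3, "email": "b@y.com"},
])
rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
```

In production, a step like this would typically run as one task inside an orchestrator such as Airflow, with extraction, validation, and loading split into separately retryable stages.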

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience in designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will be responsible for designing efficient data architecture diagrams, developing and maintaining data models, and ensuring data integrity, quality, and security. You will also work on data processing, data integration, and building data pipelines to support various business needs. Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, as well as mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities. We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focusing on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics is required. Strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential. Excellent problem-solving, collaboration, and communication skills are also important for this role. As part of our team, you will have the opportunity to work on exciting projects across various industries like High-Tech, communication, media, healthcare, retail, and telecom. We offer a collaborative environment where you can expand your skills by working with a diverse team of talented individuals. 
GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees. Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.

Posted 3 weeks ago

Apply

6.0 - 14.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You should have 6 to 14 years of solid experience in Data Modeling/ER Modeling, along with knowledge of relational databases and data architecture, including SQL. A bachelor's degree in computer science, data science, information technology, or a related field is preferred. In this role, you will be expected to have a good understanding of ER modeling, big data, enterprise data, and physical data models. Experience with data modeling software such as SAP PowerDesigner, Microsoft Visio, or Erwin Data Modeler would be beneficial. The job location is Coimbatore, and a walk-in interview is scheduled on 12th April. If you meet the requirements and are passionate about data modeling and ER modeling, we encourage you to apply for this exciting opportunity.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

18 - 30 Lacs

Mumbai

Work from Office

Hello Connections, greetings from Teamware Solutions! We are hiring for a top investment bank.

Position: Data Analyst
Location: Mumbai
Experience Range: 3 to 6 years
Notice Period: Immediate to 30 days
Must-have skills: data analysis, data catalog, data analytics, and data governance, along with Collibra.

What You'll Do - As part of the Data & Analytics Group (DAG) and reporting to the Head of the India DAG locally, the individual is responsible for the following:
1. Review, analyze, and resolve data quality issues across IM Data Architecture.
2. Coordinate with data owners and other teams to identify root causes of data quality issues and implement solutions.
3. Coordinate the onboarding of data from various internal/external sources into the central repository.
4. Work closely with Data Owners/Owner delegates on data analysis and the development of data quality (DQ) rules. Work with IT on enhancing DQ controls.
5. Perform end-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance.
6. Manage the change control process and participate in user acceptance testing (UAT) activities.

What We're Looking For:
1. Minimum 3-6 years of experience in data analysis, data cataloging, and Collibra.
2. Experience in data analysis and profiling using SQL is a must.
3. Knowledge of coding; Python is a plus.
4. Experience working with cataloging tools like Collibra.
5. Experience working with BI reporting tools like Tableau or Power BI is preferred.

Preferred Qualifications:
1. Bachelor's degree required; any other relevant academic course a plus.
2. Fluent in English.

Apply now: francy.s@twsol.com
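The "data analysis and profiling using SQL" requirement above usually means computing row counts, null rates, distinct counts, and duplicate keys per column, which become the raw inputs for DQ rules. A minimal sketch using SQLite via Python — the trades table and its columns are hypothetical, chosen only to illustrate:

```python
import sqlite3

# Build a tiny table with deliberate quality defects: a NULL value
# and a duplicate key on trade_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, isin TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(1, "US0001", 100.0), (2, None, 250.0), (3, "US0001", None), (3, "US0002", 50.0)],
)

# Profile one column: total rows, NULL count, and distinct values.
row_count, null_isin, distinct_isin = conn.execute(
    "SELECT COUNT(*), SUM(isin IS NULL), COUNT(DISTINCT isin) FROM trades"
).fetchone()

# Duplicate-key check, a common DQ rule on an expected primary key.
dupes = conn.execute(
    "SELECT trade_id FROM trades GROUP BY trade_id HAVING COUNT(*) > 1"
).fetchall()
```

The same aggregate queries translate directly to an enterprise warehouse or a catalog tool's rule engine; a DQ rule is then just a threshold over these numbers (e.g., a null rate below an agreed limit).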

Posted 3 weeks ago

Apply

5.0 - 10.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Minimum qualifications: Bachelor's degree or equivalent practical experience. 5 years of experience with software development in one or more programming languages, and with data structures/algorithms. 3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture. 3 years of experience with front-end and user interface development.

Preferred qualifications: Master's degree or PhD in Computer Science or a related technical field. 1 year of experience in a technical leadership role. Experience in developing accessible technologies.

About the job: Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design, and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs, with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities, and be enthusiastic to take on new problems across the full stack as we continue to push technology forward. At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities: The Data and AAML team under Enterprise Data and Engineering is responsible for the Data Architecture, Data Engineering and Analytics, and AI and ML technologies that empower our partners in engineering and business to deliver critical corporate functions. Deliver solutions to meet the data, reporting, and analytics needs of Googlers. Drive high-impact projects to deliver data management and investigative solutions for our partners across Google. Create and maintain logical and physical database designs. Ensure the integrity of data under the purview of the projects, including establishing security procedures to protect and maintain the highest level of confidentiality and data security.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

16 - 20 Lacs

Bengaluru

Work from Office

The Associate Business Architect should be able to provide broad and deep technical leadership and cohesive architecture approaches for strategic efforts, with technical depth in customer-facing systems from an application, infrastructure, operations, scalability, availability, performance, people, process, and tooling perspective. The Associate Business Architect is responsible for analysis and design activities on cornerstone/enterprise or large projects as part of Baxter's Enterprise IT DATA organization. He/she will identify, evaluate, develop, and/or redesign systems and procedures that meet user requirements, and will aid business customers with business process redesign on large projects. He/she must see the potential for data and compelling visualizations to drive the organization to greater performance, meeting each stakeholder "where they are" across a range of data literacy and understanding both the user's literal request and the core goal behind that request.

Critical Responsibilities:
Develop solutions architecture and evaluate architectural alternatives for public and hybrid cloud models, including SaaS, IaaS, PaaS, and other cloud services in a large global environment that supports both business and customer needs.
Bring hands-on dimensional modeling experience.
Work with a global team to gather requirements and implement solutions, and work with Data Architecture and Data Engineering teams to design solutions and deploy applications in the cloud.
Design and develop solutions based on strategic business and technical requirements, selecting the appropriate services based on data, compute, database, or security requirements.
Identify appropriate use of cloud architectural best practices; estimate costs and identify cost control mechanisms.
Provide technical leadership in the full life cycle of the software environment, with working knowledge of software development tools and methodologies.
Apply experience migrating or transforming legacy customer solutions to the cloud.
Provide best practices for building secure and reliable applications on the AWS platform, and define deployment and operational procedures for applications and solutions that maximize the value of cloud services in support of Baxter requirements.
Drive scope definition, requirements analysis, functional and technical design, application build, product configuration, unit testing, and production deployment, and ensure delivered solutions meet and perform to technical and functional/non-functional requirements.
Provide support, technical governance, and expertise related to cloud architectures, deployment, and operations.
Act as a coach and mentor to team members and technical staff on their assigned project tasks.
Lead the definition and development of cloud reference architecture and management systems.
Conduct design reviews with team members.

Key Experiences & Attributes:
Thorough understanding of system design, analysis, and development required.
Strong knowledge of business process flow and/or entity relationship modeling.
Effective in a team, with strong problem-solving, critical-thinking, influencing, communication, and presentation skills.
Self-starter with a history of balancing multiple priorities simultaneously and the ability to adapt to the changing needs of the business while meeting deadlines.
Demonstrated outstanding written and verbal communication skills.
Ability to work effectively with other technology teams as well as business partners.
Ability to effectively present information to, interact with, and respond to questions from managers, employees, customers, and vendors.
Detail-oriented team player who can consistently provide valuable suggestions and solutions in areas of analytical solutions, with the conceptual thinking to ensure all parts of an application function together as intended.
Strong experience in data curation/aggregation/preparation for use in a visualization tool (e.g., Cognos, Tableau).
12+ years of related work experience (with an emphasis on data and solution architecture, accurate data preparation, effective data visualizations, and the ability to train others).
8+ years as an architect distilling data into meaningful insights, with hands-on expertise in AWS and Snowflake.
Demonstrated success analyzing data to solve business problems and identify opportunities for improving performance.
Ability to prioritize and work on multiple tasks simultaneously.
2+ years of hands-on dimensional modeling experience.

Posted 3 weeks ago

Apply

3.0 - 9.0 years

16 - 20 Lacs

Bengaluru

Work from Office

We are looking for a highly motivated and experienced IT Enterprise Architect (f/m/d) with a strong focus on end-to-end (E2E) customer service processes. You will play a key role in shaping and aligning our IT landscape across platforms such as SAP, ServiceNow, and other customer service-related systems. Your expertise will help drive the digital transformation of our global service processes, ensuring scalability, resilience, and excellent customer experience.

Your tasks and responsibilities:
You are responsible for enterprise architecture management (including business-IT alignment and application portfolio analysis) and the derivation of IT strategies from business requirements.
You design and maintain the end-to-end enterprise architecture for all customer service processes and supporting processes (e.g., spare parts management, returns management, technician skill matching).
You lead cross-functional workshops and architecture communities to align business goals with IT strategy.
You drive the development of the architecture framework, the architecture roadmap, and the application and data architecture for the end-to-end customer service business process.
You guide the selection and integration of platforms such as SAP S/4HANA, SAP Sales Cloud, Salesforce, Oracle Sales Cloud, and ServiceNow.
You model IT architectures and processes and drive the consistent design, planning, and implementation of IT solutions.
You contribute to solution evaluations, RFI/RFP processes, and vendor selection in the customer service space.
You coordinate communication with all key decision-makers and relevant stakeholders and advise them on the development of the IT landscape.
You drive documentation and presentations to ensure executive alignment.

Your qualifications and experience:
You have a degree in computer science, industrial engineering, or a comparable qualification.
You have experience as an Enterprise Architect or Solution/Domain Architect in customer-facing IT landscapes.
You are familiar with enterprise architecture methods and frameworks, governance structures, and IT service management frameworks (e.g., TOGAF, Zachman, ITIL).
You bring functional or IT implementation experience across all customer service processes and functions (installation and maintenance, customer service, field service, material logistics, finance, etc.).
You have experience in the implementation of customer service solutions (e.g., ServiceNow, Salesforce, SAP Service Cloud, SAP Field Service Management, Oracle Sales Cloud, CPQ, Spryker).
You have extensive experience with data architecture and integration concepts and a very good understanding of cloud technologies (e.g., Azure, AWS).
You have gained practical experience with enterprise architecture tools such as BizzDesign, LeanIX, or Avolution and have good knowledge of modeling and managing business processes.

Your attributes and skills:
You have sound technological know-how and several years of experience in complex technology landscapes.
We require a very good command of English, both spoken and written, for cooperation with specialist departments in Germany and abroad. Ideally, you also have a very good command of German.
You are an organizational talent and impress with good communication and presentation skills.
You are a team player with strong interpersonal skills who can operate confidently in a global environment.
We do not compromise on quality: you work in a results- and quality-oriented manner with a high level of commitment and have good analytical and conceptual skills.
You are flexible in thought and action, grasp new topics quickly, and show constructive assertiveness.

Posted 3 weeks ago

Apply

16.0 - 18.0 years

50 - 60 Lacs

Gurugram, Bengaluru

Work from Office

Join us as a Data Engineer. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling, and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do:
Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions
Participating in the data engineering community to deliver opportunities to support our strategic direction
Carrying out complex data engineering tasks to build a scalable data architecture, and transforming data to make it usable to analysts and data scientists
Building advanced automation of data engineering pipelines through the removal of manual stages
Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need:
To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have at least seven years of experience in ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities.

You'll also need:
Experience of designing enterprise applications using code in AWS or Azure cloud platforms
Experience in database technologies covering Postgres and SQL Server
Knowledge of data modelling and data transformation
Good knowledge of modern code development practices using Java, React, .NET, and .NET Core
Great communication skills with the ability to proactively engage with a range of stakeholders

Hours: 45. Job posting closing date: 15/07/2025

Posted 3 weeks ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Bengaluru

Work from Office

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category: Customer Success

About Salesforce: The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales and leading the design and implementation of enterprise-grade Data Management solutions. This position is responsible for architecting scalable solutions across enterprise landscapes using Data Cloud, ensuring data is ready for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This role covers the ANZ, ASEAN, and India markets. The ideal candidate brings deep expertise in data architecture, the project lifecycle, and the Salesforce ecosystem, combined with strong soft skills, stakeholder engagement, and technical writing ability. You will collaborate with cross-functional teams to shape the future of customers' data ecosystems and enable data excellence at scale.

Key Responsibilities:
* Salesforce Data Cloud Trusted Advisor: Be a trusted SME for Data Cloud to support and lead project delivery and customer engagements during the pre-sales cycle, including how Data Cloud relates to the success of AI.
* Architecture Support: Provide data and system architecture guidance to Salesforce account teams and customers by reviewing proposed architectures; also peer-review project effort estimates, scope, and delivery considerations.
* Project Delivery: Work on cross-cloud project delivery and lead the data and analytics stream, spearheading Data Cloud design and delivery. Work collaboratively with cross-functional teams, from developers to executives.
* Data Architecture: Design and guide customers' enterprise data architecture aligned to business goals. Highlight the importance of data ethics and privacy by ensuring that customer solutions adhere to relevant regulations and best practices in data security and privacy.
* Data Cloud Enablement: Lead Data Cloud architecture enablement for key domains and cross-cloud teams.
* Analytics Support: Collaborate with analytics and AI teams to ensure data readiness for advanced analytics, reporting, and AI/ML initiatives.
* Stakeholder Engagement: Work cross-functionally across multiple Salesforce teams and projects to deliver aligned and trusted data solutions. Facilitate and influence executive customer stakeholders while aligning technology strategy to business value and ROI. Build strong relationships with both internal and external teams, contributing to broader goals and growth.
* Documentation: Create and maintain high-quality architecture blueprints, design documents, standards, and technical guidelines.

Technical Skills:
* 15+ years in data architecture or consulting, with solution design and project delivery experience
* Deep knowledge of MDM, data distribution, and data modelling concepts
* Expertise in data modelling with a strong understanding of metadata and lineage
* Experience in executing data strategies, landscape architecture assessments, and proofs of concept
* Excellent communication, stakeholder management, and presentation skills
* Strong technical writing and documentation ability
* Basic understanding of Hadoop/Spark fundamentals is an advantage
* Understanding of data platforms (e.g., Snowflake, Databricks, AWS, GCP, MS Azure)
* Experience with tools such as Salesforce Data Cloud or similar enterprise data platforms; hands-on, deep Data Cloud experience is a strong plus
* Working knowledge of enterprise data warehouse, data lake, and data hub concepts
* Strong understanding of Salesforce products and functional domains such as Technology, Finance, Telco, Manufacturing, and Retail is a positive

Expected Qualifications:
* Salesforce Certified Data Cloud Consultant - highly preferred
* Salesforce Data Architect - preferred
* Salesforce Application Architect - preferred
* AWS Spark/DL, Azure Databricks, Fabric, Google Cloud, Snowflake, or similar - preferred

Posted 3 weeks ago

Apply

15.0 - 20.0 years

30 - 35 Lacs

Bengaluru

Work from Office

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category: Customer Success

About Salesforce: The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales and leading the design and implementation of enterprise-grade Data Management solutions. This position is responsible for architecting scalable solutions across enterprise landscapes using Data Cloud, ensuring data is ready for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This role covers the ANZ, ASEAN, and India markets. The ideal candidate brings deep expertise in data architecture, the project lifecycle, and the Salesforce ecosystem, combined with strong soft skills, stakeholder engagement, and technical writing ability. You will collaborate with cross-functional teams to shape the future of customers' data ecosystems and enable data excellence at scale.

Key Responsibilities:
* Salesforce Data Cloud Trusted Advisor: Be a trusted SME for Data Cloud to support and lead project delivery and customer engagements during the pre-sales cycle, including how Data Cloud relates to the success of AI.
* Architecture Support: Provide data and system architecture guidance to Salesforce account teams and customers by reviewing proposed architectures; also peer-review project effort estimates, scope, and delivery considerations.
* Project Delivery: Work on cross-cloud project delivery and lead the data and analytics stream, spearheading Data Cloud design and delivery. Work collaboratively with cross-functional teams, from developers to executives.
* Data Architecture: Design and guide customers' enterprise data architecture aligned to business goals. Highlight the importance of data ethics and privacy by ensuring that customer solutions adhere to relevant regulations and best practices in data security and privacy.
* Data Cloud Enablement: Lead Data Cloud architecture enablement for key domains and cross-cloud teams.
* Analytics Support: Collaborate with analytics and AI teams to ensure data readiness for advanced analytics, reporting, and AI/ML initiatives.
* Stakeholder Engagement: Work cross-functionally across multiple Salesforce teams and projects to deliver aligned and trusted data solutions. Facilitate and influence executive customer stakeholders while aligning technology strategy to business value and ROI. Build strong relationships with both internal and external teams, contributing to broader goals and growth.
* Documentation: Create and maintain high-quality architecture blueprints, design documents, standards, and technical guidelines.

Technical Skills:
* 15+ years in data architecture or consulting, with solution design and project delivery experience
* Deep knowledge of MDM, data distribution, and data modelling concepts
* Expertise in data modelling with a strong understanding of metadata and lineage
* Experience in executing data strategies, landscape architecture assessments, and proofs of concept
* Excellent communication, stakeholder management, and presentation skills
* Strong technical writing and documentation ability
* Basic understanding of Hadoop/Spark fundamentals is an advantage
* Understanding of data platforms (e.g., Snowflake, Databricks, AWS, GCP, MS Azure)
* Experience with tools such as Salesforce Data Cloud or similar enterprise data platforms; hands-on, deep Data Cloud experience is a strong plus
* Working knowledge of enterprise data warehouse, data lake, and data hub concepts
* Strong understanding of Salesforce products and functional domains such as Technology, Finance, Telco, Manufacturing, and Retail is a positive

Expected Qualifications:
* Salesforce Certified Data Cloud Consultant - highly preferred
* Salesforce Data Architect - preferred
* Salesforce Application Architect - preferred
* AWS Spark/DL, Azure Databricks, Fabric, Google Cloud, Snowflake, or similar - preferred

Accommodations: If you require assistance due to a disability applying for open positions, please submit a request via the Accommodations Request Form.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Req ID: 331245 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr .NET/GCP Engineer to join our team in Bangalore, Karn taka (IN-KA), India (IN). Once You Are Here, You Will: The Senior Applications Developer provides input and support for, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. Additionally, you will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge / support for applications development, integration, and maintenance as well as providing input to department and project teams on decisions supporting projects. Basic Qualifications: 6+ years developing in .Net/.Net Core 3+ years of experience with developing microservices and using SOLID Principles 3+ years of Rest API development 2+ years of experience working with Databases and writing stored procedures 2+ year of unit and service testing with frameworks such as xunit, Nunit, etc. 2+ year of cloud develolpment experience with GCP i.e. Pub Sub, cloud function etc. . 1+ year of Angular development Preferred: Experience with CI/CD tooling i.e. Jenkins, Azure Devops etc Experience with containerization technologies e.g. Docker, Kubernetes GCP experience Ideal Mindset: Lifelong Learner: You are always seeking to improve your technical and nontechnical skills. 
Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need. Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details. Please note the shift timing requirement: 1:30 PM IST to 10:30 PM IST. *This position is not open to employer sponsorship.* #Launchjobs #LaunchEngineering #LI-NorthAmerica

Posted 3 weeks ago

Apply

4.0 - 8.0 years

11 - 15 Lacs

Kolkata

Work from Office

Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities: Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL. Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards. Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks. Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained. Work with other members of the project team to support delivery of additional project components (API interfaces). Evaluate the performance and applicability of multiple tools against customer requirements. Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Integrate Databricks with other technologies (ingestion tools, visualization tools). Proven experience working as a data engineer. Highly proficient in using the Spark framework (Python and/or Scala). Extensive knowledge of data warehousing concepts, strategies, and methodologies. Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks). Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics. Experience in designing and hands-on development of cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Designing and building of data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Thorough understanding of Azure Cloud Infrastructure offerings. Strong experience in common data warehouse modeling principles including Kimball.
Working knowledge of Python is desirable. Experience developing security models. Databricks & Azure Big Data Architecture certification would be a plus. Mandatory skill sets: ADE, ADB, ADF. Preferred skill sets: ADE, ADB, ADF. Years of experience required: 4-8 years. Education qualification: BE, B.Tech, MCA, M.Tech. Education Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering. Degrees/Field of Study preferred: Required Skills: ADF Business Components, ADL Assistance, Android Debug Bridge (ADB), Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Travel Requirements Government Clearance Required?
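The pipeline work this posting describes centres on transformation steps that enforce data quality before loading. As a minimal sketch of such a step in plain Python, where the field names (`order_id`, `amount`) are illustrative assumptions rather than anything from the listing; on Databricks the same logic would typically be expressed as PySpark DataFrame operations:

```python
# Minimal sketch of a pipeline transformation step: drop rows missing
# required fields, deduplicate on a key, and cast amounts to float.
# Field names are hypothetical; a Databricks job would do this with
# PySpark DataFrame filters and dropDuplicates instead of plain dicts.

def clean_records(records, required=("order_id", "amount")):
    seen = set()
    cleaned = []
    for rec in records:
        # data-quality gate: every required field must be present and non-null
        if any(rec.get(field) is None for field in required):
            continue
        key = rec["order_id"]
        if key in seen:  # keep only the first occurrence of each id
            continue
        seen.add(key)
        cleaned.append({**rec, "amount": float(rec["amount"])})
    return cleaned

rows = [
    {"order_id": 1, "amount": "10.5"},
    {"order_id": 1, "amount": "10.5"},   # duplicate, dropped
    {"order_id": 2, "amount": None},     # null amount, dropped
]
# clean_records(rows) keeps only the first, fully-populated record
```

The same gate-then-transform shape scales up naturally: the validity predicate becomes a DataFrame filter and the dedup becomes a window or `dropDuplicates` call.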

Posted 3 weeks ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Chennai

Remote

Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems
and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with the ability to work with ambiguous requirements - Excellent communication skills with the ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within a healthcare organization

Posted 3 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

HashiCorp, an IBM Company solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build products to give organizations a consistent way to manage their move to cloud-based IT infrastructures for running their applications. Our products enable companies large and small to mix and match AWS, Microsoft Azure, Google Cloud, and other clouds as well as on-premises environments, easing their ability to deliver new applications. At HashiCorp, we have used the Tao of HashiCorp as our guiding principles for product development and operate according to a strong set of company principles for how we interact with each other. We value top-notch collaboration and communication skills, both among internal teams and in how we interact with our users. What you’ll do (responsibilities) As a Senior Engineer, you will contribute to the development, operation, and enhancement of cloud offerings. With at least 8 years of experience in software engineering, cloud computing, and operational excellence, you will play a vital role in ensuring our managed services are reliable, scalable, and secure, meeting the sophisticated needs of our global customer base. Contribute to the architecture, development, and scaling of dataplane services, ensuring best practices in cloud service delivery. Implement robust monitoring and alerting frameworks to guarantee high availability and performance of our managed services. Work closely with product teams, platform teams, and security to align development efforts, enhance product integrations, and ensure a cohesive user experience. Leverage feedback from customers and support teams to drive product enhancements, focusing on reducing operational toil and improving service usability. Partner with the security team to fortify service security and comply with regulatory standards, maintaining HashiCorp’s reputation for trust and reliability. 
Stay at the forefront of cloud technologies and practices, advocating for and implementing solutions that enhance our managed service offerings. Required education Bachelor's Degree Required technical and professional expertise 8+ years of software engineering experience, with a focus on cloud services and infrastructure. Proficiency in Go, with experience in other languages (Python, Ruby, etc.) considered a plus. Extensive knowledge of cloud computing platforms (AWS, Azure, GCP) and experience with infrastructure as code (Terraform). Demonstrated ability in developing and managing production-grade cloud services, with a strong understanding of operational reliability and security practices. Excellent problem-solving skills, with the ability to collaborate effectively across diverse teams. Passionate about improving operational processes and delivering customer-centric solutions. Preferred technical and professional experience Familiarity with HashiCorp and IBM products and services.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Mumbai

Work from Office

As a Consultant, you are responsible for developing application designs and providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Configure DataStax Cassandra as per the requirements of the project solution. Design the database system specific to Cassandra in consultation with the data modelers, data architects, and ETL specialists, as well as the microservices/functional specialists, thereby producing an effective Cassandra database system according to the solution's and client's needs and specifications. Interface with functional & data teams to ensure the integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database. The role will also need to ensure that the Cassandra database is performant and error-free. This will involve troubleshooting errors and performance issues and resolving them, as well as planning for further database improvement. Ensure the database documentation & operation manual is up to date and usable. Preferred technical and professional experience Has expertise, experience, and deep knowledge in the configuration, design, and troubleshooting of NoSQL server software and related products on Cloud, specifically DataStax Cassandra. Has knowledge/experience with other NoSQL/cloud databases.
Installs, configures and upgrades RDBMS or NoSQL server software and related products on Cloud

Posted 3 weeks ago

Apply

10.0 - 15.0 years

7 - 12 Lacs

Bengaluru

Work from Office

As a Senior z/OS System Programmer / Lead Development Engineer you will be involved in developing automation solutions to provision and manage infrastructure across your organization. As a developer, you will leverage the capabilities of Terraform and cloud offerings to drive Infrastructure as Code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team, and collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions. Maintain high standards of software quality within the team by establishing good practices and habits. Focus on growing capabilities to support and enhance the experience of the offering. Required education Bachelor's Degree Required technical and professional expertise 10+ years of software development experience with z/OS or z/OS sub-systems. * 8+ years of professional experience developing with Golang, Python and Ruby * Hands-on experience with z/OS system programming or administration * Experience with Terraform key features like Infrastructure as Code, change automation, and auto scaling. * Experience working with a cloud provider such as AWS, Azure or GCP, with a focus on scalability, resilience and security. * Cloud-native mindset and solid understanding of DevOps principles in a cloud environment * Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging and tracing for high reliability and performance. * Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform). * Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions. * Demonstrated ability to tackle complex technical challenges and deliver innovative solutions. * Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
* Strong analytical, debugging, and problem-solving skills to analyse issues and defects reported by customer-facing and test teams. * Proficient in source control management tools (GitHub) and with Agile Life Cycle Management tools. * Soft Skills: Strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback. Preferred technical and professional experience

Posted 3 weeks ago

Apply

15.0 - 20.0 years

20 - 35 Lacs

Chennai, Bengaluru, Thiruvananthapuram

Hybrid

Data Architect Exp: 15+ Yrs Location: Bengaluru, Chennai, TVM Role Overview We are seeking a highly experienced Data Architect with deep expertise in designing and managing large-scale, complex data architectures, along with exposure to Machine Learning (ML) and Artificial Intelligence (AI). This role is key to driving our data strategy, integrating advanced AI/ML capabilities into data ecosystems, and supporting high-stakes proposals for global clients. The ideal candidate combines technical depth with strategic vision, bridging traditional data systems with next-generation AI-driven solutions to ensure scalability, security, and innovation. Key Responsibilities • Enterprise Data Architecture: Lead the design and implementation of comprehensive data architectures for large, complex organizations, ensuring systems are robust, efficient, and meet business needs. • End-to-End Data Solution Management: Architect and manage end-to-end data solutions, from data ingestion and processing through storage, analytics, and reporting, ensuring high performance and data integrity. • Strategic Proposal Support: Collaborate with sales and business development teams to support RFPs and client proposals, offering data architecture expertise that drives competitive advantage and innovation. • Complex Data Projects Oversight: Oversee complex data initiatives, making architectural decisions for data modeling, ETL, data warehousing, big data, and cloud data platforms. • Technology Innovation & Best Practices: Identify, evaluate, and implement emerging data technologies to enhance architectural strategy, keeping solutions cutting-edge and cost-effective.
• Stakeholder Engagement & Collaboration: Act as a key liaison with business and technical stakeholders, translating data architecture strategies into actionable solutions that align with organizational goals and regulatory standards. Technical Skills • Data Architecture & Modelling: Advanced knowledge of data modelling techniques (conceptual, logical, physical), database design, and normalization theory. Expertise in relational and NoSQL databases, especially with systems like SQL Server, Oracle, PostgreSQL, MongoDB, Cassandra, DynamoDB, or similar. Proficiency in ERD/logical data design tools such as ERwin, PowerDesigner, or DBeaver. • ETL & Data Integration: Extensive experience with ETL/ELT tools such as Informatica, Talend, Apache NiFi, or dbt (Data Build Tool). Strong understanding of data integration platforms and real-time data processing (e.g., Apache Kafka, AWS Glue, Azure Data Factory, Databricks). • Big Data & Cloud Platforms: Experience with big data ecosystems (Hadoop, Spark, Hive) and stream processing systems (Kafka Streams, Flume, etc.). Strong understanding and hands-on experience with cloud platforms (AWS, Google Cloud, Azure) and their database services (e.g., Redshift, BigQuery, Snowflake, or Azure Synapse). • Data Warehousing & BI Tools: Experience building and maintaining data warehouses, data marts, and OLAP cubes. Knowledge of Business Intelligence tools such as Tableau, Power BI, Looker, or QlikView. Familiarity with data lake architectures and cataloging frameworks for hybrid data storage solutions (e.g., S3, ADLS, Delta Lake). • Security & Compliance: Strong understanding of data security, encryption, tokenization, and access control best practices. Familiarity with governance frameworks and data compliance regulations (e.g., GDPR, CCPA, HIPAA, and SOX) to implement secure architecture and standardized procedures. • Programming & Scripting: Strong scripting and programming skills in Python, Java, Scala, or similar languages.
Proficiency in SQL for querying databases and performing advanced data analysis. Qualifications • Experience: 1. 15+ years in data architecture with a strong emphasis on large-scale, complex enterprise data solutions. 2. Significant experience in supporting sales and proposal development with a focus on high-value, strategic engagements. • Education: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field. Advanced certifications (e.g., AWS Certified Big Data Specialty, Azure Solutions Architect Expert) are highly desirable. • Certifications: 1. Certifications in cloud platforms like AWS Certified Big Data Architect, Microsoft Azure Data Architect, or similar are a plus. 2. TOGAF, CDMP (Certified Data Management Professional), or other relevant data architecture/management certifications are an added advantage.

Posted 4 weeks ago

Apply

8.0 - 10.0 years

5 - 15 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Job Title: Azure Data Architect Experience: 8 to 10 years Location: Pan India Employment Type: Full-Time Notice period: Immediate to 30 days Technology: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling Key Responsibilities: Requirement gathering and analysis Design of data architecture and data model to ingest data Experience with different databases like Synapse, SQL DB, Snowflake, etc. Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage Implement data security and governance measures Monitor and optimize data pipelines for performance and efficiency Troubleshoot and resolve data engineering issues Hands-on experience with Azure Functions and other components like real-time streaming Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations Provide optimized solutions for any problem related to data engineering Ability to work with a variety of sources like relational DBs, APIs, file systems, real-time streams, CDC, etc. Strong knowledge of Databricks and Delta tables

Posted 4 weeks ago

Apply

7.0 - 12.0 years

22 - 37 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead - Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM to 11 PM IST - Experience: 5 to 12+ years (based on role & grade) Open Grades/Roles: Senior Software Engineer: 5-8 years Tech Lead: 7-10 years Senior Tech Lead: 10-12+ years Job Description - Data Engineering Team Core Responsibilities (Common to All Levels): Design, build and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB) Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs Participate in data modeling (ER/DW/star schema), data quality checks, and data integration Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M) Ensure code versioning and documentation standards are followed (Git/Bitbucket) Additional Responsibilities by Grade Senior Software Engineer (5-8 yrs): Focus on hands-on development of ETL pipelines, data models, and data inventory Assist in architecture discussions and POCs Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure Tech Lead (7-10 yrs): Lead mid-sized data projects and small teams Decide on ETL strategy (push down/push up) and performance tuning Strong working knowledge of orchestration tools, resource management, and agile delivery Senior Tech Lead (10-12+ yrs): Drive data architecture, infrastructure decisions, and internal framework enhancements Oversee large-scale data ingestion, profiling, and reconciliation across systems Mentor junior leads and own stakeholder delivery end-to-end Advantageous: Experience with AdTech/Marketing data, Hadoop ecosystem (Hive, Spark, Sqoop) - Must-Have Skills (All Levels): ETL Tools: Pentaho / Talend / SSIS / Informatica Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift /
BigQuery Orchestration: Airflow / Autosys / Control-M / JAMS Modeling: Dimensional Modeling, ER Diagrams Scripting: Python or Perl (Preferred) Agile Environment, Git-based Version Control Strong Communication and Documentation
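The orchestration tools this posting names (Airflow, Autosys, Control-M) all solve the same core problem: run pipeline tasks in dependency order. A minimal sketch of that idea in plain Python, with hypothetical task names; a real Airflow deployment would declare a DAG of operators and let the scheduler handle this:

```python
# Minimal sketch of dependency-ordered task execution, the idea at the
# heart of schedulers like Airflow or Control-M. Task names are
# illustrative assumptions, not from the listing.

def run_in_order(deps, run):
    """deps maps task -> set of upstream tasks; run(task) executes it."""
    done, order = set(), []

    def visit(task, path=()):
        if task in done:
            return
        if task in path:  # a task depending (transitively) on itself
            raise ValueError(f"cycle involving {task}")
        for upstream in deps.get(task, ()):
            visit(upstream, path + (task,))
        run(task)  # all upstreams finished; safe to run this task
        done.add(task)
        order.append(task)

    for task in deps:
        visit(task)
    return order

executed = []
order = run_in_order(
    {"load": {"transform"}, "transform": {"extract"}, "extract": set()},
    executed.append,
)
# extract runs before transform, which runs before load
```

Production schedulers add retries, backfills, and parallelism on top, but the dependency-resolution core is the same depth-first walk.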

Posted 4 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune Experience level: 7+ Years About the Role We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation. Key Responsibilities: Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements. Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. Implement Snowflake-based data warehouses, data lakes, and data integration solutions. Manage data ingestion, transformation, and loading processes to ensure data quality and performance. Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals. Drive continuous improvement by leveraging the latest Snowflake features and industry trends. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field. 8+ years of experience in data architecture, data engineering, or a related field. Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions. Must have exposure to working with Airflow. Proven track record of contributing to data projects and working in complex environments. Familiarity with cloud platforms (e.g., AWS, GCP) and their data services. Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
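Incremental loads of the kind this Snowflake role involves are commonly expressed as MERGE (upsert) statements against a staging table. A minimal sketch of generating one; the table and column names are illustrative assumptions, and in practice the statement would be executed through the Snowflake connector or a tool such as dbt:

```python
# Minimal sketch of building a MERGE statement for an incremental load.
# Table/column names are hypothetical; real code would bind this through
# the Snowflake Python connector or a dbt incremental model, and would
# need to guard identifiers against injection.

def build_merge(target, staging, key, columns):
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge("dw.customers", "stg.customers", "customer_id", ["name", "email"])
# matched rows are updated in place; new keys are inserted
```

The MERGE pattern keeps the load idempotent: re-running the same staging batch updates rather than duplicates.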

Posted 4 weeks ago

Apply

15.0 - 20.0 years

20 - 30 Lacs

Noida, Gurugram

Hybrid

Design architectures using Microsoft SQL Server and MongoDB. Develop ETL pipelines and data lakes. Integrate reporting tools like Power BI, Qlik, and Crystal Reports into the data strategy. Implement AWS cloud services (PaaS, SaaS, IaaS), SQL and NoSQL databases, and data integration.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Noida

Work from Office

As a Sr. Data Engineer, you will lead the design, development, and management of investments in our platform and data pipelines. You will identify, monitor, and lead initiatives to ensure our data platform remains scalable, reliable, and efficient in light of the evolving data requirements of our products and services. You will work closely with solution experts to design, iterate, and develop key pipelines to unlock new solution functionality, analytical insights, and machine-learning features. You will be adept at partnering with cross-functional partners and data users to translate needs into technical solutions, and at leading the technical scoping, implementation, and general execution of improvements to our solutions and platform. You will be data curious and excited to have an impact on the team and in the company and to improve the quality of healthcare operations. Key Responsibilities Spearhead the discovery, evaluation, and integration of new datasets (incl. pipeline development and data modeling/documentation), working closely with key data stakeholders to understand their impact and relevance to our core products and the healthcare domain. Translate product/analytical vision into highly functional data pipelines supporting high-quality, highly trusted data products (incl. designing data structures, building and scheduling data transformation pipelines, and improving transparency). Set the standard for data engineering practices within the company, guiding the architectural approaches, data pipeline designs, and the integration of cutting-edge technologies to foster a culture of innovation and continuous improvement.
Key Qualifications Excellent cross-functional communication - the ability to break down complex technical components for technical and non-technical partners alike. Innate aptitude for interpreting complex datasets, with a demonstrated ability to discern underlying patterns, identify anomalies, and extract meaningful insights (healthcare experience preferred). Excellence in quality data pipeline design, development, and optimization to create reliable, modular, secure data foundations for the organization's data delivery system, from applications to analytics & ML. Proven ability to independently handle ambiguous project requirements and lead data initiatives from start to finish, while collaborating extensively with cross-functional, non-technical teams to inform and shape product development. Nice to Have Skills 5+ years of experience designing, building, and operating cloud-based, highly available, observable, and scalable data platforms utilizing large, diverse data sets in production to meet ambiguous business needs. Relevant industry certifications in a variety of Data Architecture services (SnowPro Advanced Architect, Azure Solutions Architect Expert, AWS Solutions Architect / Database, Databricks Data Engineer / Spark / Platform, etc.). Experience with MLOps and/or developing and maintaining machine learning models and infrastructure. Experience with data visualization tools and analytics technologies (Looker, Tableau, etc.). Degree in Computer Science, Engineering, or a related field.
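The "high-quality & highly trusted data products" this role emphasizes usually rest on automated data-quality checks. A minimal sketch of one such check in plain Python; the column names and the 10% threshold are illustrative assumptions, and in production this would typically run inside a framework like Great Expectations or a Databricks job:

```python
# Minimal sketch of a data-quality monitoring check: compute the null
# rate per column and flag columns exceeding a threshold. Column names
# and the 10% threshold are hypothetical examples.

def null_rate_report(rows, threshold=0.10):
    if not rows:
        return {}
    counts = {}
    for row in rows:
        for col, val in row.items():
            missing, total = counts.get(col, (0, 0))
            counts[col] = (missing + (val is None), total + 1)
    return {
        col: {"null_rate": missing / total, "flagged": missing / total > threshold}
        for col, (missing, total) in counts.items()
    }

report = null_rate_report([
    {"member_id": 1, "plan": "A"},
    {"member_id": 2, "plan": None},
])
# "plan" has a 50% null rate and is flagged; "member_id" is clean
```

Checks like this run after each load, and a flagged column halts the pipeline or raises an alert before bad data reaches downstream analytics.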

Posted 4 weeks ago

Apply

11.0 - 14.0 years

13 - 16 Lacs

Hyderabad

Work from Office

We are looking for a skilled Senior Azure/Fabric Data Engineer with 11-14 years of experience to join our team at Apps Associates (I) Pvt. Ltd, located in the IT Services & Consulting industry. Roles and Responsibility Design and implement scalable data pipelines using Azure and Fabric. Develop and maintain large-scale data architectures to support business intelligence and analytics. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure data quality and integrity by implementing robust testing and validation procedures. Optimize data storage and retrieval processes for improved performance and efficiency. Provide technical guidance and mentorship to junior team members. Job Requirements Strong understanding of data engineering principles and practices. Experience with Azure and Fabric technologies is highly desirable. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a fast-paced environment. Strong communication and interpersonal skills. Experience with agile development methodologies is preferred.

Posted 4 weeks ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities:
- Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources.
- Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis.
- Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing.
- Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting.
- Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage.
- Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability.
- Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats.
- Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities.
- Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team.

Requirements:
- Strong programming skills in Java, Python, or Scala.
- Strong understanding of data modeling, data warehousing, and ETL processes.
- 4 to 6 years of relevant experience.
- Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
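The ETL responsibility above breaks down into three functions with clear contracts. The following is a minimal, hypothetical sketch using only the Python standard library (the CSV input, the dict "warehouse", and the function names are assumptions for illustration; a production job would target Spark or a real warehouse):

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and normalize names; skip malformed rows."""
    out = []
    for r in rows:
        try:
            out.append({"city": r["city"].strip().title(),
                        "sales": float(r["sales"])})
        except (KeyError, ValueError):
            continue  # in this sketch, bad rows are simply skipped
    return out

def load(rows, store):
    """Load: aggregate into the target store (a dict standing in for a warehouse table)."""
    for r in rows:
        store[r["city"]] = store.get(r["city"], 0.0) + r["sales"]
    return store

# Usage: one malformed row ("oops") is dropped; the two Mumbai rows are merged.
raw = "city,sales\nmumbai,100\npune,oops\nMumbai,50\n"
warehouse = load(transform(extract(raw)), {})
# warehouse -> {"Mumbai": 150.0}
```

Separating the three phases this way is what lets each be tested, retried, and scaled independently, which is the point of maintaining ETL as a process rather than a script.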

Posted 4 weeks ago

Apply