Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
6.0 - 10.0 years
7 - 14 Lacs
Bengaluru
Hybrid
Roles and Responsibilities
- Architect and implement an effective data framework enabling end-to-end data solutions.
- Understand business needs, use cases, and drivers for insights, and translate them into detailed technical specifications.
- Create epics, features, and user stories with clear acceptance criteria for execution and delivery by the data engineering team.
- Create scalable and robust data solution designs that incorporate governance, security, and compliance aspects.
- Develop and maintain logical and physical data models, and work closely with data engineers, data analysts, and data testers on their successful implementation.
- Analyze, assess, and design data integration strategies across various sources and platforms.
- Create project plans and timelines while monitoring and mitigating risks and controlling project progress.
- Conduct the daily scrum with the team, with a clear focus on meeting sprint goals and timely resolution of impediments.
- Act as a liaison between technical teams and business stakeholders, and ensure alignment between them.
- Guide and mentor the team on best practices for data solutions and delivery frameworks.
- Actively work with and support stakeholders/clients to complete User Acceptance Testing, and ensure strong adoption of the data products after launch.
- Define and measure KPIs/KRAs for features, ensuring the data roadmap is verified through measurable outcomes.

Prerequisites
- 5 to 8 years of professional, hands-on experience building end-to-end data solutions on cloud-based data platforms, including 2+ years in a Data Architect role.
- Proven hands-on experience building pipelines for data lakes, data lakehouses, data warehouses, and data visualization solutions.
- Sound understanding of modern data technologies such as Databricks and Snowflake, and of architectural approaches such as Data Mesh and Data Fabric.
- Experience managing the data life cycle in a fast-paced Agile/Scrum environment.
- Excellent spoken and written communication, receptive listening skills, and the ability to convey complex ideas clearly and concisely to technical and non-technical audiences.
- Ability to collaborate and work effectively with cross-functional teams, project stakeholders, and end users to produce quality deliverables within stipulated timelines.
- Ability to manage, coach, and mentor a team of data engineers, data testers, and data analysts.
- Strong process driver with expertise in the Agile/Scrum framework on tools such as Azure DevOps, Jira, or Confluence.
- Exposure to machine learning, generative AI, and modern AI-based solutions.

Experience: Technical Lead, Data Analytics, with 6+ years of overall experience, of which 2+ years are in data architecture.
Education: Engineering degree from a Tier 1 institute preferred.
Compensation: The compensation structure will be as per industry standards.
Posted 1 week ago
4.0 - 8.0 years
15 - 22 Lacs
Hyderabad
Work from Office
Senior Data Engineer - Cloud & Modern Data Architectures

Role Overview: We are looking for a Senior Data Engineer with expertise in ETL/ELT, data engineering, data warehousing, data lakes, and Data Mesh and Data Fabric architectures. The ideal candidate should have hands-on experience in at least one or two cloud data platforms (AWS, GCP, Azure, Snowflake, or Databricks) and a strong foundation in building PoCs, mentoring freshers, and contributing to accelerators and IPs.

Must-Have:
- 5-8 years of experience in data engineering and cloud data services.
- Hands-on with AWS (Redshift, Glue), GCP (BigQuery, Dataflow), Azure (Synapse, Data Factory), Snowflake, or Databricks.
- Strong SQL and Python or Scala skills.
- Knowledge of Data Mesh and Data Fabric principles.

Nice-to-Have:
- Exposure to MLOps, AI integrations, and Terraform/Kubernetes for DataOps.
- Contributions to open source, accelerators, or internal data frameworks.

Interested candidates, share your CV to dikshith.nalapatla@motivitylabs.com with the below-mentioned details for a quick response.
Total Experience:
Relevant DE Experience:
SQL Experience:
SQL Rating out of 5:
Python Experience:
Do you have experience in any 2 clouds (yes/no):
Mention the cloud experience you have (AWS, Azure, GCP):
Current Role / Skillset:
Current CTC:
Fixed:
Payroll Company (Name):
Client Company (Name):
Expected CTC:
Official Notice Period (if negotiable, kindly mention up to how many days):
Serving Notice (Yes/No):
CTC of offer in hand:
Last Working Day (in current organization):
Location of the offer in hand:
************* 5 DAYS WORK FROM OFFICE ****************
Posted 1 week ago
3.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
Senior Manager, Information Systems Automation

What you will do
We are seeking a hands-on, experienced, and dynamic Technical Infrastructure Automation Manager to lead and manage our infrastructure automation initiatives. The ideal candidate will have a strong hands-on background in IT infrastructure, cloud services, and automation tools, along with the leadership skills to guide a team towards improving operational efficiency, reducing manual processes, and ensuring scalability of systems. This role will lead a team of engineers across multiple functions, including Ansible development, ServiceNow development, process automation, and Site Reliability Engineering (SRE), and will be responsible for ensuring the reliability, scalability, and security of automation services. The Infrastructure Automation team is responsible for automating infrastructure provisioning, deployment, configuration management, and monitoring. You will work closely with development, operations, and security teams to drive automation solutions that enhance the overall infrastructure's efficiency and reliability. This role demands the ability to drive and deliver against key organizational strategic initiatives, foster a collaborative environment, and deliver high-quality results in a matrixed organizational structure. Please note, this is an onsite role based in Hyderabad.

Roles & Responsibilities:
- Automation Strategy & Leadership: Lead the development and implementation of infrastructure automation strategies. Collaborate with key collaborators (DevOps, IT Operations, Security, etc.) to define automation goals and ensure alignment with company objectives. Provide leadership and mentorship to a team of engineers, ensuring continuous growth and skill development.
- Infrastructure Automation: Design and implement automation frameworks for infrastructure provisioning, configuration management, and orchestration (e.g., using tools like Terraform, Ansible, Puppet, or Chef). Manage and optimize CI/CD pipelines for infrastructure as code (IaC) to ensure seamless delivery and updates. Work with cloud providers (AWS, Azure, GCP) to implement automation solutions for managing cloud resources and services.
- Process Improvement: Identify areas for process improvement by analyzing current workflows, systems, and infrastructure operations. Create and implement solutions to reduce operational overhead and increase system reliability, scalability, and security. Automate and streamline recurring tasks, including patch management, backups, and system monitoring.
- Collaboration & Communication: Collaborate with multi-functional teams (Development, IT Operations, Security, etc.) to ensure infrastructure automation aligns with business needs. Regularly communicate progress, challenges, and successes to management, offering insights on how automation is driving efficiencies.
- Documentation & Standards: Maintain proper documentation for automation scripts, infrastructure configurations, and processes. Develop and enforce best practices and standards for automation and infrastructure management.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 8-10 years of experience in Observability Operations, with at least 3 years in management; OR
- Bachelor's degree with 10-14 years of experience in Observability Operations, with at least 4 years in management; OR
- Diploma with 14-18 years of experience in Observability Operations, with at least 5 years in management.
- 12+ years of experience in IT infrastructure management, with at least 4+ years in a leadership or managerial role.
- Strong expertise in automation tools and frameworks such as Terraform, Ansible, Chef, Puppet, or similar.
- Proficiency in scripting languages (e.g., Python, Bash, PowerShell).
- Hands-on experience with cloud platforms (AWS) and containerization technologies (Docker, Kubernetes).
- Hands-on knowledge of Infrastructure as Code (IaC) principles and CI/CD pipeline implementation.
- Experience with ServiceNow development and administration.
- Solid understanding of networking, security protocols, and infrastructure design.
- Excellent problem-solving skills and the ability to troubleshoot complex infrastructure issues.
- Strong leadership and communication skills, with the ability to work effectively across teams.

Professional Certifications (Preferred):
- ITIL or PMP Certification
- Red Hat Certified System Administrator
- ServiceNow Certified System Administrator
- AWS Certified Solutions Architect

Preferred Qualifications:
- Strong experience with Ansible, including playbooks, roles, and modules.
- Strong experience with infrastructure-as-code concepts and other automation tools like Terraform or Puppet.
- Strong understanding of user-centered design and of building scalable, high-performing web and mobile interfaces on the ServiceNow platform.
- Proficiency with both Windows and Linux/Unix-based operating systems.
- Knowledge of cloud platforms (AWS, Azure, Google Cloud) and automation techniques in those environments.
- Familiarity with CI/CD tools and processes, particularly the integration of Ansible in pipelines.
- Understanding of version control systems (Git).
- Strong troubleshooting, debugging, and performance optimization skills.
- Experience with hybrid cloud environments and multi-cloud strategies.
- Familiarity with DevOps practices and tools.
- Experience operating within a validated systems environment (FDA, European Agency for the Evaluation of Medicinal Products, Ministry of Health, etc.).

Soft Skills:
- Excellent leadership and team management skills.
- Change management expertise.
- Crisis management capabilities.
- Strong presentation and public speaking skills.
- Analytical mindset with a focus on continuous improvement.
- Detail-oriented with the capacity to manage multiple projects and priorities.
- Self-motivated and able to work independently or as part of a team.
- Strong communication skills to effectively interact with both technical and non-technical collaborators.
- Ability to work effectively with global, virtual teams.

Shift Information: This position is an onsite role and may require working later hours to align with business hours. Candidates must be willing and able to work outside of standard hours as required to meet business needs.
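The desired-state, declarative approach behind tools like Terraform and Ansible that this role centres on can be illustrated with a small, dependency-free Python sketch. The resource names and attributes are hypothetical; this is the reconciliation idea only, not any real tool's implementation:

```python
def reconcile(desired, actual):
    """Compute the actions needed to move `actual` state to `desired` state --
    the core loop behind declarative infrastructure tools: diff, then apply."""
    to_create = {k: v for k, v in desired.items() if k not in actual}
    to_update = {k: v for k, v in desired.items()
                 if k in actual and actual[k] != v}
    to_delete = [k for k in actual if k not in desired]
    return to_create, to_update, to_delete

# Hypothetical inventory: desired config vs. what is currently deployed
desired = {"web-1": {"size": "m5.large"}, "db-1": {"size": "r5.xlarge"}}
actual = {"web-1": {"size": "m5.small"}, "cache-1": {"size": "t3.micro"}}

create, update, delete = reconcile(desired, actual)
```

Because the plan is derived purely from the diff, running it twice is idempotent: once `actual` matches `desired`, every action set comes back empty.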
Posted 1 week ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Job Title: Enterprise Data Architect | Immediate Joiner
Experience: 8-15 Years
Location: Bengaluru (Onsite/Hybrid)
Joining Time: Immediate Joiners Only (0-15 Days)

Job Description
We are looking for an experienced Enterprise Data Architect to join our dynamic team in Bengaluru. This is an exciting opportunity to shape modern data architecture across the finance and colleague (HR) domains using the latest technologies and design patterns.

Key Responsibilities
- Design and implement conceptual and logical data models for the finance and colleague domains.
- Define complex as-is and to-be data architectures, including transition states.
- Develop and maintain data standards, principles, and architecture artifacts.
- Build scalable solutions using data lakes, data warehouses, and data governance platforms.
- Ensure data lineage, quality, and consistency across platforms.
- Translate business requirements into technical solutions for data acquisition, storage, transformation, and governance.
- Collaborate with cross-functional teams on data solution design and delivery.

Required Skills
- Strong communication and stakeholder engagement.
- Hands-on experience with Kimball dimensional modeling and/or Snowflake modeling.
- Expertise in modern cloud data platforms and architecture (AWS, Azure, or GCP).
- Proficiency in building solutions for web, mobile, and tablet platforms.
- Background in Finance and/or Colleague Technology (HR systems) is a strong plus.

Preferred Qualifications
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 8-15 years of experience in data architecture and solution design.
Important Notes
- Immediate joiners only (notice period of at most 15 days).
- Do not apply if you have recently applied or are currently in the Xebia interview process.
- Location: Bengaluru; candidates must be based in Bengaluru or open to relocating immediately.

To Apply
Send your updated resume with the following details to vijay.s@xebia.com:
Full Name:
Total Experience:
Current CTC:
Expected CTC:
Current Location:
Preferred Location:
Notice Period / Last Working Day (if serving notice):
Primary Skill Set:
LinkedIn URL:

Apply now and be part of our exciting transformation journey at Xebia!
Posted 1 week ago
10.0 - 12.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 323226
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India.

Key Responsibilities:
- Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift).
- Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods.
- Enable real-time analytics and machine learning through scalable and modular data frameworks.
- Define data governance models, metadata management, lineage tracking, and access controls.
- Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage.
- Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures.
- Enforce data quality and observability practices for mission-critical workloads.

Required Skills:
- 10+ years in data architecture, with strong grounding in modern data platforms and pipelines.
- Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, and ETL/ELT frameworks.
- Hands-on experience with cloud data platforms (AWS, Azure, GCP).
- Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.).
- Experience implementing data mesh/data fabric concepts is a plus.
- Expertise in writing and presenting technical solutions using tools such as Word, PowerPoint, Excel, and Visio.
- High level of executive presence and the ability to articulate solutions to CXO-level executives.

Preferred Qualifications:
- Certifications in Snowflake, Databricks, or cloud-native data platforms.
- Exposure to AI/ML data pipelines, MLOps, and real-time data applications.
- Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).
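Lineage tracking, one of the governance responsibilities listed above, reduces to walking a dependency graph: given a table, find every upstream source that feeds it. A minimal stdlib Python sketch follows; the table names and the child-to-parents mapping are hypothetical:

```python
def upstream(lineage, node, seen=None):
    """Walk a lineage graph (mapping child -> list of parents) and return
    every upstream dataset feeding `node` -- the basis of impact analysis."""
    if seen is None:
        seen = set()
    for parent in lineage.get(node, []):
        if parent not in seen:
            seen.add(parent)
            upstream(lineage, parent, seen)  # recurse into the parent's parents
    return seen

# Hypothetical pipeline: raw files feed staging tables, staging feeds a mart
lineage = {
    "mart.revenue": ["stg.orders", "stg.fx_rates"],
    "stg.orders": ["raw.orders_export"],
    "stg.fx_rates": ["raw.fx_feed"],
}
sources = upstream(lineage, "mart.revenue")
```

Running the same walk in the reverse direction (parent to children) answers the other governance question: which downstream products break if a source changes.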
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at . NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .
Posted 2 weeks ago
13.0 - 21.0 years
45 - 60 Lacs
Hyderabad
Hybrid
Job Description Summary: As a Data Architect, you will play a pivotal role in defining and implementing common data models and API standards, and in leveraging the Common Information Model (CIM) standard, across a portfolio of products deployed in Critical National Infrastructure (CNI) environments globally. GE Vernova is the leading software provider for the operations of national and regional electricity grids worldwide. Our software solutions range from supporting electricity markets, to enabling grid and network planning, to real-time electricity grid operations. In this senior technical role, you will collaborate closely with lead software architects to ensure secure, performant, and composable designs and implementations across our portfolio.

Job Description
Grid Software (a division of GE Vernova) is driving the vision of GridOS - a portfolio of software running on a common platform to meet the fast-changing needs of the energy sector and support the energy transition. Grid Software has extensive and well-established software stacks that are progressively being ported to a common microservice architecture, delivering a composable suite of applications. Simultaneously, new applications are being designed and built on the same common platform to provide innovative solutions that enable our customers to accelerate the energy transition. This role is for a senior data architect who understands the core designs, principles, and technologies of GridOS.

Key responsibilities include:
- Formalizing Data Models and API Standards: Lead the formalization and standardization of data models and API standards across products to ensure interoperability and efficiency.
- Leveraging CIM Standards: Implement and advocate for the Common Information Model (CIM) standards to ensure consistent data representation and exchange across systems.
- Architecture Reviews and Coordination: Contribute to architecture reviews across the organization as part of Architecture Review Boards (ARB) and the Architecture Decision Record (ADR) process.
- Knowledge Transfer and Collaboration: Work with the Architecture SteerCo and Developer Standard Practices team to establish standard practice around data modeling and API design.
- Documentation: Ensure that data modeling and API standards are accurately documented and maintained, in collaboration with documentation teams.
- Backlog Planning and Dependency Management: Work across software teams to prepare backlog planning and to identify and manage cross-team dependencies around data modeling and API requirements.

Key Knowledge Areas and Expertise
- Data Architecture and Modeling: Extensive experience in designing and implementing data architectures and common data models.
- API Standards: Expertise in defining and implementing API standards to ensure seamless integration and data exchange between systems.
- Common Information Model (CIM): In-depth knowledge of CIM standards and their application within the energy sector.
- Data Mesh and Data Fabric: Understanding of data mesh and data fabric principles, enabling software composability and data-centric design trade-offs.
- Microservice Architecture: Understanding of microservice architecture and software development.
- Kubernetes: Understanding of Kubernetes, including software development in an orchestrated microservice architecture. This includes the Kubernetes API, custom resources, API aggregation, Helm, and manifest standardization.
- CI/CD and DevSecOps: Experience with CI/CD pipelines, DevSecOps practices, and GitOps, especially in secure, air-gapped environments.
- Mobile Software Architecture: Knowledge of mobile software architecture for field crew operations, offline support, and near-real-time operation.
Additional Knowledge (Advantageous but not Essential)
- Energy Industry Technologies: Familiarity with key technologies specific to the energy industry, such as Supervisory Control and Data Acquisition (SCADA), geospatial network modeling, etc.

This is a critical role within Grid Software, requiring a broad range of knowledge and strong organizational and communication skills to drive common architecture, software standards, and principles across the organization.
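The real CIM is a large UML-based standard (IEC 61970/61968) and nothing here should be read as its actual schema; purely as a loose illustration of why a common model matters, the sketch below shows every product sharing the same identifiers and field names for a piece of grid equipment. The class and attribute names are simplified stand-ins:

```python
from dataclasses import dataclass

@dataclass
class IdentifiedObject:
    """Simplified analogue of a CIM base class: every object in the shared
    model carries a master resource identifier (mRID) and a name."""
    mrid: str
    name: str

@dataclass
class Breaker(IdentifiedObject):
    """A switching device; `normal_open` stands in for real switch-state attributes."""
    normal_open: bool = False

def to_exchange_dict(obj):
    """Serialise an object for exchange between systems; because the model is
    shared, every consuming product reads the same field names."""
    return {"mRID": obj.mrid, "name": obj.name, "type": type(obj).__name__}

b = Breaker(mrid="b-1001", name="Feeder 7 breaker", normal_open=False)
payload = to_exchange_dict(b)
```

The payoff of a common model is exactly this: the producer and every consumer agree on `mRID`, `name`, and type names up front, so integration becomes data exchange rather than per-pair field mapping.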
Posted 2 weeks ago
8.0 - 10.0 years
25 - 35 Lacs
Pune
Hybrid
Warm Greetings from Dataceria Software Solutions Pvt Ltd.

We are looking for: Senior Data Scientist. Immediate joiners, send your resume to careers@dataceria.com.
--------------------------------------------------------------------------------------------------------------
We are seeking a highly skilled Senior Data Scientist to lead the development of classification models and customer segmentation strategies for a major investment bank. This role is central to profiling the existing client base on legacy infrastructure and supporting business stakeholders in defining clear migration paths to modern platforms such as Data Mesh.

Responsibilities:
- Lead the design and implementation of classification models to categorize clients based on product usage, service engagement, and behavioral patterns.
- Develop robust customer segmentation strategies to support personalization and migration strategies.
- Collaborate closely with stakeholders across business and technology to understand legacy data landscapes and future-state architecture.
- Oversee and mentor a small team of data scientists and analysts.
- Conduct advanced analysis using Python and SQL; apply machine learning techniques for predictive insights.
- Translate complex data findings into clear business narratives and actionable outcomes.

What we're looking for in our applicants:
- 8+ years of experience in data science, preferably in financial services or investment banking.
- Proven expertise in machine learning, particularly classification and clustering models.
- Advanced proficiency in Python (Pandas, Scikit-learn, etc.) and SQL.
- Experience leading technical teams or mentoring junior data professionals.
- Strong understanding of the client lifecycle in financial institutions and regulatory considerations.
- Familiarity with data migration strategies and legacy modernization projects is a plus.
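In practice the clustering work above would use scikit-learn's KMeans on real client features; purely to illustrate the segmentation idea, here is a dependency-free k-means sketch. The feature names, values, and cluster count are hypothetical:

```python
import math
import random

def kmeans(points, k, iterations=20, seed=42):
    """Tiny k-means: assign each point to its nearest centroid, recompute
    centroids as cluster means, and repeat for a fixed number of rounds."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # seed initial centroids from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[i] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids, clusters

# Hypothetical client features: (product_usage_score, service_engagement_score)
clients = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.85, 0.9), (0.5, 0.5)]
centroids, clusters = kmeans(clients, k=2)
```

Each resulting cluster is a candidate segment; in the migration context described above, segments with similar product usage could then be assigned a shared migration path.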
----------------------------------------------------------------------------------------------------------------------------------------------------
Joining: Immediate
Work location: Pune (hybrid)
Open Positions: Senior Data Scientist

If interested, please share your updated resume to careers@dataceria.com. We welcome applications from skilled candidates who are open to working in a hybrid model. Candidates with less experience but strong technical abilities are also encouraged to apply.
-----------------------------------------------------------------------------------------------------
Thanks,
Dataceria Software Solutions Pvt Ltd
Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/
Email: careers@dataceria.com
------------------------------------------------------------------------------------------------------
Posted 2 weeks ago
7.0 - 12.0 years
0 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to Have: dbt, Data Mesh.

Job Title: Senior GCP Engineer - Data Mesh & Data Product Specialist

We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities
* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of data products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience
* 10+ years (Lead) or 7+ years (Developer) of hands-on experience in designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets, with performance optimization techniques.
* Deep experience working with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP), with hands-on experience using BigQuery.
* Familiarity with Data Mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect
This is a demanding role that requires:
1. A proactive mindset – you take initiative without waiting for instructions.
2. A commitment to excellence – no shortcuts or compromises on quality.
3. Accountability – you own your work end-to-end and deliver on time.
4. Attention to detail – precision matters; mistakes are not acceptable.
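The anomaly-detection responsibility described in this posting often starts with something as simple as flagging unusual daily row counts in a pipeline (in production this might live in a dbt test or a BigQuery scheduled query instead). A minimal stdlib Python sketch, with hypothetical counts:

```python
import statistics

def detect_volume_anomalies(daily_row_counts, threshold=3.0):
    """Flag days whose row count deviates from the mean by more than
    `threshold` standard deviations -- a basic pipeline health check."""
    mean = statistics.mean(daily_row_counts)
    stdev = statistics.pstdev(daily_row_counts)  # population std dev
    if stdev == 0:
        return []  # perfectly constant volume: nothing to flag
    return [i for i, n in enumerate(daily_row_counts)
            if abs(n - mean) / stdev > threshold]

# Hypothetical ingestion volumes for one table; day 4 looks like a failed load
counts = [10_200, 10_150, 10_300, 10_250, 95, 10_180]
anomalies = detect_volume_anomalies(counts, threshold=2.0)
```

A z-score check like this is deliberately crude (one large outlier inflates the standard deviation and masks smaller ones), which is why monitoring frameworks usually layer it with median-based or seasonality-aware checks.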
Posted 2 weeks ago
8.0 - 10.0 years
25 - 35 Lacs
Chennai, Bengaluru
Hybrid
Warm Greetings from Dataceria Software Solutions Pvt Ltd.

We are looking for: Senior Data Scientist.
--------------------------------------------------------------------------------------------------------------
We are seeking a highly skilled Senior Data Scientist to lead the development of classification models and customer segmentation strategies for a major investment bank. This role is central to profiling the existing client base on legacy infrastructure and supporting business stakeholders in defining clear migration paths to modern platforms such as Data Mesh.

Responsibilities:
- Lead the design and implementation of classification models to categorize clients based on product usage, service engagement, and behavioral patterns.
- Develop robust customer segmentation strategies to support personalization and migration strategies.
- Collaborate closely with stakeholders across business and technology to understand legacy data landscapes and future-state architecture.
- Oversee and mentor a small team of data scientists and analysts.
- Conduct advanced analysis using Python and SQL; apply machine learning techniques for predictive insights.
- Translate complex data findings into clear business narratives and actionable outcomes.

What we're looking for in our applicants:
- 8+ years of experience in data science, preferably in financial services or investment banking.
- Proven expertise in machine learning, particularly classification and clustering models.
- Advanced proficiency in Python (Pandas, Scikit-learn, etc.) and SQL.
- Experience leading technical teams or mentoring junior data professionals.
- Strong understanding of the client lifecycle in financial institutions and regulatory considerations.
- Familiarity with data migration strategies and legacy modernization projects is a plus.
----------------------------------------------------------------------------------------------------------------------------------------------------
Joining: Immediate
Work location: Bangalore (hybrid), Chennai
Open Positions: Senior Data Scientist

If interested, please share your updated resume to careers@dataceria.com. We welcome applications from skilled candidates who are open to working in a hybrid model. Candidates with less experience but strong technical abilities are also encouraged to apply.
-----------------------------------------------------------------------------------------------------
Thanks,
Dataceria Software Solutions Pvt Ltd
Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/
Email: careers@dataceria.com
------------------------------------------------------------------------------------------------------
Posted 3 weeks ago
8 - 13 years
25 - 30 Lacs
Chennai, Bangalore Rural, Hyderabad
Work from Office
Company Name: One of the leading general insurance companies in India (Chennai)
Industry: General Insurance
Years of Experience: 7+ years
Location: Chennai
Mail: manjeet.kaur@mounttalent.com

Purpose
The candidate is responsible for designing, creating, deploying, and maintaining the organization's data architecture: ensuring that the organization's data assets are managed effectively and efficiently and used to support the organization's goals and objectives, and ensuring that the organization's data is secure, with appropriate data governance policies and procedures in place to protect those assets.

Key Responsibilities
Responsibilities will include but will not be restricted to:
- Designing and implementing a data architecture that supports the organization's business goals and objectives.
- Developing data models, defining data standards and guidelines, and establishing processes for data integration, migration, and management.
- Creating and maintaining data dictionaries: comprehensive sets of data definitions and metadata that provide context and understanding of the organization's data assets.
- Ensuring that data is accurate, consistent, and reliable across the organization, including establishing data quality metrics and monitoring data quality on an ongoing basis.
- Ensuring the organization's data is secure and that appropriate data governance policies and procedures are in place to protect the organization's data assets.
- Working closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the data architecture is integrated and aligned with other IT systems and applications.
- Staying up to date with new technologies and trends in data management and architecture, and evaluating their potential impact on the organization's data architecture.
Communicating with stakeholders across the organization to understand their data needs and ensure that the data architecture is aligned with the organization's strategic goals and objectives.
Technical Requirements
Bachelor's or master's degree in Computer Science or a related field; certifications in Database Management will be preferred.
Expertise in data modeling and design, including conceptual, logical, and physical data models, with the ability to translate business requirements into data models.
Proficiency in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes.
Expertise in ETL processes (data extraction, transformation, and loading) and the ability to design and implement data integration processes.
Experience with data analysis and reporting tools and techniques, and the ability to design and implement data analysis and reporting processes.
Familiarity with industry-standard data architecture frameworks, such as TOGAF and Zachman, and the ability to apply them to the organization's data architecture.
Familiarity with cloud computing technologies, including public and private clouds, and the ability to design and implement data architectures that leverage cloud computing.
Qualitative Requirements
Able to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
Strong analytical and problem-solving skills.
Able to inspire and motivate a team to achieve organizational goals.
The following skills are good to have but not necessary: Databricks, Snowflake, Redshift, Data Mesh, Medallion, Lambda.
Posted 1 month ago
8 - 12 years
12 - 14 Lacs
Bengaluru
Work from Office
Position Description:
The Boeing Company is currently seeking a high-performing, versatile Experienced Programmer Analyst / Data Engineer to join the Product Systems build team. The Product Systems build team provides comprehensive software solutions to rapidly access and visually transform complex engineering and manufacturing product data. Responsibilities include development and integration for a variety of Commercial Off-the-Shelf (COTS) and in-house software applications supporting our engineering/manufacturing teams. The job requires working within a diverse team of skilled and motivated co-workers to collaborate on results. Other qualities sought in this candidate are a positive attitude, self-motivation, the ability to work in a fast-paced, demanding environment, and the ability to adapt to changing priorities.
Essential Job Functions/Responsibilities:
Hands-on experience in understanding aerospace domain-specific data.
Must coordinate with data scientists on data preparation, exploration, and making data ready.
Must have a clear understanding of defining and monetizing data products.
Must have experience in building self-service capabilities for users.
Build quality checks across the data lineage and take responsibility for designing and implementing different data patterns.
Work on prototyping and evaluate technical feasibility.
Able to influence stakeholders to secure funding and build the vision of the product in terms of usage, productivity, and scalability of the solutions.
Build impactful, outcome-based solutions and products.
Qualifications:
7+ years of experience as a data engineer.
Strong understanding of data warehouse, data lake, and data mesh concepts.
Familiarity with ETL tools and data ingestion patterns.
Hands-on experience in building data pipelines using Azure.
Hands-on experience in writing complex SQL and NoSQL queries.
Hands-on experience with data pipeline orchestration tools such as Azure Data Factory.
Hands-on experience in data modelling.
Experience in data visualization using Power BI or Tableau.
Experience working with global teams with a global mindset.
Mandatory Skills:
Experience in Core Java/Python, SQL, data modelling, Airflow, and Spark.
Experience in BigQuery, Cloud SQL, Pub/Sub, Bigtable, Terraform, DBMS, Dataflow, and GCS.
Experience in Azure services.
Education: Technical bachelor's degree and typically 8 - 12 years of related work experience. A technical degree is defined as any four-year degree, or greater, in a mathematical, scientific, or information technology field of study.
Relocation: This position does offer relocation based on candidate eligibility within India.
Posted 2 months ago