
41 Data Platform Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

12 - 17 Lacs

Chennai

Work from Office

Position Purpose
The requested position is a developer-analyst in an open environment, which requires knowledge of the mainframe environment: TSO, JCL, OPC.

Responsibilities

Direct Responsibilities
For a predefined application scope, take care of:
- Design
- Implementation (coding/parametrization, unit test, assembly test, integration test, system test, support during functional/acceptance tests)
- Roll-out support
- Documentation
- Continuous improvement
Ensure that SLA targets are met for the above activities. Hand over to the Italian teams if the knowledge and skills are not available in ISPL. Coordinate closely with the Data Platform teams and with all other BNL BNP Paribas IT teams (incident coordination, Security, Infrastructure, development teams, etc.). Collaborate with and support the Data Platform teams on Incident Management, Request Management and Change Management.

Contributing Responsibilities
- Contribute to knowledge transfer with the BNL Data Platform team
- Help build team spirit and integrate into the BNL BNP Paribas culture
- Contribute to incident analysis and associated problem management
- Contribute to the ISPL team's acquisition of new skills and knowledge to expand its scope

Technical & Behavioral Competencies
Fundamental skills:
- IBM DataStage
- SQL
- Experience with data modeling and the ERwin tool
Important: knowledge of at least one of the following database technologies is required:
- Teradata
- Oracle
- SQL Server
Basic knowledge of mainframe usage: TSO, ISPF/S, IWS scheduler, JCL.
Nice to have:
- Knowledge of MS SSIS
- Experience with the ServiceNow ticketing system
- Knowledge of requirements collection, analysis, design, development and test activities
- Continuous improvement approaches
- Knowledge of Python
- Knowledge of and experience with RedHat Linux, Windows, AIX, WAS, CFT

Specific Qualifications
Basic knowledge of the Italian language is an advantage.

Behavioural Skills:
- Ability to collaborate / Teamwork
- Ability to share / pass on knowledge
- Ability to deliver / Results driven
- Adaptability

Transversal Skills:
- Ability to develop others and improve their skills
- Ability to manage / facilitate a meeting, seminar, committee, training

Education Level: Bachelor's degree or equivalent
Experience Level: At least 3 years

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Work from Office

Position Purpose
The Application Maintenance team member will support the ISPL Data Platform team to ensure the stability and correct execution of all applications in the Data Platform scope. They will support the development team, help the BNL Data Platform internal team analyze and propose solutions to business requests, and take charge of small development activities. They will also propose solutions to improve performance and prevent failures in the managed applications.

Responsibilities

Direct Responsibilities
Coordinate closely with the Data Platform teams and with all other BNL BNP Paribas IT teams (incident coordination, Security, Infrastructure, development teams, etc.). For a predefined application scope, take care of:
- Ticket Management
- Proposing solutions to improve an application
- Incident Management (including problem determination)
- Request Management
- Change Management
Ensure that SLA targets are met for the above activities. Hand over to the Italian teams if the knowledge and skills are not available in ISPL.

Contributing Responsibilities
- Contribute to the definition of procedures and processes necessary for the team
- Help build team spirit and integrate into the BNL BNP Paribas culture
- Contribute to incident analysis and associated problem management
- Contribute to regular activity reporting and KPI calculation
- Contribute to knowledge transfer with the BNL Data Platform team
- Contribute to the ISPL team's acquisition of new skills and knowledge to expand its scope

Technical & Behavioral Competencies
Fundamental skills:
- Knowledge of mainframe usage: TSO, ISPF/S, scheduler, JCL
- Knowledge of the IBM DataStage ETL tool
- Familiarity with a database technology (Teradata, Oracle, DB2, SQL Server)
- SQL, in order to execute basic scripts and queries
Basic experience with:
- The ServiceNow ticketing system
- The Aurelia Remedy ticketing system
Nice to have:
- General IT infrastructure knowledge
- Knowledge of requirements collection, analysis, design, development and test activities
- Continuous improvement approaches
Good written and spoken English; able to communicate efficiently; a good team player.

Specific Qualifications
Basic knowledge of the Italian language is an advantage.

Behavioural Skills:
- Ability to collaborate / Teamwork
- Client focused
- Ability to deliver / Results driven
- Ability to share / pass on knowledge

Transversal Skills:
- Ability to develop and adapt a process
- Ability to anticipate business / strategic evolution
- Ability to set up relevant performance indicators
- Ability to understand, explain and support change

Education Level: Bachelor's degree or equivalent

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

The Core AI, BI & Data Platforms team was established to create, operate and run the enterprise AI, BI and data platforms that shorten the time to market for reporting, analytics and data science teams to run experiments, train models and generate insights, and to evolve and run the CoCounsel application and its shared CoCounsel AI Assistant capability. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.

At Thomson Reuters, we are recruiting a team of motivated cloud professionals to transform how we build, manage and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.

About the Role
In this opportunity as Software Engineer, you will:
- Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer
- Innovate with new approaches to meet data management requirements
- Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders
- Contribute to improving the customer experience
- Participate in code reviews to maintain a high-quality codebase
- Collaborate with cross-functional teams to define, design, and ship new features
- Work closely with product owners, designers, and other developers to understand requirements and deliver solutions
- Effectively communicate and liaise across the data platform and management teams
- Stay updated on emerging trends and technologies in cloud computing

About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of relevant experience implementing data lakes and data management technologies for large-scale organizations
- Experience building and maintaining data pipelines with excellent runtime characteristics such as low latency, fault tolerance and high availability
- Proficiency in the Python programming language
- Experience with AWS services and management, including serverless, container, queueing and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, SNS
- Good knowledge of consuming and building APIs
- Business intelligence tools such as Power BI
- Fluency in query languages such as SQL
- Solid understanding of software development practices such as version control via Git, CI/CD and release management
- Agile development cadence
- Good critical thinking, communication, documentation, troubleshooting and collaborative skills
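As a loose illustration of the serverless AWS work this posting describes (not Thomson Reuters' actual code), here is a minimal Python sketch of an S3-triggered Lambda handler using boto3; the event wiring, bucket, and key names are assumptions for illustration only.

```python
# Minimal sketch: an AWS Lambda handler that inspects a newly landed S3
# object and logs a row count. Bucket/key names come from the S3 event;
# everything else here is hypothetical.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # An S3 put-event carries one record per created object
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read().decode("utf-8")
        print(json.dumps({"bucket": bucket, "key": key, "rows": len(body.splitlines())}))
    return {"status": "ok"}
```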

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Pune, Maharashtra, India

On-site

The Role:
We are seeking a Senior Software Engineer who will:
- Perform data ingestion, aggregation, and processing to drive and enable relevant insights from available data sets
- Partner with various teams (e.g., Product Manager, Data Science, Platform Strategy, Technology) on data needs and requirements in order to deliver data solutions that generate business value
- Manipulate and analyze complex, high-volume, high-dimensionality data from varying sources using a variety of tools and data analysis techniques
- Identify innovative ideas and deliver proofs of concept and prototypes against existing and future needs, and propose new products, services and enhancements
- Integrate and unify new data assets that increase the value proposition for our customers and enhance our existing solutions and services
- Analyze large volumes of transaction and product data to generate insights and actionable recommendations that drive business growth
- Collect and synthesize feedback from clients, development, product and sales teams for new solutions or product enhancements
- Apply knowledge of metrics, measurement, and benchmarking to complex and demanding solutions

All About You:
- Minimum 5-8 years of relevant experience
- Good understanding of programming languages, preferably PySpark, and Big Data technologies
- Experience with an enterprise business intelligence platform / data platform
- Strong SQL and higher-level programming languages, with solid knowledge of data mining, machine learning algorithms and tools
- Experience with data integration ETL/ELT tools (e.g., Apache NiFi, Azure Data Factory, Databricks)
- Exposure to collecting and/or working with data, including standardizing, summarizing, offering initial observations and highlighting inconsistencies
- Strong understanding of the application of analytical methods and data visualization to support business decisions
- Ability to understand complex operational systems and analytics/business intelligence tools for delivering information products and analytical offerings to a large, global user base
- Able to work in a fast-paced, deadline-driven environment, both as part of a team and as an individual contributor
- Ability to move easily between business, analytical, and technical teams and articulate solution requirements for each group
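Since the role centers on PySpark aggregation over high-volume transaction data, here is a minimal, illustrative rollup; the input path, column names, and output layout are assumptions, not details from the posting.

```python
# Hedged sketch: aggregate spend per merchant category per day - a typical
# "insight" rollup over transaction data. Paths and columns are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-insights").getOrCreate()

txns = spark.read.parquet("s3://example-bucket/transactions/")  # hypothetical path

daily = (
    txns.groupBy("merchant_category", F.to_date("txn_ts").alias("txn_date"))
        .agg(
            F.count("*").alias("txn_count"),
            F.sum("amount").alias("total_amount"),
            F.avg("amount").alias("avg_amount"),
        )
)
daily.write.mode("overwrite").partitionBy("txn_date").parquet("s3://example-bucket/rollups/daily/")
```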

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Bengaluru

Remote

Company name: PulseData labs Pvt Ltd

Job Summary:
We are seeking a detail-oriented and results-driven Project Manager to lead projects within our DevOps team, with a strong focus on DevOps project implementations. The ideal candidate will have experience managing end-to-end delivery of AWS cloud-based DevOps projects and collaborating cross-functionally with engineering, analytics, DevOps, and business stakeholders.

Key Responsibilities:
- Lead and manage full-lifecycle projects related to data platform initiatives, especially Databricks-based solutions on AWS or Azure
- Develop and maintain project plans, schedules, budgets, and resource forecasts using tools like Jira, MS Project, or similar
- Coordinate across technical teams (engineering, ML, DevOps) and business units to define scope, deliverables, and success metrics
- Facilitate sprint planning, daily stand-ups, retrospectives, and status reporting following Agile/Scrum or hybrid methodologies
- Identify risks, dependencies, and blockers early; drive resolution through mitigation plans and stakeholder communication
- Manage vendor relationships (where applicable), ensuring delivery quality, alignment with architecture standards, and on-time execution
- Ensure compliance with data governance, security, and documentation standards
- Communicate regularly with senior leadership on project status, KPIs, and key decisions

Required Qualifications:
- 5+ years of experience managing technical or data-related projects, with at least 2 years in cloud data platforms
- Proven experience leading projects involving AWS
- Solid understanding of Agile delivery practices, change management, and cross-functional coordination
- Proficiency in project tracking tools (Jira, Confluence, Smartsheet, or Microsoft Project)
- Exceptional written and verbal communication skills; able to translate technical concepts for business audiences

Preferred Qualifications:
- PMP, PMI-ACP, or Certified Scrum Master (CSM) certification
- Prior experience on multi-cloud platforms
- Familiarity with tools such as Airflow, Unity Catalog, Power BI/Tableau, and Git-based CI/CD processes

Soft Skills:
- Strong leadership and stakeholder management
- Proactive problem solver with a bias for execution
- Excellent time management and multitasking ability
- Comfortable working in a fast-paced, evolving environment

Posted 1 month ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Essential Skills/Experience:
- A Master's degree in Computer Science, Information Systems, Engineering, Business, or a related scientific/technical field preferred
- Minimum of 10 years of experience in data engineering, business analysis, and data management
- Exceptional verbal and written communication skills; ability to convey analytical insights in actionable business terms
- Highly motivated self-starter with the confidence to present complex information effectively to all audiences
- Strong analytical, logical-thinking, and organizational skills; capable of managing multiple projects simultaneously
- Ability to anticipate future business trends and integrate them into IT and business practices
- Proven track record of effective functional and cross-functional collaboration and leadership
- Diligent self-starter; able to work independently and in a team environment
- Desire and ability to learn and implement new tools and analytic capabilities
- Experience designing methods, processes, and systems for consolidating and analyzing structured and unstructured data from diverse sources
- Experience developing advanced software applications, algorithms, querying, and automated processes for data evaluation
- Proven ability to design complex, large-scale data solutions that are scalable, robust, secure, and resilient
- Pharmaceutical or Life Sciences industry experience a plus
- Experience using dbt, Fivetran, GitHub, Apache Airflow
- Extensive hands-on experience with SQL, Python, ETL/ELT frameworks, and data orchestration pipelines
- AWS architecture framework knowledge and certification
- Expertise in Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, data sharing, time travel, SnowSQL, Snowpipe, Streamlit, Cortex
- Experience with data quality and observability tools and methodologies
- Understanding of FAIR and TRUSTed data product principles
- Knowledge of data governance frameworks and compliance standards relevant to the life sciences industry (GDPR/HIPAA)
- Experience with ETL/ELT/data loading tools using Apache Airflow and AWS Glue with Python
- Experience applying AI technologies to ELT processes and automating self-healing data pipelines
- Experience working with data science operations teams using serverless architectures, Kubernetes, and Docker/containerization
- Solid understanding of analytic data architecture and data modeling concepts and principles (data lakes, warehouses, marts)
- Data warehousing methodologies and modeling techniques (Kimball, 3NF, star schema)

Desirable Skills/Experience:
- 10+ years of prior experience as a Data Platform or Technical Leader in the biotech/pharma industry
- Advanced experience with cloud platforms beyond AWS (Azure, Google Cloud, Databricks) for data engineering and storage solutions
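The posting leans on Apache Airflow for orchestration. As a hedged sketch (not this employer's actual pipeline), here is a minimal Airflow 2.x DAG with three placeholder ELT tasks; the DAG id, schedule, and task bodies are invented for illustration.

```python
# Minimal Apache Airflow 2.x DAG sketch of an extract -> load -> transform
# chain. All names here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source system")

def load():
    print("load into warehouse staging")

def transform():
    print("run modeled transformations")

with DAG(
    dag_id="finance_elt_example",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t3 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2 >> t3
```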

Posted 1 month ago

Apply

5.0 - 8.0 years

20 - 30 Lacs

Bengaluru

Remote

Company name: PulseData labs Pvt Ltd (captive unit for URUS, USA)

About URUS
We are the URUS family (US), a global leader in products and services for Agritech.

Job Summary:
We are seeking a detail-oriented and results-driven Project Manager to lead projects within our DevOps team, with a strong focus on DevOps project implementations. The ideal candidate will have experience managing end-to-end delivery of AWS cloud-based DevOps projects and collaborating cross-functionally with engineering, analytics, DevOps, and business stakeholders.

Key Responsibilities:
- Lead and manage full-lifecycle projects related to data platform initiatives, especially Databricks-based solutions on AWS or Azure
- Develop and maintain project plans, schedules, budgets, and resource forecasts using tools like Jira, MS Project, or similar
- Coordinate across technical teams (engineering, ML, DevOps) and business units to define scope, deliverables, and success metrics
- Facilitate sprint planning, daily stand-ups, retrospectives, and status reporting following Agile/Scrum or hybrid methodologies
- Identify risks, dependencies, and blockers early; drive resolution through mitigation plans and stakeholder communication
- Manage vendor relationships (where applicable), ensuring delivery quality, alignment with architecture standards, and on-time execution
- Ensure compliance with data governance, security, and documentation standards
- Communicate regularly with senior leadership on project status, KPIs, and key decisions

Required Qualifications:
- 5+ years of experience managing technical or data-related projects, with at least 2 years in cloud data platforms
- Proven experience leading projects involving AWS
- Solid understanding of Agile delivery practices, change management, and cross-functional coordination
- Proficiency in project tracking tools (Jira, Confluence, Smartsheet, or Microsoft Project)
- Exceptional written and verbal communication skills; able to translate technical concepts for business audiences

Preferred Qualifications:
- PMP, PMI-ACP, or Certified Scrum Master (CSM) certification
- Prior experience on multi-cloud platforms
- Familiarity with tools such as Airflow, Unity Catalog, Power BI/Tableau, and Git-based CI/CD processes

Soft Skills:
- Strong leadership and stakeholder management
- Proactive problem solver with a bias for execution
- Excellent time management and multitasking ability
- Comfortable working in a fast-paced, evolving environment

Posted 1 month ago

Apply

13.0 - 20.0 years

40 - 45 Lacs

Bengaluru

Work from Office

Principal Architect - Platform & Application Architect

Title: Principal Architect
Location: Onsite, Bangalore
Experience: 15+ years in software and data platform architecture and technology strategy, including 5+ years in architectural leadership roles
Education: Bachelor's/Master's in CS, Engineering, or a related field

Role Overview
We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities will span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform is adaptable to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products.

Key Responsibilities

1. Architecture & Strategy
- Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components)
- Define and document data blueprints, data domain models, and architectural standards
- Lead build-vs-buy evaluations for platform components and recommend best-fit tools and technologies

2. Data Ingestion & Processing
- Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte (a toy Kafka sketch follows this posting)
- Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster)
- Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data

3. Storage & Modeling
- Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi)
- Develop efficient data strategies for both OLAP and OLTP workloads
- Guide schema evolution, data versioning, and performance tuning

4. Governance, Security, and Compliance
- Establish data governance, cataloging, and lineage tracking frameworks
- Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc.
- Promote standardization and best practices across business units

5. Platform Engineering & DevOps
- Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines
- Ensure observability, reliability, and cost efficiency of the platform
- Define SLAs, capacity planning, and disaster recovery plans

6. Collaboration & Mentorship
- Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals
- Mentor teams on architecture principles, technology choices, and operational excellence

Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 12+ years of experience in software engineering, including 5+ years in architectural leadership roles
- Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js
- Strong hands-on experience building scalable data platforms in on-premise, hybrid, or cloud environments
- Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg
- Familiarity with data mesh, data fabric, and lakehouse paradigms
- Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles
- Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms
- Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels
- Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable
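As a toy illustration of the Kafka ingestion pattern named in section 2 above (not this employer's architecture), here is a minimal Python consumer; the broker address, topic name, and batch size are assumptions, and a production platform would land these events in Delta Lake or Iceberg rather than printing.

```python
# Hedged sketch: consume IoT events from Kafka and flush micro-batches.
# Topic/broker names are hypothetical. Requires: pip install kafka-python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "iot-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 1000:
        # A real platform would write this batch to the lakehouse;
        # here we only simulate the flush.
        print(f"flushing {len(batch)} events")
        batch.clear()
```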

Posted 1 month ago

Apply

10.0 - 15.0 years

1 - 2 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimized to support data ingestion, transformation, and access across the finance organization.

As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritize platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs.

Responsibilities
- Platform Strategy & Vision: Define and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives.
- Technical Requirements: Collaborate with architects, data engineers, and governance teams to define and prioritize platform capabilities, including scalability, security, resilience, and data lineage.
- Integration Management: Ensure the platform integrates seamlessly with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption.
- Infrastructure Coordination: Work closely with infrastructure and DevOps teams to ensure platform performance, cost optimization, and alignment with enterprise architecture standards.
- Governance & Compliance: Partner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols.
- Backlog Management: Own and prioritize the platform development backlog, balancing technical needs with business priorities and ensuring timely delivery of enhancements.
- Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams.
- Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams.
- Continuous Improvement: Regularly assess platform performance, identify areas for optimization, and champion initiatives that enhance reliability, scalability, and efficiency.
- Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity.

Skills
- Proven 10+ years of experience as a Product Manager focused on data platforms, infrastructure, or similar technical products
- Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems
- Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures
- Understanding of data governance, security, and compliance best practices
- Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security)
- Experience managing product backlogs and roadmaps in an Agile environment
- Ability to balance technical depth with business acumen to drive effective decision-making

Nice to have
- Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs
- Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory)
- Experience transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions
- Familiarity with cost optimization strategies for cloud platforms

Posted 1 month ago

Apply

0.0 - 5.0 years

1 - 1 Lacs

Kolkata

Work from Office

Hiring freshers for a remote (3-month renewable) contract role; permanent after 1 year. Responsibilities include setting up virtual labs, writing technical documentation, and supporting website performance. A laptop and internet connection are a must. Opportunity to learn cloud and data tools.

Posted 2 months ago

Apply

4.0 - 8.0 years

5 - 12 Lacs

Bengaluru

Work from Office

If interested, apply here: https://forms.gle/sBcZaUXpkttdrTtH9

Key Responsibilities
- Work with Product Owners and stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions, and design the scale-out architecture for the data platform to meet the requirements of the proposed solution
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies
- Play an active role in leading team meetings and workshops with clients
- Help the Data Engineering team produce high-quality code that allows us to put solutions into production
- Create and own the technical product backlogs for data projects, and help the team close the backlogs on time
- Help us shape the next generation of our products
- Assess the effectiveness and accuracy of new data sources and data gathering techniques
- Lead data mining and collection procedures
- Ensure data quality and integrity
- Interpret and analyze data problems
- Develop custom data models and algorithms to apply to data sets
- Coordinate with different functional teams to implement models and monitor outcomes
- Develop processes and tools to monitor and analyze model performance and data accuracy
- Understand the client requirements and architect robust data platforms on multiple cloud technologies
- Create reusable and scalable data pipelines
- Work with DE/DA/ETL/QA/Application and various other teams to remove roadblocks
- Align data projects with organizational goals

Skills & Qualifications
- We're looking for someone with 4-7 years of experience who has worked through large data engineering projects
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
- Strong problem-solving skills with an emphasis on product development
- Domain: Big Data, Data Platform, Distributed Systems
- Coding: any language (Java/Scala/Python) with strong knowledge of Spark (the most important requirement)
- Ingestion skills: one of Apache Storm, Flink, Spark
- Streaming skills: one of Kafka, Kinesis, oplogs, binlogs, Debezium
- Database skills: HDFS, Delta Lake/Iceberg, Lakehouse

If interested, apply here: https://forms.gle/sBcZaUXpkttdrTtH9
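A hedged sketch of the Kafka-to-Spark streaming skills this posting lists: Spark Structured Streaming reading a Kafka topic and appending to a Delta Lake path. The broker, topic, and paths are invented, and the job assumes the spark-sql-kafka and delta-spark packages are available on the cluster.

```python
# Illustrative only: Kafka -> Spark Structured Streaming -> Delta Lake.
# All names/paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

raw = (
    spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
)

# Kafka delivers binary key/value; cast to strings for downstream parsing
parsed = raw.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("delta")   # requires delta-spark on the cluster
        .option("checkpointLocation", "/tmp/checkpoints/events")
        .outputMode("append")
        .start("/tmp/lake/events")
)
query.awaitTermination()
```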

Posted 2 months ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

JD:
- Good experience in Apache Iceberg, Apache Spark, Trino
- Proficiency in SQL and data modeling
- Experience with open Data Lakehouse using Apache Iceberg
- Experience with Data Lakehouse architecture with Apache Iceberg and Trino
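As a small illustration of the Spark-plus-Iceberg stack in this JD (a sketch, not the employer's setup), here is a Python snippet that registers a local Iceberg catalog and runs basic SQL against it; the catalog name, warehouse path, and table are assumptions, and the iceberg-spark-runtime package must be on the classpath.

```python
# Hedged sketch: a Spark session with a Hadoop-type Iceberg catalog named
# "lake" (hypothetical), plus create/insert/query via Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-demo")
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.type", "hadoop")
        .config("spark.sql.catalog.lake.warehouse", "/tmp/warehouse")
        .getOrCreate()
)

spark.sql(
    "CREATE TABLE IF NOT EXISTS lake.db.orders "
    "(id BIGINT, amount DOUBLE, ts TIMESTAMP) USING iceberg"
)
spark.sql("INSERT INTO lake.db.orders VALUES (1, 9.99, current_timestamp())")
spark.sql("SELECT count(*) AS n, sum(amount) AS total FROM lake.db.orders").show()
```

The same table would then be queryable from Trino through an Iceberg catalog, which is the interoperability the JD is after.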

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis.

Key Responsibilities:
- Architect distributed data systems
- Enable data discoverability and quality
- Develop data tooling and platform APIs

Required Skills & Qualifications:
- Experience with Spark, Kafka, and Delta Lake
- Proficiency in Python, Scala, or Java
- Familiarity with cloud-based data platforms

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and your preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
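As a loose sketch of the "data quality" responsibility above (not Integra's actual tooling), here is a minimal PySpark quality gate; the Delta table path, column name, and threshold are assumptions.

```python
# Hedged sketch: fail a pipeline run if too many rows lack a key column.
# Table path, column, and the 1% threshold are all hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.format("delta").load("/tmp/lake/events")  # hypothetical table

total = df.count()
null_keys = df.filter(F.col("key").isNull()).count()
null_ratio = null_keys / total if total else 0.0

assert null_ratio <= 0.01, f"null-key ratio too high: {null_ratio:.2%}"
print(f"rows={total}, null_keys={null_keys}, ratio={null_ratio:.2%}")
```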

Posted 2 months ago

Apply

5.0 - 10.0 years

18 - 22 Lacs

Hyderabad

Work from Office

- Hands-on experience with the design, development, and support of data pipelines
- Strong SQL programming skills (joins, subqueries, queries with analytical functions, stored procedures, functions, etc.)
- Hands-on experience using statistical methods for data analysis
- Experience with data platform and visualization technologies such as Google PLX dashboards, Data Studio, Tableau, Pandas, Qlik Sense, Splunk, Humio, Grafana
- Experience in web development: HTML, CSS, jQuery, Bootstrap
- Experience with machine learning packages such as scikit-learn, NumPy, SciPy, Pandas, NLTK, BeautifulSoup, Matplotlib, Statsmodels
- Strong design and development skills with meticulous attention to detail
- Familiarity with Agile software development practices and working in an agile environment
- Strong analytical, troubleshooting, and organizational skills
- Ability to analyze and troubleshoot complex issues, and proficiency in multitasking
- Ability to navigate ambiguity
- BS degree in Computer Science, Math, Statistics, or equivalent academic credentials
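Because the posting pairs SQL analytical functions with Pandas/Matplotlib visualization, here is a tiny illustrative script computing a rolling average (the Pandas analogue of a SQL window function) and plotting it; the data is synthetic and the column names are invented.

```python
# Hedged sketch: rolling 7-day average over a synthetic metric, then a plot.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({
    "day": pd.date_range("2024-01-01", periods=30),
    "latency_ms": [120 + (i % 7) * 15 for i in range(30)],  # synthetic data
})

# Rolling mean - comparable to AVG(...) OVER (ORDER BY day ROWS 6 PRECEDING) in SQL
df["latency_7d_avg"] = df["latency_ms"].rolling(window=7, min_periods=1).mean()

df.plot(x="day", y=["latency_ms", "latency_7d_avg"], title="Latency trend")
plt.tight_layout()
plt.savefig("latency_trend.png")
```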

Posted 2 months ago

Apply

13 - 20 years

19 - 34 Lacs

Pune, Chennai, Mumbai (All Areas)

Work from Office

Role: Enterprise Architect / Technology Advisor
Required Technical Skill Set: Digital Transformation, Architecture Assessment & Definition
Experience: 15 to 20 years
Location: PAN India

Desired Competencies (Technical/Behavioral)

Must-Have:
1. Digital Architecture: strong experience/knowledge of digital architecture (cloud-native, data platforms, etc.), Industry 4.0 / IoT architectures, hybrid-landscape architectures, and composable architectures
2. IT Architecture & Transformation: understands an organization's current and future technology needs; can plan, coordinate, and implement technological improvements; can create architecture principles and standards; able to define architectures for mid-to-large enterprise manufacturing customers
3. Distributed Systems Design: understands distributed systems design in depth; can guide others in evaluating differing design options
4. Research and Analysis Processes: in-depth understanding of the research and analysis process; can identify the shortcomings of candidate solutions

Good-to-Have:
1. Understanding of plants, supply chain, and enterprise systems
2. Technology and architecture thought leadership
3. Strong communication and proactiveness

Responsibilities / Expectations from the Role:
- Responsible for the assessment and definition of a technology direction that aligns with the IoT Connected strategy in order to meet complex customer requirements
- Anchor transformation programs spanning business requirement oversight, technology strategy and architecture, and customer stakeholder alignment
- Key stakeholder mapping in business and IT
- Responsible for the linkage between enterprise planning and architecture current- and future-state visions via technology roadmaps
- Understand market and customer directions and develop internal strategies for specific areas/technologies to be planned in the future
- Work with other specialist teams and anchor key initiatives for units such as Digital Twin and industry-specific reference architectures

Posted 2 months ago

Apply

4 - 9 years

0 - 0 Lacs

Bengaluru

Work from Office

Data Engineer
Location: Bangalore

We are looking for a skilled and motivated Data Engineer II to join our growing data team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure to support data-driven decision making across the organization. You will work closely with data scientists, analysts, and other engineers to ensure the availability, reliability, and performance of our data systems. The ideal candidate has a strong foundation in data engineering principles, hands-on experience with modern data technologies, and a passion for solving complex data challenges.

Key Responsibilities
- Work with Product Owners and stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions, and design the scale-out architecture for the data platform to meet the requirements of the proposed solution
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies
- Play an active role in leading team meetings and workshops with clients
- Help the Data Engineering team produce high-quality code that allows us to put solutions into production
- Create and own the technical product backlogs for data projects, and help the team close the backlogs on time
- Help us shape the next generation of our products
- Assess the effectiveness and accuracy of new data sources and data gathering techniques
- Lead data mining and collection procedures
- Ensure data quality and integrity
- Interpret and analyze data problems
- Develop custom data models and algorithms to apply to data sets
- Coordinate with different functional teams to implement models and monitor outcomes
- Develop processes and tools to monitor and analyze model performance and data accuracy
- Understand the client requirements and architect robust data platforms on multiple cloud technologies
- Create reusable and scalable data pipelines (see the sketch after this posting)
- Work with DE/DA/ETL/QA/Application and various other teams to remove roadblocks
- Align data projects with organizational goals

Skills & Qualifications
- We're looking for someone with 4-7 years of experience who has worked through large data engineering projects
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
- Strong problem-solving skills with an emphasis on product development
- Domain: Big Data, Data Platform, Distributed Systems
- Coding: any language (Java/Scala/Python) with strong knowledge of Spark (the most important requirement)
- Ingestion skills: one of Apache Storm, Flink, Spark
- Streaming skills: one of Kafka, Kinesis, oplogs, binlogs, Debezium
- Database skills: HDFS, Delta Lake/Iceberg, Lakehouse
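As a hedged sketch of the "reusable and scalable data pipelines" item above (not this team's codebase), here is a generic PySpark step that factors read/transform/write into one reusable function; the paths, column names, and transform are invented for illustration.

```python
# Hedged sketch: one reusable pipeline step - read, apply a caller-supplied
# transform, write. All paths and columns below are hypothetical.
from pyspark.sql import DataFrame, SparkSession

def run_step(spark: SparkSession, src: str, dest: str, transform) -> None:
    """Read parquet from src, apply transform(df), write parquet to dest."""
    df: DataFrame = spark.read.parquet(src)
    transform(df).write.mode("overwrite").parquet(dest)

if __name__ == "__main__":
    spark = SparkSession.builder.appName("reusable-step").getOrCreate()
    run_step(
        spark,
        src="/tmp/raw/orders",       # hypothetical input
        dest="/tmp/curated/orders",  # hypothetical output
        transform=lambda df: df.dropDuplicates(["order_id"]),
    )
```

Factoring the transform out as a callable is one simple way each new pipeline reuses the same I/O, checkpointing, and error-handling scaffolding.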

Posted 2 months ago

Apply
Page 2 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
