3.0 - 6.0 years
9 - 14 Lacs
Mumbai
Work from Office
Role Overview: We are looking for a Talend Data Catalog Specialist to drive enterprise data governance initiatives by implementing Talend Data Catalog and integrating it with Apache Atlas for unified metadata management within a Cloudera-based data lakehouse. The role involves establishing metadata lineage, glossary harmonization, and governance policies to enhance trust, discovery, and compliance across the data ecosystem.
Key Responsibilities:
o Set up and configure Talend Data Catalog to ingest and manage metadata from source systems, the data lake (HDFS), Iceberg tables, the Hive metastore, and external data sources.
o Develop and maintain business glossaries, data classifications, and metadata models.
o Design and implement bi-directional integration between Talend Data Catalog and Apache Atlas to enable metadata synchronization, lineage capture, and policy alignment across the Cloudera stack.
o Map technical metadata from Hive/Impala to business metadata defined in Talend.
o Capture end-to-end lineage of data pipelines (e.g., from ingestion in PySpark to consumption in BI tools) using Talend and Atlas.
o Provide impact analysis for schema changes, data transformations, and governance rule enforcement.
o Support the definition and rollout of enterprise data governance policies (e.g., ownership, stewardship, access control).
o Enable role-based metadata access, tagging, and data sensitivity classification.
o Work with data owners, stewards, and architects to ensure data assets are well-documented, governed, and discoverable.
o Provide training to users on leveraging the catalog for search, understanding, and reuse.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
6–12 years in data governance or metadata management, with at least 2–3 years in Talend Data Catalog.
Talend Data Catalog, Apache Atlas, Cloudera CDP, Hive/Impala, Spark, HDFS, SQL.
Business glossary, metadata enrichment, lineage tracking, stewardship workflows.
Hands-on experience in Talend–Atlas integration, either through REST APIs, Kafka hooks, or metadata bridges (see the sketch below).
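For context on the REST-API route to the Talend–Atlas integration mentioned above, here is a minimal sketch of registering a table entity in Apache Atlas over its v2 REST API. The host, credentials, and table names are illustrative placeholders, not part of the posting:

```python
import requests

# Hedged sketch: push a Hive table entity into Apache Atlas via its v2 REST API,
# one common bridge between an external catalog and Atlas. Host, credentials,
# and qualifiedName below are placeholders.
ATLAS = "http://atlas-host:21000/api/atlas/v2"
AUTH = ("admin", "admin")  # replace with real credentials in a secured cluster

entity = {
    "entity": {
        "typeName": "hive_table",
        "attributes": {
            "qualifiedName": "sales.orders@cluster1",  # hypothetical table
            "name": "orders",
            "description": "Synced from Talend Data Catalog",
        },
    }
}

resp = requests.post(f"{ATLAS}/entity", json=entity, auth=AUTH, timeout=30)
resp.raise_for_status()
print(resp.json().get("guidAssignments"))  # GUIDs Atlas assigned to the entity
```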
Posted 3 weeks ago
3.0 - 7.0 years
6 - 10 Lacs
Mumbai
Work from Office
Role Overview: Looking for a Kafka SME to design and support real-time data ingestion pipelines using Kafka within a Cloudera-based Lakehouse architecture.
Key Responsibilities:
Design Kafka topics, partitions, and schema registry
Implement producer-consumer apps using Spark Structured Streaming (see the sketch below)
Set up Kafka Connect, monitoring, and alerts
Ensure secure, scalable message delivery
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
Skills Required:
Deep understanding of Kafka internals and ecosystem
Integration with Cloudera and NiFi
Schema evolution and serialization (Avro, Parquet)
Performance tuning and fault tolerance
Preferred technical and professional experience
Good communication skills. India market experience is preferred.
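A minimal sketch of the consumer side named above, assuming Spark Structured Streaming reading from Kafka; the broker address, topic, and paths are placeholders, and the job needs the spark-sql-kafka connector on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Hedged sketch: Spark Structured Streaming consuming a Kafka topic and landing
# records as Parquet. Requires the spark-sql-kafka package at submit time.
spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "orders")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/landing/orders")         # hypothetical sink path
    .option("checkpointLocation", "/chk/orders")    # required for fault tolerance
    .start()
)
query.awaitTermination()
```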
Posted 3 weeks ago
3.0 - 5.0 years
8 - 12 Lacs
Gurugram, Delhi
Work from Office
Role Description
This is a full-time hybrid role for an Apache NiFi Developer based in Gurugram with some work-from-home options. The Apache NiFi Developer will be responsible for designing, developing, and maintaining data workflows and pipelines. The role includes programming, implementing backend web development solutions, using object-oriented programming (OOP) principles, and collaborating with team members to enhance software solutions.
Qualifications
Knowledge of Apache NiFi and experience in programming
Skills in back-end web development, software development, and data pipelines
Strong understanding of Apache NiFi
Background in Computer Science
Excellent problem-solving and analytical skills
Ability to work in a hybrid environment
Experience in AI and Blockchain is a plus
Bachelor's degree in Computer Science or related field
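NiFi flows are usually built in its UI, but deployments are commonly scripted against the /nifi-api REST interface; a minimal sketch follows, with the host and process-group id as placeholders (a secured cluster would also need a bearer token):

```python
import requests

# Hedged sketch: start every processor in a NiFi process group via the REST API.
NIFI = "http://nifi-host:8080/nifi-api"
PG_ID = "hypothetical-process-group-id"  # placeholder id copied from the NiFi UI

# Schedule all components of the group to RUNNING.
resp = requests.put(
    f"{NIFI}/flow/process-groups/{PG_ID}",
    json={"id": PG_ID, "state": "RUNNING"},
    timeout=30,
)
resp.raise_for_status()
print(resp.status_code)  # 200 indicates the schedule request was accepted
```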
Posted 3 weeks ago
2.0 - 4.0 years
3 - 8 Lacs
Coimbatore
Work from Office
We are looking for an experienced ETL Developer to join our team. The ideal candidate will have strong experience in ETL tools and processes, particularly with Talend, Informatica, Apache NiFi, Pentaho, or SSIS. The role requires excellent technical knowledge of databases, particularly MySQL, and a strong ability to integrate data from multiple sources. The candidate must also have strong manual testing skills and experience using version control systems such as Git, along with project tracking tools like JIRA.
Key Responsibilities:
Design, develop, and implement ETL processes using tools like Talend, Informatica, Apache NiFi, Pentaho, or SSIS (a minimal sketch follows this posting).
Develop and maintain data pipelines to integrate data from various sources including APIs, cloud storage, and third-party applications.
Perform data mapping, data transformation, and data cleansing to ensure data quality.
Write complex SQL queries for data extraction, transformation, and loading from MySQL databases.
Collaborate with cross-functional teams to understand data requirements and provide scalable solutions.
Conduct manual testing to ensure the accuracy and performance of ETL processes and data.
Manage version control with Git, and track project progress in JIRA.
Troubleshoot and resolve issues related to ETL processes, data integration, and testing.
Ensure adherence to best practices for ETL design, testing, and documentation.
Required Skills:
2.5 to 4 years of experience in ETL development with tools such as Talend, Informatica, Apache NiFi, Pentaho, or SSIS.
Strong hands-on experience with MySQL databases.
Proven ability to integrate data from diverse sources (APIs, cloud storage, third-party apps).
Solid manual testing experience, with a focus on ensuring data accuracy and process integrity.
Familiarity with Git for version control and JIRA for project management.
Strong problem-solving skills with the ability to troubleshoot and resolve technical issues.
Excellent communication skills and the ability to collaborate with cross-functional teams.
Preferred Skills:
Experience working with cloud platforms such as AWS, Azure, or GCP.
Knowledge of automation frameworks for testing ETL processes.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
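As referenced above, a minimal extract-transform-load sketch against MySQL; connection strings, tables, and the watermark date are placeholders, and a real pipeline would add logging, retries, and incremental state:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hedged sketch: pull from a MySQL source, cleanse, load to a warehouse table.
source = create_engine("mysql+pymysql://etl_user:secret@source-host/sales")   # placeholder DSN
target = create_engine("mysql+pymysql://etl_user:secret@dwh-host/warehouse")  # placeholder DSN

# Extract: incremental pull keyed on a watermark column.
orders = pd.read_sql(
    text("SELECT id, customer_id, amount, updated_at FROM orders WHERE updated_at >= :since"),
    source,
    params={"since": "2024-01-01"},
)

# Transform: basic cleansing, standing in for the posting's mapping/cleansing step.
orders = orders.dropna(subset=["customer_id"])
orders["amount"] = orders["amount"].round(2)

# Load: append into a staging table in the warehouse.
orders.to_sql("stg_orders", target, if_exists="append", index=False)
```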
Posted 3 weeks ago
12.0 - 22.0 years
25 - 40 Lacs
Bangalore Rural, Bengaluru
Work from Office
Role & responsibilities
Requirements:
Data Modeling (Conceptual, Logical, Physical): minimum 5 years
Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL): minimum 5 years
Cloud Platforms (AWS, Azure, GCP): minimum 3 years
ETL Tools (Informatica, Talend, Apache NiFi): minimum 3 years
Big Data Technologies (Hadoop, Spark, Kafka): minimum 5 years
Data Governance & Compliance (GDPR, HIPAA): minimum 3 years
Master Data Management (MDM): minimum 3 years
Data Warehousing (Snowflake, Redshift, BigQuery): minimum 3 years
API Integration & Data Pipelines: good to have
Performance Tuning & Optimization: minimum 3 years
Business Intelligence (Power BI, Tableau): minimum 3 years
Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/. Experience and deep knowledge about at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.
Key Responsibilities:
1. Data Governance & Management
Establish and maintain a Data Usage Hierarchy to ensure structured data access.
Define data policies, standards, and governance frameworks to ensure consistency and compliance.
Implement Data Quality Management practices to improve accuracy, completeness, and reliability.
Oversee Metadata and Master Data Management (MDM) to enable seamless data integration across platforms.
2. Data Architecture & Migration
Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
Design scalable, high-performance data architectures that support business intelligence and analytics.
Collaborate with IT and engineering teams to ensure efficient data pipeline development.
3. Advanced Analytics & Machine Learning
Identify and define use cases for advanced analytics that align with business objectives.
Design and develop machine learning models to drive data-driven decision-making.
Work with data scientists to operationalize ML models and ensure real-world applicability.
Required Qualifications:
Proven experience as a Data Architect or similar role in data management and analytics.
Strong knowledge of data governance frameworks, data quality management, and metadata management.
Hands-on experience with Microsoft Fabric and data migration from legacy systems.
Expertise in advanced analytics, machine learning models, and AI-driven insights.
Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
Strong communication skills with the ability to translate complex data concepts into business insights.
Preferred candidate profile: Immediate joiner
Posted 3 weeks ago
4.0 - 9.0 years
3 - 8 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Role & responsibilities
Site Reliability Engineer
Requirements: We are seeking a proactive and technically strong Site Reliability Engineer (SRE) to ensure the stability, performance, and scalability of our Data Engineering Platform. You will work on cutting-edge technologies including Cloudera Hadoop, Spark, Airflow, NiFi, and Kubernetes, ensuring high availability and driving automation to support massive-scale data workloads, especially in the telecom domain.
Key Responsibilities
• Ensure platform uptime and application health as per SLOs/KPIs
• Monitor infrastructure and applications using ELK, Prometheus, Zabbix, etc. (a monitoring sketch follows this posting)
• Debug and resolve complex production issues, performing root cause analysis
• Automate routine tasks and implement self-healing systems
• Design and maintain dashboards, alerts, and operational playbooks
• Participate in incident management, problem resolution, and RCA documentation
• Own and update SOPs for repeatable processes
• Collaborate with L3 and Product teams for deeper issue resolution
• Support and guide the L1 operations team
• Conduct periodic system maintenance and performance tuning
• Respond to user data requests and ensure timely resolution
• Address and mitigate security vulnerabilities and compliance issues
Technical Skillset
• Hands-on with Spark, Hive, Cloudera Hadoop, Kafka, Ranger
• Strong Linux fundamentals and scripting (Python, Shell)
• Experience with Apache NiFi, Airflow, Yarn, and Zookeeper
• Proficient in monitoring and observability tools: ELK Stack, Prometheus, Loki
• Working knowledge of Kubernetes, Docker, Jenkins CI/CD pipelines
• Strong SQL skills (Oracle/Exadata preferred)
• Familiarity with DataHub, DataMesh, and security best practices is a plus
• Strong problem-solving and debugging mindset
• Ability to work under pressure in a fast-paced environment
• Excellent communication and collaboration skills
• Ownership, customer orientation, and a bias for action
Preferred candidate profile: Immediate joiner
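As referenced in the monitoring bullet above, a minimal sketch of the kind of check-and-alert automation the role describes, querying Prometheus for down targets; the Prometheus URL is a placeholder and the print stands in for paging or remediation:

```python
import requests

# Hedged sketch: query Prometheus's HTTP API for targets reporting down (up == 0)
# and act on each one. In a real self-healing setup the action would page on-call
# or trigger a restart job rather than print.
PROM = "http://prometheus-host:9090"  # placeholder host

resp = requests.get(f"{PROM}/api/v1/query", params={"query": "up == 0"}, timeout=10)
resp.raise_for_status()
down = resp.json()["data"]["result"]

for target in down:
    instance = target["metric"].get("instance", "unknown")
    print(f"target down: {instance}")  # stand-in for alerting/remediation
```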
Posted 3 weeks ago
2 - 5 years
2 - 5 Lacs
Bengaluru
Work from Office
Databricks Engineer
Full-time | Department: Digital, Data and Cloud
Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.
As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.
About The Role
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementation. You will ideally have experience in building solutions using a variety of open source tools and Microsoft Azure services, and a proven track record in delivering high quality work to tight deadlines.
Your main responsibilities will be:
Designing and implementing highly performant metadata-driven data ingestion & transformation pipelines from multiple sources using Databricks (a minimal sketch follows this posting)
Streaming and batch processes in Databricks
Spark performance tuning/optimisation
Providing technical guidance for complex geospatial problems and Spark dataframes
Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
Data quality system and process design and implementation
Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search)
Evaluating the performance and applicability of multiple tools against customer requirements
Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints
Qualifications
Direct experience of building data pipelines using Azure Data Factory and Databricks
Experience required is 6 to 8 years
Building data integration with Python
Databricks Engineer certification
Microsoft Azure Data Engineer certification
Hands-on experience designing and delivering solutions using the Azure Data Analytics platform
Experience building data warehouse solutions using ETL / ELT tools like Informatica, Talend
Comprehensive understanding of data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching
Nice to have
Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform
Experience working with structured and unstructured data including imaging & geospatial data
Experience with open source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J)
Experience with Azure Event Hub, IoT Hub, Apache Kafka, NiFi for use with streaming data / event-based data
Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
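As referenced above, a minimal sketch of the metadata-driven ingestion pattern: a control list describes each source, and one generic Databricks job loops over it, landing each as a Delta table. Paths, table names, and the partition count are placeholders, and it assumes a Databricks/Delta runtime:

```python
from pyspark.sql import SparkSession

# Hedged sketch: generic, config-driven ingestion into bronze Delta tables.
spark = SparkSession.builder.getOrCreate()

# Control metadata: in practice this would live in a config table or file.
sources = [
    {"name": "customers", "path": "abfss://raw@lake.dfs.core.windows.net/customers", "format": "parquet"},
    {"name": "orders",    "path": "abfss://raw@lake.dfs.core.windows.net/orders",    "format": "json"},
]

for src in sources:
    df = spark.read.format(src["format"]).load(src["path"])
    # Repartitioning before the write is one of the simpler Spark tuning levers
    # the posting alludes to; the right number depends on data volume.
    (df.repartition(8)
       .write.format("delta")
       .mode("overwrite")
       .saveAsTable(f"bronze.{src['name']}"))
```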
Posted 4 weeks ago
5 - 8 years
10 - 14 Lacs
Chennai
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
About The Role
Role Purpose: The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.
Job Description (Experience: 5-8 years):
Good understanding of DWH
GCP (Google Cloud Platform) BigQuery knowledge (a sketch follows this posting)
Knowledge of GCP Storage, GCP Workflows and Functions
Python
CDC extractor tools (e.g., Qlik/NiFi)
BI knowledge (e.g., Power BI or Looker)
Skill upgradation and competency building: Clear Wipro exams and internal certifications from time to time to upgrade skills; attend trainings and seminars to sharpen knowledge in the functional/technical domain; write papers, articles, and case studies and publish them on the intranet.
Deliverables (No. / Performance Parameter / Measure):
1. Contribution to customer projects: Quality, SLA, ETA, no. of tickets resolved, problems solved, # of change requests implemented, zero customer escalations, CSAT
2. Automation: Process optimization, reduction in process steps, reduction in no. of tickets raised
3. Skill upgradation: # of trainings & certifications completed, # of papers and articles written in a quarter
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
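As referenced above, a minimal BigQuery sketch: run a query and write its result to a destination table. The project, dataset, and table ids are placeholders, and credentials are assumed to come from the environment:

```python
from google.cloud import bigquery

# Hedged sketch: aggregate a source table and materialize the result.
client = bigquery.Client(project="my-project")  # hypothetical project id

sql = """
    SELECT customer_id, SUM(amount) AS total
    FROM `my-project.sales.orders`
    GROUP BY customer_id
"""

dest = bigquery.TableReference.from_string("my-project.reporting.customer_totals")
job_config = bigquery.QueryJobConfig(
    destination=dest,
    write_disposition="WRITE_TRUNCATE",  # replace the table on each run
)
query_job = client.query(sql, job_config=job_config)
query_job.result()  # block until the job finishes
print(f"wrote {dest.table_id}")
```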
Posted 1 month ago
5 - 6 years
7 - 8 Lacs
Gurugram
Work from Office
Site Reliability Engineer
Job Description:
Requirements: We are seeking a proactive and technically strong Site Reliability Engineer (SRE) to ensure the stability, performance, and scalability of our Data Engineering Platform. You will work on cutting-edge technologies including Cloudera Hadoop, Spark, Airflow, NiFi, and Kubernetes, ensuring high availability and driving automation to support massive-scale data workloads, especially in the telecom domain.
Key Responsibilities
Ensure platform uptime and application health as per SLOs/KPIs
Monitor infrastructure and applications using ELK, Prometheus, Zabbix, etc.
Debug and resolve complex production issues, performing root cause analysis
Automate routine tasks and implement self-healing systems
Design and maintain dashboards, alerts, and operational playbooks
Participate in incident management, problem resolution, and RCA documentation
Own and update SOPs for repeatable processes
Collaborate with L3 and Product teams for deeper issue resolution
Support and guide L1 operations team
Conduct periodic system maintenance and performance tuning
Respond to user data requests and ensure timely resolution
Address and mitigate security vulnerabilities and compliance issues
Technical Skillset
Hands-on with Spark, Hive, Cloudera Hadoop, Kafka, Ranger
Strong Linux fundamentals and scripting (Python, Shell)
Experience with Apache NiFi, Airflow, Yarn, and Zookeeper
Proficient in monitoring and observability tools: ELK Stack, Prometheus, Loki
Working knowledge of Kubernetes, Docker, Jenkins CI/CD pipelines
Strong SQL skills (Oracle/Exadata preferred)
Familiarity with DataHub, DataMesh, and security best practices is a plus
Strong problem-solving and debugging mindset
Ability to work under pressure in a fast-paced environment
Excellent communication and collaboration skills
Ownership, customer orientation, and a bias for action
Posted 1 month ago
16 - 21 years
40 - 45 Lacs
Gurugram
Work from Office
The Role: Enterprise Architect - Integration
The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals, who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices, cloud-based architectures.
The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets.
What's in it for you: The current objective is to identify individuals with 16+ years of experience who have high expertise, to join their existing team of experts who are spread across the world. This is your opportunity to start at the beginning and get the advantages of rapid early growth. This role is based out of Gurgaon and is an excellent opportunity to be part of a team working with different teams and colleagues across multiple regions globally.
Responsibilities: The role shall be responsible for establishing, maintaining, socialising and realising the target-state integration strategy for the FX & Securities post-trade businesses of Osttra. This shall encompass the post-trade lifecycle of our businesses, including connectivity with clients, the markets ecosystem, and Osttra's post-trade family of networks, platforms and products. The role shall partner with product architects, product managers, delivery heads and teams for refactoring deliveries towards the target state. It shall be responsible for the efficiency, optimisation, oversight and troubleshooting of current-day integration solutions, platforms and deliveries as well, in addition to the target-state focus. The role shall be expected to produce and maintain an integration architecture blueprint. This shall cover the current state and propose a rationalised view of the target state of end-to-end integration flows and patterns. The role shall also provide for and enable the needed technology platforms/tools and engineering methods to realise the strategy. The role shall enable standardisation of protocols/formats (at least within the Osttra world) and tools, and reduce duplication and non-differentiated heavy lift in systems. The role shall enable the documentation of flows and the capture of standard message models. The integration strategy shall also include a transformation strategy, which is vital in a multi-lateral, multi-party, multi-system post-trade world. The role shall partner with other architects and strategies/programmes and enable the demands of UI, application, and data strategies.
What We're Looking For:
Rich domain experience of the financial services industry, preferably with financial markets, pre/post-trade lifecycles and large-scale buy/sell/brokerage organisations
Should have experience of leading integration strategies and delivering integration design and architecture for complex programmes and financial enterprises catering to key variances of latency/throughput
Experience with API Management platforms (like AWS API Gateway, Apigee, Kong, MuleSoft Anypoint) and key management concepts (API lifecycle management, versioning strategies, developer portals, rate limiting, policy enforcement)
Should be adept with integration and transformation methods, technologies and tools
Should have experience of domain modelling for messages/events/streams and APIs
Rich experience of architectural patterns like event-driven architectures, microservices, event streaming, message processing/orchestration, CQRS, event sourcing, etc.
Experience of protocols or integration technologies like FIX, SWIFT, MQ, FTP, API, etc., including knowledge of authentication patterns (OAuth, mTLS, JWT, API keys; a client-credentials sketch follows this posting), authorization mechanisms, data encryption (in transit and at rest), secrets management, and security best practices
Experience of messaging formats and paradigms like XSD, XML, XSLT, JSON, Protobuf, REST, gRPC, GraphQL, etc.
Experience of technologies like Kafka or AWS Kinesis, Spark streams, Kubernetes/EKS, AWS EMR
Experience of languages like Java, Python and message orchestration frameworks like Apache Camel, Apache NiFi, AWS Step Functions, etc.
Experience in designing and implementing traceability/observability strategies for integration systems and familiarity with relevant framework tooling
Experience of engineering methods like CI/CD, build-deploy automation, infrastructure as code, and integration testing methods and tools
Should have an appetite to review and write code for complex problems, and should find interest and energy in design discussions and reviews
Experience and strong understanding of multi-cloud integration patterns
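As referenced above, a minimal sketch of the OAuth client-credentials pattern used behind most API management platforms: fetch a bearer token, then call a managed API with it. The token URL, client id/secret, and endpoint are placeholders:

```python
import requests

# Hedged sketch: OAuth2 client-credentials flow against a gateway-managed API.
TOKEN_URL = "https://auth.example.com/oauth2/token"   # placeholder issuer
API_URL = "https://api.example.com/v1/trades"         # placeholder endpoint

# Step 1: exchange client credentials for an access token.
token_resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "trades.read"},
    auth=("my-client-id", "my-client-secret"),  # placeholder credentials
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Step 2: call the API with the bearer token.
api_resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
api_resp.raise_for_status()
print(api_resp.json())
```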
Posted 1 month ago
4 - 8 years
25 - 30 Lacs
Pune
Hybrid
So, what's the role all about?
As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems, as well as working with cross-functional teams to ensure efficient data processing and integration. You will leverage your knowledge of Apache Spark to create robust ETL processes, optimize data workflows, and manage high volumes of structured and unstructured data.
How will you make an impact?
Design, implement, and maintain data pipelines using Apache Spark for processing large datasets.
Work with data engineering teams to optimize data workflows for performance and scalability.
Integrate data from various sources, ensuring clean, reliable, and high-quality data for analysis.
Develop and maintain data models, databases, and data lakes.
Build and manage scalable ETL solutions to support business intelligence and data science initiatives (an orchestration sketch follows this posting).
Monitor and troubleshoot data processing jobs, ensuring they run efficiently and effectively.
Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
Implement data security best practices to protect sensitive information.
Maintain a high level of data quality and ensure timely delivery of data to end-users.
Continuously evaluate new technologies and frameworks to improve data engineering processes.
Have you got what it takes?
4-7 years of experience as a Data Engineer, with a strong focus on Apache Spark and big data technologies.
Expertise in Spark SQL, DataFrames, and RDDs for data processing and analysis.
Proficient in programming languages such as Python, Scala, or Java for data engineering tasks.
Hands-on experience with cloud platforms like AWS, specifically with data processing and storage services (e.g., S3, BigQuery, Redshift, Databricks).
Experience with ETL frameworks and tools such as Apache Kafka, Airflow, or NiFi.
Strong knowledge of data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
Familiarity with containerization technologies like Docker and Kubernetes.
Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
Solid understanding of distributed computing, data modeling, and data architecture principles.
Strong problem-solving skills and the ability to work with large and complex datasets.
Excellent communication and collaboration skills to work effectively with cross-functional teams.
What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!
Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 7235
Reporting into: Tech Manager
Role Type: Individual Contributor
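As referenced in the ETL bullet above, a minimal Airflow sketch of orchestrating an extract step followed by a Spark transform step. The DAG id, schedule, and task bodies are placeholders, and the `schedule` argument is the Airflow 2.4+ spelling (older versions use `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hedged sketch: two stub tasks standing in for real extract/transform jobs.
def extract():
    print("pull raw files to the landing zone")

def transform():
    print("submit Spark job to build curated tables")

with DAG(
    dag_id="daily_etl_sketch",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # transform runs only after extract succeeds
```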
Posted 1 month ago
5 - 10 years
15 - 30 Lacs
Hyderabad
Work from Office
What is Blend?
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com
What is the Role?
We are seeking a highly skilled Lead Data Engineer to join our data engineering team for an on-premise environment. A large portion of your time will be in the weeds working alongside your team architecting, designing, implementing, and optimizing data solutions. The ideal candidate will have extensive experience in building and optimizing data pipelines, architectures, and data sets, with a strong focus on Python, SQL, Hadoop, HDFS, and Apache NiFi.
What you'll be doing:
Design, develop, and maintain robust, scalable, and high-performance data pipelines and data integration solutions.
Manage and optimize data storage in the Hadoop Distributed File System (HDFS); an HDFS access sketch follows this posting.
Design and implement data workflows using Apache NiFi for data ingestion, transformation, and distribution.
Collaborate with cross-functional teams to understand data requirements and deliver efficient solutions.
Ensure data quality, governance, and security standards are met within the on-premise infrastructure.
Monitor and troubleshoot data pipelines to ensure optimal performance and reliability.
Automate data workflows and processes to enhance system efficiency.
What do we need from you?
Bachelor's degree in Computer Science, Software Engineering, or a related field
6+ years of experience in data engineering or a related field
Strong programming skills in Python and SQL.
Hands-on experience with the Hadoop ecosystem (HDFS, Hive, etc.).
Proficiency in Apache NiFi for data ingestion and flow orchestration.
Experience in data modeling, ETL development, and data warehousing concepts.
Strong problem-solving skills and ability to work independently in a fast-paced environment.
Good understanding of data governance, data security, and best practices in on-premise environments.
What do you get in return?
Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
Idea Tanks: Innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best, whether it's over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
Snack Zone: Stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
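As referenced in the HDFS bullet above, a minimal sketch of programmatic HDFS access through PyArrow; the namenode host/port and paths are placeholders, and the client assumes a local Hadoop installation (libhdfs) is available:

```python
from pyarrow import fs

# Hedged sketch: list a landing directory and read a file from on-premise HDFS.
hdfs = fs.HadoopFileSystem(host="namenode-host", port=8020)  # placeholder namenode

# Inspect the landing zone.
for info in hdfs.get_file_info(fs.FileSelector("/data/landing", recursive=False)):
    print(info.path, info.size)

# Read the first kilobyte of one file.
with hdfs.open_input_stream("/data/landing/orders.csv") as f:  # hypothetical file
    head = f.read(1024)
print(head[:120])
```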
Posted 1 month ago
2 - 4 years
12 - 22 Lacs
Bengaluru
Work from Office
About Lowe's
Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.
Job Summary
The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver code modules, stable application systems, and software solutions, and to maintain and monitor them. This includes developing, configuring, modifying, maintaining, and monitoring integrated business and/or enterprise application solutions within various computing environments. This role facilitates the implementation and maintenance of business and enterprise software solutions to ensure successful deployment of released applications.
Roles & Responsibilities
Core Responsibilities:
1. Design and develop applications; this will include both backend and frontend development.
2. Development and maintenance of microservices, standalone applications, libraries, etc.
3. Will have to work on a cloud platform, which includes development, deployment and monitoring.
4. Will have to work on the database when needed.
5. Should be ready to give all the support that the application needs once it's in production, including being on call during the week or weekend as needed by the project.
6. Debug production issues, come up with multiple solutions, and have the ability to choose the best possible solution.
7. Should be well versed with documentation (including different UML diagrams).
8. Ready to work in a hybrid model (Scrum + Kanban).
9. Should focus on quality and time to market.
10. Should be very proactive and ready to work on any given task.
11. Must multitask and be quick to adopt changes in business requirements.
12. Should be able to provide out-of-the-box solutions for a problem.
13. Should be able to communicate effectively within and outside the team.
14. Should be aligned with the team and be a good team player.
Years of Experience: Minimum 2+ years of experience in Software Development and Maintenance (SDLC)
Education Qualification & Certifications
• Bachelor's/Master's degree in Computer Science, CIS, or related field (or equivalent work experience in a related field).
• Minimum of 2+ years of experience in software development and maintenance.
• Minimum of 2+ years of experience in database technologies.
• Minimum of 2+ years of experience working with defect or incident tracking software.
• Minimum of 2+ years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC).
• Minimum of 2+ years of experience with technical documentation in a software development environment.
• Working experience with application coding/testing/debugging and networking.
• Working experience with database technologies.
Primary Skills (Must Have)
1. Java, reactive programming, Spring Boot, Node.js, microservices.
2. Apache PySpark, Python.
3. Data pipelines using the Apache NiFi framework.
4. Postgres database, SQL, JPA.
5. Cloud-based development and deployments, Kubernetes, Docker, Prometheus.
6. Basic network configuration knowledge, Linux, different application servers.
7. Git, Bitbucket, Splunk, Kibana, JIRA, Confluence.
Secondary Skills (Desired)
1. Working experience with front-end React technologies, GCP, Jenkins, Linux scripting.
2. Any certification is an added advantage.
Posted 1 month ago
7 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: o9 Solutions
Good to have skills: NA
Minimum 7 year(s) of experience is required.
Educational Qualification: BE/BTech/MCA/Bachelor's degree/Master's degree in computer science and related fields of work are preferred.
Key Responsibilities: Play the integration consultant role on o9 implementation projects. Understand the o9 platform's data model (table structures, linkages, optimal designs) for designing various planning use cases. Review and analyze the data provided by the customer along with its technical/functional intent and inter-dependencies. Participate in the technical design and data requirements gathering, making recommendations in case of inaccurate or missing data. Work on designing and creating batch schedules based on frequency and configuration settings for daily/weekly/quarterly/yearly batches. E2E integration implementation from partner system to the o9 platform.
Technical Experience: Minimum 3 to 7 years of experience in SQL/PLSQL, SSIS. Proficiency in databases (SQL Server, MySQL); knowledge of DDL, DML, stored procedures, SSMS, o9 DB Designer, o9 Batch Orchestrator. At least one E2E integration implementation from partner system to o9 will be preferred. Any API-based integration experience will be an added advantage. Good to have experience in Kafka, NiFi, PySpark, Python.
Professional Attributes: Proven ability to work creatively and analytically in a problem-solving environment. Proven ability to build, manage and foster a team-oriented environment. Excellent problem-solving skills with excellent written/oral communication and interpersonal skills. Strong collaborator, team player, and individual contributor.
Additional Info: Open to travel - short/long term.
Posted 1 month ago
3 - 7 years
9 - 13 Lacs
Hyderabad
Work from Office
About The Role
Senior Software Developer
Job Location (Short): Hyderabad, India
Workplace Type: Hybrid
Business Unit: ALI
Req Id: 1492
Responsibilities
Our product team is responsible for designing, developing, testing, deploying, and managing the HxGN Databridge Pro (DBPro) product. DBPro, powered by Apache NiFi, provides a cloud-based, multi-tenant platform to build integrations for HxGN EAM customer engagements, integration to Xalt, and other Hexagon products. Hexagon is seeking a highly motivated software developer to join the team and work on product implementations of new features. This person is responsible for designing, writing, executing, testing, and maintaining the application. They will be working with a team of developers along with architects and a QA analyst to ensure the quality of the product. Our application is a multi-tenant AWS cloud-based application. This is an exciting time for our team, as our application is being used as the company-standard product for moving data between Hexagon applications. You will contribute by partnering with solution architects and implementation teams to ensure clean code is released for production.
A Day in The Life Typically Includes:
Collaborate with manager, business analyst and other developers to clarify and finalize requirements and produce corresponding functional specifications for general applications and infrastructure
Work with other software developers and architects to design and implement applications using Java code and enhancements as needed
Maintain and enhance applications on an ongoing basis per user/customer feedback
Ensure that unit and system tests are automated, per quality assurance requirements
Collaborate as necessary to define and implement regression test suites
Optimize performance and scalability as necessary to meet business goals of the application and environment
Works under limited supervision
May be required to work extended hours to meet project timelines as needed
Education / Qualifications
Bachelor of Science in Computer Science or equivalent work experience
Minimum of 3 years of Java coding experience in a fast-paced environment
Strong object-oriented software systems design and architectural skills
Experience in the following areas:
Experience with JDK 1.8 and up (Java 11 preferred), Spring Boot, Maven, Git, REST API principles, JSON, and mapping frameworks
Experience and understanding in designing and developing software while applying design patterns and object-oriented principles
Experience in unit testing: JUnit, assertion and mocking frameworks
Knowledge of Angular 1.x, JavaScript, HTML, CSS, and jQuery
Experience using Agile development methodologies
Experience with all phases of the software development life cycle
Exposure and working knowledge of the following areas:
Configuration Management tools such as Git and Maven
Docker containers
Works with limited supervision
Flexibility and willingness to pitch in where needed
Ability to deliver results, prioritize activities, and manage time effectively
Communicates in English effectively (both written and verbally)
What Will Put You Ahead / Preferred Qualifications:
Experience in working and testing enterprise web applications in a cloud environment such as AWS
Experience in database technologies and writing optimal queries
Experience working with Angular 1.x, JavaScript, HTML, CSS, and jQuery
Knowledge of Kubernetes
Experience using Terraform for deployments
About Hexagon
Hexagon is a global leader in digital reality solutions, combining sensor, software and autonomous technologies. We are putting data to work to boost efficiency, productivity, quality and safety across industrial, manufacturing, infrastructure, public sector, and mobility applications. Hexagon's Asset Lifecycle Intelligence division helps clients design, construct, and operate more profitable, safe, and sustainable industrial facilities. We empower customers to unlock data, accelerate industrial project modernization and digital maturity, increase productivity, and move the sustainability needle. Our technologies help produce actionable insights that enable better decision-making and intelligence across the asset lifecycle of industrial projects, leading to improvements in safety, quality, efficiency, and productivity, which contribute to Economic and Environmental Sustainability. Hexagon (Nasdaq Stockholm: HEXA B) has approximately 25,000 employees in 50 countries and net sales of approximately 5.4bn EUR. Learn more at hexagon.com and follow us @HexagonAB.
Why work for Hexagon?
At Hexagon, if you can see it, you can do it. Hexagon's Asset Lifecycle Intelligence division puts their trust in you so that you can bring your ideas to life. We have emerged as one of the most engaged and enabled workplaces*. We are committed to creating an environment that is truly supportive by providing the resources you need to fully support your ambitions, no matter who you are or where you are in the world.
* In the recently concluded workplace effectiveness survey by Korn Ferry, a global HR advisory firm, Hexagon, Asset Lifecycle Intelligence division has emerged as one of the most Engaged and Enabled workplaces, when compared to similar organizations that Korn Ferry partners with.
Everyone is welcome
At Hexagon, we believe that diverse and inclusive teams are critical to the success of our people and our business. Everyone is welcome; as an inclusive workplace, we do not discriminate. In fact, we embrace differences and are fully committed to creating equal opportunities, an inclusive environment, and fairness for all. Respect is the cornerstone of how we operate, so speak up and be yourself. You are valued here.
Posted 1 month ago
3 - 7 years
9 - 13 Lacs
Hyderabad
Work from Office
About The Role
Senior Software Developer
Job Location (Short): Hyderabad, India
Workplace Type: Hybrid
Business Unit: ALI
Req Id: 1494
Responsibilities
Our product team is responsible for designing, developing, testing, deploying, and managing the HxGN Databridge Pro (DBPro) product. DBPro, powered by Apache NiFi, provides a cloud-based, multi-tenant platform to build integrations for HxGN EAM customer engagements, integration to Xalt, and other Hexagon products. Hexagon is seeking a highly motivated software developer to join the team and work on product implementations of new features. This person is responsible for designing, writing, executing, testing, and maintaining the application. They will be working with a team of developers along with architects and a QA analyst to ensure the quality of the product. Our application is a multi-tenant AWS cloud-based application. This is an exciting time for our team, as our application is being used as the company-standard product for moving data between Hexagon applications. You will contribute by partnering with solution architects and implementation teams to ensure clean code is released for production.
A Day in The Life Typically Includes:
Collaborate with manager, business analyst and other developers to clarify and finalize requirements and produce corresponding functional specifications for general applications and infrastructure
Work with other software developers and architects to design and implement applications using JavaScript code and enhancements as needed
Maintain and enhance applications on an ongoing basis per user/customer feedback
Ensure that unit and system tests are automated, per quality assurance requirements
Collaborate as necessary to define and implement regression test suites
Optimize performance and scalability as necessary to meet business goals of the application and environment
Works under limited supervision
May be required to work extended hours to meet project timelines as needed
Education / Qualifications
Bachelor of Science in Computer Science or equivalent work experience
Minimum of 3 years of HTML, CSS, JavaScript, TypeScript, Angular, AngularJS and Bootstrap coding experience
Experience with AJAX, Maven, Git, REST API, JSON
Exposure and working knowledge of the following areas:
Configuration Management tools such as Git, Maven, and Docker containers
Good to have:
Experience working with Java, Spring Boot
Knowledge of Kubernetes
Experience using Terraform for deployments
About Hexagon
Hexagon is a global leader in digital reality solutions, combining sensor, software and autonomous technologies. We are putting data to work to boost efficiency, productivity, quality and safety across industrial, manufacturing, infrastructure, public sector, and mobility applications. Hexagon's Asset Lifecycle Intelligence division helps clients design, construct, and operate more profitable, safe, and sustainable industrial facilities. We empower customers to unlock data, accelerate industrial project modernization and digital maturity, increase productivity, and move the sustainability needle. Our technologies help produce actionable insights that enable better decision-making and intelligence across the asset lifecycle of industrial projects, leading to improvements in safety, quality, efficiency, and productivity, which contribute to Economic and Environmental Sustainability. Hexagon (Nasdaq Stockholm: HEXA B) has approximately 25,000 employees in 50 countries and net sales of approximately 5.4bn EUR. Learn more at hexagon.com and follow us @HexagonAB.
Why work for Hexagon?
At Hexagon, if you can see it, you can do it. Hexagon's Asset Lifecycle Intelligence division puts their trust in you so that you can bring your ideas to life. We have emerged as one of the most engaged and enabled workplaces*. We are committed to creating an environment that is truly supportive by providing the resources you need to fully support your ambitions, no matter who you are or where you are in the world.
* In the recently concluded workplace effectiveness survey by Korn Ferry, a global HR advisory firm, Hexagon, Asset Lifecycle Intelligence division has emerged as one of the most Engaged and Enabled workplaces, when compared to similar organizations that Korn Ferry partners with.
Everyone is welcome
At Hexagon, we believe that diverse and inclusive teams are critical to the success of our people and our business. Everyone is welcome; as an inclusive workplace, we do not discriminate. In fact, we embrace differences and are fully committed to creating equal opportunities, an inclusive environment, and fairness for all. Respect is the cornerstone of how we operate, so speak up and be yourself. You are valued here.
Posted 1 month ago
11 - 13 years
20 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experience: 11+ to 14 yrs
Work location: Pune/Bangalore/Chennai
Work Mode: Hybrid
Notice Period: Immediate - 45 days
Core Skills:
1. Strong proficiency in Python or another programming language commonly used in data engineering, such as Java.
2. Excellent SQL skills with the ability to work with data across different SQL databases including Postgres, SQL Server, etc.; NoSQL databases are a plus.
3. Knowledge of Elasticsearch/Logstash implementation skills is very desirable (a small indexing sketch follows this posting).
4. Experience in building and optimizing ETL/ELT processes, data pipelines, and workflows, using tools like Apache NiFi or Apache Kafka.
5. Familiarity with data modeling, data warehousing concepts, and data governance best practices.
6. Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data quality and performance issues.
7. Experience with version control systems, such as Git, and knowledge of CI/CD practices in a data engineering context.
8. Knowledge of data security and privacy principles, as well as experience implementing data access controls and managing sensitive data.
9. Understanding of distributed computing principles and experience with distributed data processing frameworks like Apache Spark or Hadoop.
10. Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
11. Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams and translate business requirements into technical solutions.
12. Mentoring more junior resources to keep our projects on track.
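As referenced in item 3 above, a minimal Elasticsearch sketch: index a log document and run a match query. The host, index name, and document are placeholders, and the calls follow the 8.x Python client (production code would batch with the bulk helper):

```python
from elasticsearch import Elasticsearch

# Hedged sketch: single-document indexing and a simple search.
es = Elasticsearch("http://es-host:9200")  # placeholder cluster URL

doc = {"service": "ingest", "level": "ERROR", "message": "pipeline stalled"}
es.index(index="app-logs", document=doc)  # hypothetical index

resp = es.search(index="app-logs", query={"match": {"level": "ERROR"}})
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["message"])
```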
Posted 1 month ago
5 - 10 years
15 - 30 Lacs
Chennai, Thiruvananthapuram
Hybrid
Develop and maintain Java-based applications and microservices. Design and implement data flows using Apache NiFi for real-time and batch data processing. Create documentation for design, development, and support processes.
Posted 1 month ago
3 - 4 years
5 - 7 Lacs
Bengaluru
Work from Office
About the Role:

The Team: The Customer Experience (CX) organization is a world-class, customer-oriented team leading performance growth and optimization from the heart of the cross-functional organization. The Client Strategy & Analytics team is part of S&P Global Market Intelligence's CX organization. The Analytics team within MI supporting the CCO organization is dedicated to driving impactful insights and analytics solutions that support our top customers' revenue growth and operational efficiency. We value innovation, collaboration, and a data-driven approach to decision-making. Our team is committed to fostering an inclusive environment where diverse perspectives are embraced and continuous learning is encouraged. We are looking for a senior analyst with 3 to 4 years of data engineering experience to help design, develop, and maintain robust data pipelines and systems for processing large-scale data.

Responsibilities: The role requires experience in data modeling and data schemas, and a good understanding of query optimization, query profiling, and query performance techniques (a short profiling sketch follows below). We are looking for an exceptional analytics professional to develop analytics solutions and to maintain and enrich existing tools. Collaborate with cross-functional teams to understand the business and data quality, document gaps, and build solutions to capture the next customer opportunities or proactively minimize risks. Present findings and recommendations to senior management. Additionally, the professional will be involved in quick-turnaround ad-hoc analytics projects. You will interact with global stakeholders to understand their needs and ways to improve, always bringing a listening attitude to business problems. You are encouraged to drive next-generation analytics on unstructured data. Mentoring junior analysts and close collaboration with peers are expected.

What We're Looking For:
- Proficient in programming languages such as Python or R.
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL).
- Basic knowledge of cloud platforms (AWS, Azure, or GCP) and associated data storage services (e.g., S3, BigQuery, Redshift).
- Familiarity with ETL/ELT tools and data orchestration platforms (e.g., Airflow, Apache NiFi).
- Proven record of building interactive dashboards in Power BI; exposure to advanced usage of the DAX language.
- Positive, proactive attitude and ability to work well in teams.
- Exceptional skills in listening to leaders and cross-functional heads, and in articulating ideas and complex information clearly and concisely.
- Good time management and communication skills.

Basic Qualifications:
- B.E., B.Tech, B.Sc. (Statistics or Mathematics), B.Com, or B.A. Economics
- Excellent skills in MS Excel and Power BI
- Advanced skills in programming languages
- Basic skills in MS Word and MS PowerPoint

Preferred Qualifications:
- 3 to 4 years of work experience with analytics solutions
- Exposure to building analytics solutions from scratch and maintaining them with minimal supervision.
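The query-profiling skill this posting asks for can be illustrated with PostgreSQL's EXPLAIN ANALYZE, which executes a query and reports the actual plan with per-node timings; the table, columns, and DSN below are hypothetical placeholders.

import psycopg2

conn = psycopg2.connect("dbname=cx_analytics user=analyst")  # hypothetical DSN

query = """
    SELECT customer_id, count(*) AS events
    FROM usage_events
    WHERE event_date >= current_date - interval '30 days'
    GROUP BY customer_id
    ORDER BY events DESC
    LIMIT 100
"""

with conn.cursor() as cur:
    # EXPLAIN (ANALYZE, BUFFERS) runs the query and returns the plan as rows of text.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + query)
    for (line,) in cur.fetchall():
        print(line)  # a sequential scan here often signals a missing index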
Posted 2 months ago
5 - 9 years
16 - 31 Lacs
Hyderabad
Hybrid
Role & responsibilities

Understand the Business Problem and the Relevant Data
• Maintain an intimate understanding of company and department strategy
• Translate analysis requirements into data requirements
• Identify and understand the data sources that are relevant to the business problem
• Develop conceptual models that capture the relationships within the data
• Define the data-quality objectives for the solution
• Be a subject matter expert in data sources and reporting options

Architect Data Management Systems
• Leverage understanding of the business problem and the nature of the data to select the appropriate data management system (Big Data, OLTP, OLAP, etc.)
• Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements
• Plan methods for archiving/deletion of information

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data
• Identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static)
• Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
• Develop processes to efficiently load the transformed data into the data management system

Prepare Data to Meet Analysis Requirements
• Work with the data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.); a minimal cleaning sketch follows below
• Develop and code data extracts
• Follow best practices to ensure data quality and data integrity
• Ensure that the data is fit for use in data science applications

Preferred candidate profile
• 5+ years developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
• Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.)
• Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake
• Experienced in cloud-based solutions using AWS/Azure/GCP
• Significant experience with big data processing and/or developing applications and data sources via Spark, etc.
• Understanding of how distributed systems work
• Familiarity with software architecture (data structures, data schemas, etc.)
• Strong working knowledge of databases (Oracle, MSSQL, etc.), including SQL and NoSQL
• Strong mathematics background, and analytical, problem-solving, and organizational skills
• Strong communication skills (written, verbal, and presentation)
• Experience working in a global, cross-functional environment
• Minimum of 2 years' experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.)
• Software development
• Ability to travel as needed
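A minimal PySpark sketch of the data-preparation bullet above (filling missing values and trimming outliers), assuming hypothetical Parquet paths and column names; percentile bounds are computed with approxQuantile and used to filter extreme rows.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("prepare-for-analysis").getOrCreate()

# Hypothetical raw dataset.
df = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Fill missing values with per-column defaults.
df = df.na.fill({"quantity": 0, "region": "unknown"})

# Trim "amount" outliers to the 1st-99th percentile range.
low, high = df.approxQuantile("amount", [0.01, 0.99], 0.001)
cleaned = df.filter((F.col("amount") >= low) & (F.col("amount") <= high))

cleaned.write.mode("overwrite").parquet("s3://example-bucket/clean/transactions/")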
Posted 2 months ago
10 - 14 years
35 - 40 Lacs
Hyderabad
Work from Office
Responsibilities

1. Integration Strategy & Architecture
- Define the enterprise integration strategy, aligning with business goals and IT roadmaps.
- Design scalable, resilient, and secure integration architectures using industry best practices.
- Develop API-first and event-driven integration strategies.
- Establish governance frameworks, integration patterns, and best practices.

2. Technology Selection & Implementation
- Evaluate and recommend the right integration technologies, such as:
  - Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus
  - Event Streaming & Messaging: Apache Kafka, RabbitMQ, IBM MQ
  - API Management: Apigee, Kong, AWS API Gateway, MuleSoft
  - ETL & Data Integration: Informatica, Talend, Apache NiFi
  - iPaaS (Cloud Integration): Dell Boomi, Azure Logic Apps, Workato
- Lead the implementation and configuration of these platforms.

3. API & Microservices Architecture
- Design and oversee API-led integration strategies.
- Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations.
- Define API security standards (OAuth, JWT, OpenID Connect, API Gateway); a token-validation sketch follows below.
- Establish API versioning, governance, and lifecycle management.

4. Enterprise Messaging & Event-Driven Architecture (EDA)
- Design real-time, event-driven architectures using:
  - Apache Kafka for streaming and pub/sub messaging
  - RabbitMQ, IBM MQ, TIBCO EMS for message queuing
  - Event-driven microservices using Kafka Streams, Flink, or Spark Streaming
- Ensure event sourcing, CQRS, and eventual consistency in distributed systems.

5. Cloud & Hybrid Integration
- Develop hybrid integration strategies across on-premises, cloud, and SaaS applications.
- Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, Google Cloud Pub/Sub.
- Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, Workday.

6. Security & Compliance
- Ensure secure integration practices, including encryption, authentication, and authorization.
- Implement zero-trust security models for APIs and data flows.
- Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).

7. Governance, Monitoring & Optimization
- Establish enterprise integration governance frameworks.
- Use observability tools for real-time monitoring (Datadog, Splunk, New Relic).
- Optimize integration performance and troubleshoot bottlenecks.

8. Leadership & Collaboration
- Collaborate with business and IT stakeholders to understand integration requirements.
- Work with DevOps and cloud teams to ensure CI/CD pipelines for integration.
- Provide technical guidance to developers, architects, and integration engineers.

Qualifications

Technical Skills
- 10+ years of experience
- Expertise in integration platforms: Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi
- Strong understanding of API management and microservices
- Experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hubs)
- Knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue)
- Experience in cloud and hybrid integration (AWS, Azure, GCP, OCI)
- Hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust)

Soft Skills
- Strategic thinking and architecture design
- Problem-solving and troubleshooting
- Collaboration and stakeholder management
- Agility in digital transformation and cloud migration
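As one concrete illustration of the API security standards listed under item 3, here is a hedged token-validation sketch using the PyJWT library; the signing key, issuer, and audience are illustrative placeholders rather than values from any real gateway.

import jwt  # PyJWT

SECRET = "replace-with-a-real-signing-key"   # placeholder key
EXPECTED_ISSUER = "https://idp.example.com"  # hypothetical identity provider

def authorize(token: str) -> dict:
    # Validate signature, expiry, issuer, and audience in one call;
    # jwt.decode raises an exception if any check fails.
    claims = jwt.decode(
        token,
        SECRET,
        algorithms=["HS256"],   # pin the algorithm; never trust the token header
        issuer=EXPECTED_ISSUER,
        audience="orders-api",  # hypothetical audience claim
    )
    return claims  # downstream code can authorize on scopes/roles in the claims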
Posted 2 months ago
4 - 7 years
7 - 11 Lacs
Maharashtra
Work from Office
Sound Telecom Assurance domain experience for Mobile and Fixed technology. Must have hands-on experience with NAC or similar Assurance COTS products, such as Nokia NAC Assurance or ServiceNow. 4+ years of experience with Java, Python, Apache NiFi, Grafana, cloud platforms, microservices, Neo4j database scripts, REST APIs, and SPOG dashboards. Strong experience with assurance processes such as service impact analysis, closed-loop assurance, automation, incident creation, incident process, anomaly detection, and the complete end-to-end assurance process.
Posted 2 months ago
5 - 9 years
18 - 20 Lacs
Navi Mumbai
Work from Office
Position Overview: We are looking for a skilled and visionary Data Engineering Lead to join our growing team. In this role, you will be responsible for leading a team of data engineers in designing, developing, and maintaining robust data pipelines and infrastructure. You will work closely with cross-functional teams to support data-driven decision-making and ensure the availability, quality, and integrity of our data assets.

Role & responsibilities:
- Build, develop, and maintain efficient and high-performance data pipelines across both cloud and on-premises environments.
- Ensure the accuracy, adequacy, and legitimacy of data.
- Prepare ETL pipelines to extract data from various sources and store it in a centralized location.
- Analyse, interpret, and present results through effective visualization and reports. Identify critical metrics.
- Implement and instill best practices for effective data management.
- Monitor the use of data systems and ensure the correctness, completeness, and availability of data services.
- Optimize data infrastructure and processes for cost efficiency on AWS cloud and on-premises environments.
- Utilize Apache Airflow, NiFi, or equivalent tools to build and manage data workflows and integrations (a minimal Airflow sketch follows below).
- Implement best practices for data governance, security, and compliance.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Stay current with industry trends and emerging technologies in data engineering and cloud computing.
- Lead and mentor a team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement.
- Define team objectives and performance metrics, conducting regular performance evaluations and providing constructive feedback.
- Facilitate knowledge sharing and professional development within the team.

Preferred candidate profile:
- 6-9 years of proven experience as a Data Engineering Lead or in a similar role.
- Extensive hands-on experience with ETL and ELT processes.
- Strong expertise in data integrity and quality assurance.
- Proficiency in optimizing AWS cloud services and on-premises infrastructure for cost and performance.
- Hands-on experience with Apache Airflow and NiFi.
- Strong programming skills in languages such as Python, Java, or Scala.
- Experience with SQL and NoSQL databases.
- Experience in building and maintaining a single source of truth.
- Familiarity with data warehousing solutions like Amazon Redshift, Snowflake, or BigQuery.
- Strong problem-solving skills and the ability to work under pressure.
- Hands-on experience with data visualization tools such as Tableau, Power BI, or Looker.
- Experience in financial services is a must.

Skills Required:
- Team leading and team management.
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.
- Ability to work collaboratively in a fast-paced environment and manage multiple priorities.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
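A minimal Apache Airflow sketch of the workflow orchestration this posting describes, assuming Airflow 2.4+ (for the schedule argument); the DAG name and task bodies are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # e.g., pull data from source systems

def load():
    ...  # e.g., write to Redshift, Snowflake, or BigQuery

with DAG(
    dag_id="daily_finance_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds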
Posted 2 months ago