10.0 - 15.0 years
9 - 13 Lacs
Coimbatore
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Microsoft SQL Server, Microsoft SQL Server Administration, Microsoft SQL Server Reporting Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems, applying knowledge of technologies, methodologies, and tools to support clients or projects, either end-to-end or for a specific stage of the product lifecycle.
Roles & Responsibilities:
- At least 10+ years of relevant experience in Data Analytics (preferably the Microsoft BI stack)
- Leading a multidisciplinary team of at least 10 professionals, including developers, business analysts, and technical specialists
- Overseeing and managing the full lifecycle of projects and software development initiatives
- Ensuring alignment with organizational policies, regulatory standards, and security best practices
- Coordinating cross-functional collaboration with other IT and business units
- Driving the continuous improvement of tools, processes, and governance
- Developing project roadmaps, assigning tasks, and managing timelines
- Managing stakeholder communication and ensuring delivery of solutions as per business need
- Identifying risks and implementing mitigation strategies to ensure data protection and system integrity
Knowledge and Skills
Qualifications/Education Required: Degree holder in Computer Science or related discipline
Experience Required: This position requires 12+ years of relevant experience in the IT industry, with at least 3 years in a Data Analytics solution architect or technical leader role. Proven record of project and people management, supporting the development/support of enterprise-wide solutions within major organizations, ideally in a banking environment. Project delivery experience and team management.
Competencies Required:
- Software Development Knowledge: Firm understanding of the SDLC; experience working in Waterfall and Agile settings; ability to work with developers and contribute to technical discussions.
- Project Management Skills: Proven ability to lead and manage large-scale projects, including but not limited to vulnerability remediation, obsolescence management, and new software development.
- Team Leadership and Development: Experience in leading, coaching, and developing a diverse team of technical and non-technical professionals.
- Stakeholder Management: Ability to engage with senior management, business units, and external vendors to align IAM solutions with business objectives.
- Risk Management: Ability to identify, assess, and mitigate risks related to identity and access management, ensuring compliance with security standards.
- Communication Skills: Strong verbal and written communication abilities to effectively convey technical concepts to both technical and non-technical stakeholders.
- Strategic Thinking: Ability to align team initiatives with long-term business goals, anticipating future needs.
- Strong analytical and problem-solving skills.
Skills & Knowledge Requirements:
- Hands-on technical expert in the development of data analytics solutions, able to propose solutions to the business with a consulting approach
- Excellent experience with Power BI, the MSBI stack (SSIS, SSRS, SSAS and SQL Server), and DWH solutions and concepts
- Aware of modern data analytics developments and solutioning
- Experience in architecture, design, creation and delivery of solutions
- Understanding of RESTful APIs and microservices architecture
- Understanding of large-scale data processing and manipulation
- Knowledge of DevOps practices and CI/CD pipelines
- Proficiency with build automation and version control tools, e.g. GitLab, Azure DevOps
- Knowledge of unit testing frameworks and static/dynamic code analysis tools to ensure code quality and security
Additional Information:
- The candidate should have a minimum of 12 years of experience in the Denodo Data Virtualization Platform
- This position is based at our Mumbai office
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 5 days ago
6.0 - 11.0 years
15 - 27 Lacs
Kochi, Pune, Bengaluru
Work from Office
Role & responsibilities: Denodo Data Engineer
Preferred candidate profile:
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
The primary focus of this role will be to perform development work within the Azure Data Lake environment and other related ETL technologies. You will be responsible for ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. In addition, this role will involve L3 responsibilities for ETL processes.
Your responsibilities will include delivering key Azure Data Lake projects within the specified time and budget. You will contribute to solution design and build to ensure scalability, performance, and reuse of data and other components. Strong problem-solving abilities are required, focusing on managing business outcomes through collaboration with various internal and external stakeholders. You should be enthusiastic, willing to learn, and continuously develop skills and techniques, embracing change and seeking continuous improvement. Effective communication, both written and verbal, with good presentational skills in English is necessary. Being customer-focused and a team player is also important.
Qualifications:
- Bachelor's degree in Computer Science, MIS, Business Management, or related field
- Minimum 5 years of experience in Information Technology
- Minimum 4 years of experience in Azure Data Lake
Technical Skills:
- Proven experience in development activities in Data, BI, or Analytics projects
- Experience in solutions delivery with knowledge of the system development lifecycle, integration, and sustainability
- Strong knowledge of PySpark and SQL (see the sketch below)
- Good understanding of Azure Data Factory or Databricks
- Desirable: knowledge of Presto/Denodo
- Desirable: knowledge of FMCG business processes
Non-Technical Skills:
- Excellent remote collaboration skills
- Experience working in a matrix organization with diverse priorities
- Exceptional written and verbal communication, collaboration, and listening skills
- Ability to work with agile delivery methodologies
- Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
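As a hedged illustration of the PySpark-on-ADLS development work this posting describes, the sketch below reads raw data from an Azure Data Lake Storage Gen2 container and writes a cleaned, partitioned layer back out. The storage account, container, paths, and column names are all hypothetical; it assumes a Databricks-style environment where ADLS access is already configured.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch, assuming ADLS Gen2 access is already configured;
# the storage account, containers, and columns are hypothetical.
spark = SparkSession.builder.appName("adls-etl-sketch").getOrCreate()

raw_path = "abfss://raw@examplestorageacct.dfs.core.windows.net/sales/2024/"
curated_path = "abfss://curated@examplestorageacct.dfs.core.windows.net/sales/"

# Ingest raw CSV files, standardise types, and drop obviously bad rows.
orders = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropna(subset=["order_id", "order_ts"])
    .dropDuplicates(["order_id"])
)

# Write the curated layer partitioned by date for downstream consumers.
(orders.withColumn("order_date", F.to_date("order_ts"))
       .write.mode("overwrite")
       .partitionBy("order_date")
       .parquet(curated_path))
```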
Posted 1 week ago
5.0 - 10.0 years
22 - 27 Lacs
Kochi
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation. Contribute to reusable components / assets / accelerators to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python etc.) with Cloudera or Hortonworks
Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary etc.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a part of BSH Home Appliances Group, you will be responsible for the further development and management of the "Master Data Management (MDM) Data Quality" area. Your role will involve collecting, assessing, and prioritizing requirements in close collaboration with business units and IT teams. You will lead the implementation of data quality initiatives to ensure high data quality across the organization. Additionally, you will be responsible for reporting, analyzing, and visualizing data quality metrics using tools such as Power BI, as well as handling data integration and creating dashboards utilizing Microsoft Power BI, backend development, DENODO, SAP R/3, and S/4HANA.
To excel in this role, you should possess excellent stakeholder management and moderation skills. You are expected to have a structured, solution-oriented, and independent working style. Experience working in an Agile environment is crucial, along with the ability to adapt to changing priorities and collaborate effectively with cross-functional teams. The ideal candidate will have 6 or more years of experience in designing, developing, and maintaining interactive Power BI dashboards, with at least 2 years of experience as a Product Owner.
At BSH Home Appliances Group, we offer competitive benefits including GTLI and GMC. If you are ready to take on this exciting opportunity and grow your career in a dynamic environment, we invite you to visit bsh-group.com/career and join our team.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an integral part of American Airlines Tech Hub in Hyderabad, India, you will have the opportunity to contribute to the innovative and tech-driven environment that shapes the future of travel. Your role will involve collaborating with source data application teams and product owners to develop and support analytics solutions that provide valuable insights for informed decision-making. By leveraging Azure products and services such as Azure Data Lake Storage, Azure Data Factory, and Azure Databricks, you will be responsible for implementing data migration and engineering solutions to enhance the airline's digital capabilities.
Your responsibilities will encompass various aspects of the development lifecycle, including design, cloud engineering, data modeling, testing, performance tuning, and deployment. Working within a DevOps team, you will have the chance to take ownership of your product and contribute to the development of batch and streaming data pipelines using cloud technologies. Adherence to coding standards, best practices, and security guidelines will be crucial as you collaborate with a multidisciplinary team to deliver technical solutions effectively.
To excel in this role, you should have a Bachelor's degree in a relevant technical discipline or equivalent experience, along with a minimum of 1 year of software solution development experience using agile methodologies. Proficiency in SQL for data analytics and prior experience with cloud development, particularly in Microsoft Azure, will be advantageous. Preferred qualifications include additional years of software development and data analytics experience, as well as familiarity with tools such as Azure Event Hub, Azure Power BI, and Teradata Vantage.
Your success in this position will be further enhanced by expertise in the Azure technology stack, practical knowledge of Azure cloud services, and relevant certifications such as the Azure Development Track and Spark Certification. A combination of development, administration, and support experience in various tools and platforms, including scripting languages, data platforms, and BI analytics tools, will be beneficial for your role in driving data management and governance initiatives within the organization.
Effective communication skills, both verbal and written, will be essential for engaging with stakeholders across different levels of the organization. Additionally, your physical abilities should enable you to perform the essential functions of the role safely and successfully, with or without reasonable accommodations as required by law.
At American Airlines, diversity and inclusion are integral to our workforce, fostering an inclusive environment where employees can thrive and contribute to the airline's success. Join us at American Airlines and embark on a journey where your technical expertise and innovative spirit will play a pivotal role in shaping the future of travel. Feel free to be yourself as you contribute to the seamless operation of the world's largest airline, caring for people on life's journey.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an Engineer, IT Data at American Airlines, you will be part of a diverse, high-performing team dedicated to technical excellence. Your primary focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you will work in encompasses managing and leveraging data as a strategic asset, including data management, storage, integration, and governance. This domain also involves Machine Learning, AI, Data Science, and Business Intelligence.
In this role, you will collaborate closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights to make better decisions. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, and Azure Databricks, among others, as well as traditional data warehouse tools. Your tasks will span multiple aspects of the development lifecycle, including design, cloud engineering (infrastructure, network, security, and administration), ingestion, preparation, data modeling, testing, CI/CD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support. Furthermore, you will provide technical leadership within a team environment and work independently. As part of a DevOps team, you will completely own and support the product, implementing batch and streaming data pipelines using cloud technologies. Your responsibilities will also include leading the development of coding standards, best practices, and privacy and security guidelines, as well as mentoring others on technical and domain skills to create multi-functional teams.
For success in this role, you will need a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training. You should have at least 3 years of software solution development experience using agile and DevOps, operating in a product model, as well as 3 years of data analytics experience using SQL. Additionally, a minimum of 3 years of cloud development and data lake experience, preferably in Microsoft Azure, is required.
Preferred qualifications include 5+ years of software solution development experience using agile, DevOps, and a product model, and 5+ years of data analytics experience using SQL. Experience in full-stack development, preferably in Azure, and familiarity with Teradata Vantage development and administration are also preferred. Airline industry experience is a plus.
In terms of skills, licenses, and certifications, you should have expertise with the Azure technology stack for data management, data ingestion, capture, processing, curation, and creating consumption layers. An Azure Development Track Certification and Spark Certification are preferred. Proficiency in several tools/platforms such as Python, Spark, Unix, SQL, Teradata, Cassandra, MongoDB, Oracle, SQL Server, ADLS, and Snowflake is required. Additionally, experience with Azure cloud technologies, CI/CD tools, the BI analytics tool stack, and data governance and privacy tools is beneficial for this role.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an Engineer, IT Data at American Airlines, you will be part of a diverse and high-performing team dedicated to technical excellence. Your main focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you will be working in refers to the area within Information Technology that focuses on managing and leveraging data as a strategic asset. This includes data management, storage, integration, and governance, leaning into Machine Learning, AI, Data Science, and Business Intelligence.
In this role, you will work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights to make better decisions. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc., as well as traditional data warehouse tools. Your responsibilities will involve multiple aspects of the development lifecycle including design, cloud engineering, ingestion, preparation, data modeling, testing, CI/CD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support. You will provide technical leadership, collaborate within a team environment, and work independently. Additionally, you will be part of a DevOps team that completely owns and supports the product, implementing batch and streaming data pipelines using cloud technologies.
As an essential member of the team, you will lead the development of coding standards, best practices, and privacy and security guidelines. You will also mentor others on technical and domain skills to create multi-functional teams. Your success in this role will require a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training.
To excel in this position, you should have at least 3 years of software solution development experience using agile and DevOps, operating in a product model, as well as 3+ years of data analytics experience using SQL, and cloud development and data lake experience, preferably with Microsoft Azure. Preferred qualifications include 5+ years of software solution development experience, 5+ years of data analytics experience using SQL, 3+ years of full-stack development experience, and familiarity with Azure technologies.
Additionally, skills, licenses, and certifications required for success in this role include expertise with the Azure technology stack, practical direction within Azure native cloud services, an Azure Development Track Certification, a Spark Certification, and a combination of development, administration and support experience with various tools/platforms such as scripting (Python, Spark, Unix, SQL), data platforms (Teradata, Cassandra, MongoDB, Oracle, SQL Server, ADLS, Snowflake), Azure cloud technologies, CI/CD tools (GitHub, Jenkins, Azure DevOps, Terraform), the BI analytics tool stack (Cognos, Tableau, Power BI, Alteryx, Denodo, Grafana), and data governance and privacy tools (Alation, Monte Carlo, Informatica, BigID).
Join us at American Airlines, where you can explore a world of possibilities, travel the world, grow your expertise, and become the best version of yourself while contributing to the transformation of technology delivery for our customers and team members worldwide.
Posted 2 weeks ago
10.0 - 15.0 years
30 - 40 Lacs
Bengaluru
Hybrid
We are looking for a Cloud Data Engineer with strong hands-on experience in data pipelines, cloud-native services (AWS), and modern data platforms like Snowflake or Databricks. Alternatively, we're open to Data Visualization Analysts with strong BI experience and exposure to data engineering or pipelines. You will collaborate with technology and business leads to build scalable data solutions, including data lakes, data marts, and virtualization layers using tools like Starburst. This is an exciting opportunity to work with modern cloud tech in a dynamic, enterprise-scale financial services environment.
Key Responsibilities:
- Design and develop data pipelines for structured/unstructured data in AWS (see the sketch below).
- Build semantic layers and virtualization layers using Starburst or similar tools.
- Create intuitive dashboards and reports using Power BI/Tableau.
- Collaborate on ETL designs and support testing (SIT/UAT).
- Optimize Spark jobs and ETL performance.
- Implement data quality checks and validation frameworks.
- Translate business requirements into scalable technical solutions.
- Participate in design reviews and documentation.
Skills & Qualifications:
Must-Have:
- 10+ years in Data Engineering or related roles.
- Hands-on with AWS Glue, Redshift, Athena, EMR, Lambda, S3, Kinesis.
- Proficient in HiveQL, Spark, Python, Scala.
- Experience with modern data platforms (Snowflake/Databricks).
- 3+ years in ETL tools (Informatica, SSIS) and recent experience in cloud-based ETL.
- Strong understanding of Data Warehousing, Data Lakes, and Data Mesh.
Preferred:
- Exposure to data virtualization tools like Starburst or Denodo.
- Experience in the financial services or banking domain.
- AWS Certification (Data specialty) is a plus.
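For a flavor of the AWS-side pipeline work listed above, here is a minimal Glue job sketch. The catalog database, table, and S3 bucket names are hypothetical, and the awsglue modules are those provided inside the AWS Glue runtime rather than a pip-installable package.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Minimal AWS Glue job sketch; database/table/bucket names are hypothetical.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a raw table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="transactions"
)

# Convert to a Spark DataFrame for standard transformations.
df = raw.toDF().dropDuplicates(["txn_id"]).filter("amount > 0")

# Write curated Parquet back to S3 for Athena/Redshift Spectrum consumers.
df.write.mode("overwrite").parquet("s3://example-curated-bucket/transactions/")

job.commit()
```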
Posted 2 weeks ago
10.0 - 18.0 years
25 - 32 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
At Techwave, we are always striving to foster a culture of growth and inclusivity. We ensure whoever is associated with the brand is being challenged at every step and is provided with all the necessary opportunities to excel in life. People are at the core of everything we do. Join us! https://techwave.net/join-us/
Who are we? Techwave is a leading global IT and engineering services and solutions company revolutionizing digital transformations. We believe in enabling clients to maximize their potential and achieve a greater market with a wide array of technology services, including, but not limited to, Enterprise Resource Planning, Application Development, Analytics, Digital, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to enable businesses to accelerate their growth. Plus, we're a team of dreamers and doers who are pushing the boundaries of what's possible. And we want YOU to be a part of it.
Job Title: Data Lead
Experience: 10+ Years
Mode of Hire: Full-time
Responsibilities:
- We are seeking a senior-level database/ETL developer (10-13 years of experience) responsible for building relational and data warehousing applications. The primary responsibility will be to support the existing EDW, design and develop the different layers of our data, and test and document the ETL process.
- Designs and develops framework and services according to specifications within a team environment.
- Prepares detailed system documentation including requirements, specifications, test plans and user manuals.
- Performs unit and system tests and, as needed, validation testing.
- Coordinates with Operations staff on deployment of applications.
- Ensures all activities are performed with quality and compliance.
- Design and implementation of ETL batches that meet the SLAs.
- Development of data collection, data staging, data movement, data quality and archiving strategies.
- Plan and conduct ETL unit and development tests, monitoring results and taking corrective action when necessary.
- Experience in handling slowly changing dimensions using ETL (see the sketch after this posting).
- Design automation processes to control data access, transformation and movement, and ensure source system data availability.
- Assists with database design; expected to have a solid understanding of database design principles and database administration methods and techniques.
- Perform data integrity checks and monitor the performance of data structures.
- Ability to write complex SQL queries, dynamic SQL and stored procedures.
- Ability to work on Data Warehouse migration from an existing platform to Snowflake.
- Preparing time estimates and justification for tasks assigned.
Required Skills:
- 8-10 years of ETL/ELT experience
- Very strong SQL skills, stored procedures and database development skills
- 3-5 years of experience in Azure Data Lake, Synapse, Azure Data Factory and Databricks
- 3-5 years of experience in Snowflake
- A good understanding of the concepts and best practices of data warehouse ETL and ELT design and building relational databases
- Ability to work independently; self-starter
- Strong database experience in DB2, SQL Server, Azure
- Strong in designing relational and dimensional data models
- Good understanding of enterprise reporting, primarily on Power BI
- Understanding of Agile practices and methodology is a plus
Assist with the analysis and extraction of relevant information from large amounts of historical business data to feed Business Intelligence initiatives. Hands-on experience conducting proofs of concept for new technology selection and proposing new data warehouse architecture.
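The slowly-changing-dimension handling mentioned above is commonly implemented as a Type 2 pattern: close out the current row when a tracked attribute changes, then insert a new version. Below is a minimal sketch using pyodbc against SQL Server; the DSN, table, and column names are hypothetical, and NULL-safe comparisons are omitted for brevity.

```python
import pyodbc

# Minimal SCD Type 2 sketch; DSN, table and column names are hypothetical.
conn = pyodbc.connect("DSN=example_dw;Trusted_Connection=yes")
cur = conn.cursor()

# Step 1: expire current dimension rows whose source attributes changed.
cur.execute("""
    UPDATE d
    SET d.effective_to = SYSDATETIME(), d.is_current = 0
    FROM dim_customer AS d
    JOIN stg_customer AS s ON s.customer_id = d.customer_id
    WHERE d.is_current = 1
      AND (s.segment <> d.segment OR s.city <> d.city)
""")

# Step 2: insert a fresh current row for new and changed customers.
cur.execute("""
    INSERT INTO dim_customer (customer_id, segment, city,
                              effective_from, effective_to, is_current)
    SELECT s.customer_id, s.segment, s.city, SYSDATETIME(), NULL, 1
    FROM stg_customer AS s
    LEFT JOIN dim_customer AS d
           ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE d.customer_id IS NULL
""")

conn.commit()
```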
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an American Airlines team member in the Tech Hub in Hyderabad, India, you will have the opportunity to be part of a diverse, high-performing team dedicated to technical excellence. Your primary focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you'll be working in is centered around managing and leveraging data as a strategic asset, including data management, storage, integration, and governance, with a strong emphasis on Machine Learning, AI, Data Science, and Business Intelligence.
In this role, you will work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide valuable insights for better decision-making. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, and more, as well as traditional data warehouse tools. Your responsibilities will include various aspects of the development lifecycle, such as design, cloud engineering, data modeling, testing, performance tuning, deployments, BI, alerting, and production support. You will collaborate within a team environment and independently to develop technical solutions. As part of a DevOps team, you will have ownership of and support for the product you work on, implementing both batch and streaming data pipelines using cloud technologies.
To be successful in this role, you should have a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems, or a related technical discipline, or equivalent experience. You should have at least 1+ years of software solution development experience using agile and DevOps, and data analytics experience using SQL. Experience with cloud development and data lake technologies, particularly in Microsoft Azure, is preferred. Preferred qualifications include additional years of experience in software solution development, data analytics, and full-stack development, and specific experience with Azure technologies. Skills in scripting languages like Python, Spark, Unix, and SQL, as well as expertise with the Azure technology stack and various data platforms and BI analytics tools, are highly valued. Certifications such as the Azure Development Track and Spark are preferred.
Effective communication skills are essential for this role, as you will need to collaborate with team members at all levels within the organization. Physical abilities are also necessary to perform the essential functions of the position safely. American Airlines values inclusion and diversity, providing a supportive environment for all team members to reach their full potential.
If you are ready to be part of a dynamic, tech-driven environment where your creativity and strengths are celebrated, join American Airlines in Hyderabad and immerse yourself in the exciting world of technological innovation. Feel free to be yourself and contribute to keeping the largest airline in the world running smoothly as we care for people on life's journey.
Posted 3 weeks ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Denodo Data Virtualization Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in Hyderabad. You will play a crucial role in developing innovative solutions for business needs.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing the Denodo Data Virtualization Platform effectively
- Ensure seamless integration of applications with the platform
- Optimize performance and troubleshoot any issues that arise
Professional & Technical Skills:
- Must-Have Skills: Proficiency in the Denodo Data Virtualization Platform
- Strong understanding of data virtualization concepts
- Experience in designing and implementing data virtualization solutions
- Knowledge of SQL and database management
- Familiarity with ETL processes and data integration techniques
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Denodo Data Virtualization Platform
- This position is based at our Hyderabad office
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 10.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project Role: Technology Platform Engineer
Project Role Description: Creates production and non-production cloud environments using the proper software tools, such as a platform for a project or product. Deploys the automation pipeline and automates environment creation and configuration.
Must have skills: Denodo Data Virtualization Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: equivalent education
Job Title: Denodo Data Virtualization Platform
Experience Level: Team Lead (Level 9)
Job Overview: We are seeking a skilled and motivated Denodo Data Virtualization Engineer to join our team and optimize our data virtualization environment. The ideal candidate will have hands-on experience with the Denodo Platform and a solid understanding of data virtualization, and will be responsible for configuring, maintaining, and troubleshooting the Denodo Platform, ensuring seamless data integration and availability across various business units. You will work with cross-functional teams to deliver agile data solutions, connecting disparate data sources for real-time access and analysis.
Key Responsibilities:
- Install, configure, and upgrade Denodo Platform components.
- Monitor system performance and proactively resolve issues to ensure high availability.
- Administer Denodo security policies, including access control and user management.
- Collaborate with data engineers and architects to optimize data virtualization solutions.
- Develop and maintain Denodo monitoring scripts for performance tracking (see the sketch below).
- Troubleshoot integration issues between Denodo and data sources such as SQL databases, APIs, and cloud storage.
- Manage metadata, caching, and query optimization within the Denodo Platform.
- Provide technical support and training to business users and developers.
Required Qualifications & Skills:
- Proven experience as a Denodo Platform Administrator or similar role.
- Strong knowledge of Denodo Platform architecture, caching, and query optimization.
- Expertise in SQL, data virtualization, and data integration concepts.
- Hands-on experience with Azure cloud platforms and APIs.
- Familiarity with ETL tools, database management, and security best practices.
- Excellent troubleshooting skills and ability to work in a fast-paced environment.
- Strong communication skills for collaboration with cross-functional teams.
Preferred Qualifications:
- Denodo Certified Professional certification
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Denodo Data Virtualization Platform.
- This position is based at our Bengaluru office.
- An equivalent education is required.
Qualification: equivalent education
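A hedged sketch of the kind of monitoring script mentioned above: it opens a JDBC session to the Denodo VDP server with jaydebeapi and lists the views in a database as a basic liveness and metadata check. The host, credentials, driver-jar path, and database name are hypothetical; the driver class, jdbc:vdb:// URL, and GET_VIEWS() catalog procedure follow Denodo's documented conventions, but verify them against your installation.

```python
import jaydebeapi

# Hypothetical connection details; the driver class and jdbc:vdb:// URL
# scheme follow Denodo's JDBC conventions (default VDP port 9999).
conn = jaydebeapi.connect(
    "com.denodo.vdp.jdbc.Driver",
    "jdbc:vdb://denodo-host.example.com:9999/sales_db",
    ["monitor_user", "secret"],
    jars="/opt/denodo/tools/client-drivers/jdbc/denodo-vdp-jdbcdriver.jar",
)

try:
    cur = conn.cursor()
    # GET_VIEWS() is a Denodo catalog stored procedure; querying it doubles
    # as a liveness check and an inventory of the views in the database.
    cur.execute("SELECT name FROM GET_VIEWS()")
    views = [row[0] for row in cur.fetchall()]
    print(f"VDP reachable; {len(views)} views visible in sales_db")
finally:
    conn.close()
```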
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation. Contribute to reusable components / assets / accelerators to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python etc.) with Cloudera or Hortonworks
Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary etc.
Posted 4 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Job Requirement: Denodo Developer
Experience: 5-9 years
Location: Bangalore, Noida, Chennai, Mumbai, Hyderabad, Pune
Shift Time: CET (12:30 to 9:30 IST)
Job Description: We are seeking a highly skilled and experienced Denodo Developer with a strong background in ETL processes and deep knowledge of the Life Sciences domain. The ideal candidate will be responsible for developing data virtualization solutions, integrating complex datasets from multiple sources, and enabling real-time data access for analytics and operational reporting. This role requires close collaboration with data architects, data engineers, and business stakeholders in a regulated environment.
Key Proficiency & Responsibilities:
- Design, develop, and optimize data virtualization solutions using the Denodo Platform.
- Integrate structured and unstructured data sources into Denodo views and services.
- Develop custom views, VQL scripts, and data services (REST/SOAP) (see the sketch below).
- Build and optimize ETL/ELT pipelines to support data ingestion and transformation.
- Work closely with Life Sciences business teams to translate domain-specific requirements into data solutions.
- Implement data governance, security, and compliance practices adhering to GxP and FDA regulations.
- Provide support for data access, lineage, metadata management, and user training.
- Collaborate with cross-functional teams in an Agile development environment.
- Optimize workflows for performance and scalability.
- Develop and maintain data documentation, including workflow descriptions and data dictionaries.
- Strong knowledge of data preparation, ETL concepts, and data warehousing.
- Excellent analytical, problem-solving, and communication skills.
- Proficient in VQL, JDBC, ODBC, and web services integration.
- Strong expertise in ETL tools (e.g., Informatica, Talend, DataStage, or Azure Data Factory).
- Deep understanding of the Life Sciences domain: clinical trials, regulatory data, pharmacovigilance, or research & development.
Preferred Qualifications:
- B.Tech. or MCA from a recognized university
- Minimum 5+ years of relevant experience as a Denodo Developer
- Strong SQL and database skills (Oracle, SQL Server, PostgreSQL, etc.)
- Knowledge of data modelling, data warehousing, and virtual data layers
- Experience with cloud platforms (AWS, Azure, or GCP) is a plus
- Experience working in Agile/Scrum environments
Interested in or know someone great? Send your resume to durgesh.kumar@mounttalent.com or DM me directly.
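As an illustration of the VQL development mentioned above, the snippet below creates a derived view joining two hypothetical base views. VQL's CREATE OR REPLACE VIEW ... AS SELECT follows Denodo's SQL-like syntax; here it is simply held as a Python string and submitted through an open JDBC cursor (connection setup as in the admin sketch earlier). All view and column names are made up.

```python
# Hypothetical VQL: a derived view joining two base views already imported
# into Denodo from source systems; view and column names are made up.
CREATE_TRIAL_VIEW = """
CREATE OR REPLACE VIEW iv_trial_enrollment AS
SELECT t.trial_id,
       t.protocol_code,
       s.site_country,
       COUNT(s.subject_id) AS enrolled_subjects
FROM bv_clinical_trials t
JOIN bv_trial_subjects s ON t.trial_id = s.trial_id
GROUP BY t.trial_id, t.protocol_code, s.site_country
"""

def deploy_view(cursor):
    """Submit the VQL through an open Denodo JDBC cursor (e.g. jaydebeapi)."""
    cursor.execute(CREATE_TRIAL_VIEW)
```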
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Denodo Data Virtualization Platform, Microsoft SQL Server Reporting Services, Microsoft SQL Server Administration
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Software Development Lead, you will oversee the development process and lead the team to success, ensuring project milestones are met and quality standards are maintained. You will collaborate with various teams to drive key decisions and deliver innovative solutions.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead team meetings to discuss progress and challenges
- Mentor junior team members to enhance their skills
- Identify opportunities for process improvement and implement solutions
Professional & Technical Skills:
- Must-Have Skills: Proficiency in the Denodo Data Virtualization Platform, Microsoft SQL Server Administration, Microsoft SQL Server Reporting Services
- Strong understanding of data virtualization concepts
- Experience in designing and implementing data virtualization solutions
- Knowledge of SQL query optimization and performance tuning
- Ability to troubleshoot and resolve database issues
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Denodo Data Virtualization Platform
- This position is based at our Mumbai office
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 1 month ago
12.0 - 17.0 years
6 - 10 Lacs
Mumbai
Work from Office
Role Overview: We are looking for an experienced Denodo SME to design, implement, and optimize data virtualization solutions using Denodo as the enterprise semantic and access layer over a Cloudera-based data lakehouse. The ideal candidate will lead the integration of structured and semi-structured data across systems, enabling unified access for analytics, BI, and operational use cases.
Key Responsibilities:
- Design and deploy the Denodo Platform for data virtualization over Cloudera, RDBMS, APIs, and external data sources.
- Define logical data models, derived views, and metadata mappings across layers (integration, business, presentation).
- Connect to Cloudera Hive, Impala, Apache Iceberg, Oracle, and other on-prem/cloud sources.
- Publish REST/SOAP APIs and JDBC/ODBC endpoints for downstream analytics and applications.
- Tune virtual views, caching strategies, and federation techniques to meet performance SLAs for high-volume data access (see the sketch below).
- Implement Denodo smart query acceleration, usage monitoring, and access governance.
- Configure role-based access control (RBAC) and row/column-level security, and integrate with enterprise identity providers (LDAP, Kerberos, SSO).
- Work with data governance teams to align Denodo with enterprise metadata catalogs (e.g., Apache Atlas, Talend).
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 8-12 years in data engineering, with 4+ years of hands-on experience in the Denodo Platform.
- Strong experience integrating RDBMS (Oracle, SQL Server), Cloudera CDP (Hive, Iceberg), and REST/SOAP APIs.
- Denodo Admin Tool, VQL, Scheduler, Data Catalog; SQL, shell scripting, basic Python (preferred).
- Deep understanding of query optimization, caching, memory management, and federation principles.
- Experience implementing data security, masking, and user access control in Denodo.
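To make the caching and access-control items above more concrete, here is a hedged VQL fragment, again wrapped as Python strings for submission over JDBC. It enables full caching on an expensive derived view and exposes a row-restricted view for broader audiences. The view names are hypothetical, and exact cache and privilege syntax varies by Denodo version, so treat this as a sketch to check against the VQL reference.

```python
# Hypothetical VQL submitted over an open Denodo JDBC cursor.
# Cache/privilege syntax varies by Denodo version; verify before use.
TUNING_VQL = [
    # Enable full caching on an expensive federated view so repeated BI
    # queries hit the cache database instead of the Cloudera sources.
    "ALTER VIEW iv_trial_enrollment CACHE FULL",

    # A simple row-level-security pattern: expose a restricted derived view
    # instead of the base view, filtering out sensitive rows.
    """
    CREATE OR REPLACE VIEW iv_trial_enrollment_public AS
    SELECT * FROM iv_trial_enrollment
    WHERE site_country <> 'RESTRICTED'
    """,
]

def apply_tuning(cursor):
    """Run each statement through a Denodo JDBC cursor (e.g. jaydebeapi)."""
    for stmt in TUNING_VQL:
        cursor.execute(stmt)
```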
Posted 1 month ago
2.0 - 5.0 years
5 - 8 Lacs
Madhwapur
Work from Office
Roles and Responsibilities:
- Collaborate with cross-functional teams to design, develop, and deploy Denodo solutions.
- Develop high-quality code that meets industry standards and best practices.
- Troubleshoot and resolve technical issues efficiently.
- Participate in code reviews and contribute to improving overall code quality.
- Stay updated with the latest trends and technologies in Denodo development.
- Contribute to the development of new features and functionalities.
Job Requirements:
- Proficiency in Denodo development is mandatory.
- Strong understanding of software development principles and methodologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with agile development methodologies and version control systems.
Posted 1 month ago
5.0 - 10.0 years
11 - 21 Lacs
Pune, Peth
Work from Office
Contract-to-Hire Position
Denodo version: 8.0
Denodo platform: cloud preferred (AWS, Azure)
Denodo role: development preferred (with basic admin knowledge)
Denodo advanced skills: performance tuning, troubleshooting issues, code fixes, unit testing, optimization
Denodo development skills: creating APIs, creating source and target connections, creating complex views (see the sketch below)
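One way the published artifacts above get consumed: Denodo exposes views through its global RESTful web service, conventionally under a /denodo-restfulws/{database}/views/{view} path. Below is a hedged sketch querying such an endpoint with the requests library; the host, port, database, view, credentials, and the "elements" response key are assumptions to confirm against your deployment.

```python
import requests

# Hypothetical Denodo RESTful web service endpoint; the
# /denodo-restfulws/{database}/views/{view} path follows Denodo's
# conventional layout, but confirm it against your deployment.
BASE = "https://denodo-host.example.com:9443/denodo-restfulws"
url = f"{BASE}/sales_db/views/iv_trial_enrollment"

resp = requests.get(
    url,
    params={"$format": "json", "$filter": "site_country = 'IN'"},
    auth=("report_user", "secret"),
    timeout=30,
)
resp.raise_for_status()

# Denodo's JSON responses typically wrap rows under an "elements" key.
for row in resp.json().get("elements", []):
    print(row)
```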
Posted 1 month ago
4.0 - 9.0 years
20 - 35 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Design and implement Snowflake data warehousing solutions, including data modelling and schema design in Snowflake. Able to source data from APIs, data lakes, and on-premise systems into Snowflake. Share your updated CV at jatin@smrd.in
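A hedged sketch of the schema-design and loading work described above, using the snowflake-connector-python package. The account, warehouse, database, stage, and table names are hypothetical; COPY INTO with a named stage is a common pattern for loading files landed from a data lake.

```python
import snowflake.connector

# Hypothetical connection parameters for snowflake-connector-python.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="secret",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="SALES",
)
cur = conn.cursor()

# A simple dimensional-model DDL: one fact table for order events.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id    NUMBER,
        order_date  DATE,
        customer_id NUMBER,
        amount      NUMBER(18,2)
    )
""")

# Load staged Parquet files (e.g. landed from a data lake) into the table.
cur.execute("""
    COPY INTO fact_orders
    FROM @sales_stage/orders/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")

conn.close()
```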
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation. Contribute to reusable components / assets / accelerators to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python etc.) with Cloudera or Hortonworks
Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary etc.
Posted 1 month ago
5.0 - 9.0 years
18 - 27 Lacs
Pune, Chennai, Bengaluru
Work from Office
Gather a complete understanding of the requirements and task out Epics/user stories planned in a Release/Sprint. Understand the current and proposed design and prepare LLD for Epics, considering the dependencies across domains and other teams. Arrive at a WBS for Epics planned in a Release/Sprint. Perform code reviews.
Required Candidate Profile:
Skill Specialization: JBoss Fuse
Mandatory Skills: Java + Apache Camel/JBoss Fuse, Java Microservices
Desired Skills: Denodo, SQL
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Kochi
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation. Contribute to reusable components / assets / accelerators to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority.
Required education: Bachelor's Degree
Preferred education: Non-Degree Program
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python etc.) with Cloudera or Hortonworks
Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary etc.
Posted 1 month ago
4.0 - 9.0 years
10 - 20 Lacs
Gurugram
Work from Office
Roles and Responsibilities
- Design, develop, test, deploy, and maintain ETL processes using SSIS to extract data from various sources.
- Develop complex SQL queries to retrieve data from relational databases such as SQL Server.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Troubleshoot issues related to ETL process failures or performance problems.
- Ensure compliance with security standards by implementing the Denodo Platform for data masking.
Desired Candidate Profile
- 4-9 years of experience in ETL development with expertise in Agile methodology.
- Strong understanding of .NET Core, C#, Microsoft Azure, BigQuery, SSRS (SQL Server Reporting Services), SSIS (SQL Server Integration Services).
- B.Tech/B.E. degree in any specialization.
- Hands-on experience with databases (MS SQL Server, BigQuery, Denodo).
- Experience in .NET / Visual Studio (SSRS, SSIS & ETL packages).
- Good knowledge of requirement elicitation, from workshops/meetings to Agile board epics/features/stories.
Posted 1 month ago
2.0 - 5.0 years
5 - 14 Lacs
Bengaluru
Hybrid
Denodo Developer/Admin
- Candidate should have 2 to 5 years of Denodo development/administration experience
- Should have experience with Denodo backend services
- Should have exposure to any cloud: AWS/Azure/GCP
- Proficient in SQL
- Good to have: support experience
- Should be willing to work in rotational shifts
Posted 1 month ago