8.0 - 15.0 years
20 - 25 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Associate Director, Tech SME.
In this role, you will:
- Design and engineer software with the customer/user experience as a key objective.
- Actively contribute to the Technology Engineering Practice by sharing subject matter expertise from your area of specialism, best practice and learnings.
- Drive adherence to all standards and policies within your area of Technology Engineering.
- Deliver and support data-related infrastructure and architecture to optimise data storage and consumption across the bank, including addressing functional and non-functional requirements relevant to data in large applications.
- Engineer and implement security measures for the protection of internal systems and services.
- Establish an environment that minimises variation and ensures predictable, high-quality code and data.
- Assist in team development while holding teams accountable for their commitments, removing roadblocks to their work, leveraging organisational resources to improve capacity for project work, and mentoring and developing team members.
- Promote empowerment of the team, ensure that each team member is fully engaged in the project and making a meaningful contribution, and encourage a sustainable pace with high levels of quality for the team.
- Manage stakeholder communications and help to implement an effective system of project governance.
Requirements
To be successful in this role, you should meet the following requirements:
- Knowledge of DataStage, Oracle and Unix.
- Experience with Spring Boot APIs, Kubernetes, Postman and GCP.
- Experience designing and implementing DevOps Continuous Integration / Continuous Delivery (CI/CD) pipelines.
- Experience supporting middleware/database design, build and troubleshooting in development and production environments.
- Experience with monitoring and alerting tools such as (but not limited to) AppDynamics and Splunk.
- Experience in agile and DevOps environments using team collaboration tools such as GitHub, Confluence and JIRA.
- Working with Ops and Dev engineers to ensure operational issues are identified and addressed at all stages of a product or service release/change.
- Provide support in the identification and resolution of all incidents associated with the IT service.
- Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered.
- Keep up to date on current tools and technologies, and maintain expertise in areas such as cyber security and applicable regulations pertaining to data privacy, consent, data residency etc.
- Provide technical leadership to a large team of developers and help develop the team's capabilities.
- Work with senior business stakeholders.
Posted 3 weeks ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Work from Office
Job Title: Snowflake Developer
Experience: 6-8 Years
Location: Chennai - Hybrid
Requirements:
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).
Posted 3 weeks ago
3.0 - 7.0 years
8 - 13 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
- Work as a senior full stack engineer with deep hands-on experience and knowledge of ETL (Extract, Transform, Load) tools (e.g. IBM DataStage), SQL, shell scripting, Control-M job development, API development, design patterns, SDLC, IaC tools, testing and site reliability engineering.
- Define and implement best practices for software development, frameworks, and patterns, including coding standards, code reviews, and testing methodologies.
- Be a generalist with breadth and depth of experience in CI/CD best practices, with core experience in one of the following areas: Software Development (i.e. secure coding/SDLC/API development/clean code management), Testing (i.e. TDD/BDD/automated testing/contract testing/API testing), or Site Reliability Engineering (i.e. release engineering/observability/risk management).
- See a problem or an opportunity and have the ability to engineer a solution; be respected for what you deliver, not just what you say; think about the business impact of your work and take a holistic view of problem-solving.
- Have proven industry experience in developing code and defining processes and standards, with the ability to pick up new technologies and challenges and apply thinking to many problems across multiple technical domains.
- Contribute to architectural discussions by asking the right questions to ensure a solution matches the business needs.
- Identify opportunities for system optimisation, performance tuning, and scalability enhancements, and implement solutions to improve system efficiency and reliability.
- Demonstrate excellent verbal and written communication skills to articulate technical concepts to both technical and non-technical stakeholders.
Requirements
To be successful in this role, you must meet the following requirements:
- Tech-forward in thinking, actively researching new ideas/processes and acting as the driving force to adopt them.
- Ability to work across cultures and all locations in a complex, matrix organisation, with proven experience delivering engineering solutions to a banking or financial services organisation.
- Excellent leadership and team management skills, with the ability to motivate and inspire teams to achieve their goals, and the ability to analyse complex technical and business problems and develop effective solutions.
- Experience managing operational functions and directing process re-engineering and efficiency exercises.
- Strong ability to balance risks vs rewards, maximising cost effectiveness and profitability for the business.
- Experience with Agile methodologies and software development processes.
Good to have: knowledge of the latest technologies and tools, such as Scala, Python, Dataflow, Databricks, Apache Spark SQL, Hadoop, REST APIs, databases, Kafka, and cloud technologies (e.g. GCP, AWS), will be an added advantage.
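As a hedged illustration of the TDD/automated-testing practice this role calls for, a minimal pytest-style unit test in Python might look like the following. The `mask_account_number` function and its masking rules are hypothetical examples, not part of the role description.

```python
def mask_account_number(account: str) -> str:
    """Mask all but the last four characters of an account number."""
    if len(account) <= 4:
        return account
    return "*" * (len(account) - 4) + account[-4:]

# In TDD these tests would be written first, fail, and then drive the
# implementation above until they pass.
def test_masks_all_but_last_four():
    assert mask_account_number("1234567890") == "******7890"

def test_short_input_is_returned_unchanged():
    assert mask_account_number("123") == "123"
```

Writing the failing test before the implementation is the core of the TDD loop; the same assertions then run unattended in a CI/CD pipeline on every commit.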
Posted 3 weeks ago
4.0 - 8.0 years
0 - 1 Lacs
Bengaluru
Work from Office
We're Hiring: Sr. Software Engineer - SnapLogic | Bangalore | 4-8 Years Experience
Job Title: Sr. Software Engineer – SnapLogic
Location: Bangalore
Experience: 4–8 Years
Client & Budget: Will be discussed during the call
Notice Period: Immediate to 30 Days preferred
Key Responsibilities
- Design and develop SnapLogic pipelines for enterprise data integration
- Migrate ETL jobs into SnapLogic and manage platform moderation on AWS
- Work closely with cross-functional teams to gather integration requirements
- Configure SnapLogic components (snaps, pipelines, transformations) for optimized performance
- Ensure data quality and reliability through well-structured ETL processes
- Keep up with new SnapLogic features and best practices to enhance platform usage
- Collaborate with business stakeholders to deliver long-term, sustainable solutions
Required Skills
- SnapLogic: 2–4 years of hands-on experience in pipeline development & debugging
- ETL Tools: Experience with tools like DataStage, Informatica
- Cloud & Data Warehousing: AWS Cloud exposure and hands-on Snowflake experience
- Databases: Strong in SQL, PL/SQL, and RDBMS concepts
- ETL Best Practices: Data transformation, cleansing, and mapping
Bonus: SnapLogic Developer Certification is a big plus!
Why Join Us?
- Work on cutting-edge integration projects with modern tech stacks
- Be part of a collaborative and forward-thinking engineering team
- Opportunity to work with enterprise clients and mission-critical data platforms
Ready to Apply? Send your CV to [ YourEmail@example.com ] or DM me to learn more.
Posted 3 weeks ago
8.0 - 12.0 years
17 - 25 Lacs
Bengaluru
Work from Office
Dear Candidate,
We have a job opening for a SnapLogic Developer with one of our clients. If you are interested in this position, please share your updated resume to this email id: shaswati.m@bct-consulting.com
Job location: Bangalore
Experience: 7-10 Years
Job Description
- Must have hands-on experience (min 6-8 years) in SnapLogic pipeline development with good debugging skills.
- ETL job migration experience into SnapLogic, platform moderation, and cloud exposure on AWS.
- Good to have: SnapLogic developer certification, hands-on experience in Snowflake.
- Should be strong in SQL, PL/SQL and RDBMS.
- Should be strong in ETL tools like DataStage, Informatica etc. with data quality.
- Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations.
- Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources.
- Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping.
- Experience in designing, developing and deploying reliable solutions.
- Ability to work with business partners and provide long-lasting solutions.
- SnapLogic Integration - Pipeline Development.
- Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.
Posted 3 weeks ago
8.0 - 10.0 years
15 - 20 Lacs
Gurugram
Work from Office
Position Summary:
We are looking for an experienced Microsoft 365 Specialist to join our dynamic team to streamline enterprise project data. The ideal candidate will possess strong proficiency in Microsoft 365 applications and Generative AI tools, along with extensive knowledge of data governance principles. This role will focus on data aggregation, integration, and the development of a robust data architecture to ensure data integrity and accessibility across multiple digital projects in the organization. The candidate should be capable of acting as a developer to build a future-proof architecture that connects the various data storage options in our Digital business groups. This would make our digital projects future-proof and AI-implementation ready with respect to data flow and data quality, and lead to overall operational excellence.
How You'll Make an Impact (responsibilities of role)
- Utilize the full suite of Microsoft 365 applications to streamline data & workflows across different digital projects and segments, customizing them as required.
- Act as a developer to build a future-proof architecture that connects various data storage options, including applications, cloud services, drives, and SharePoint.
- Ensure the designed architecture consolidates fragmented data from various sources to create a single, reliable source of truth for accurate reporting and analysis.
- Integrate and leverage Generative AI tools, such as Copilot, to improve data analysis and reporting capabilities.
- Implement data governance policies, workflows and practices to ensure data quality, security, and compliance with relevant regulations.
- Apply data integration and transformation techniques, including ETL (Extract, Transform, Load) processes, to ensure data consistency and accuracy.
- Collaborate with stakeholders to identify data needs and ensure accurate reporting and analysis.
- Ensure data integrity and accessibility across the organization, enabling informed decision-making.
- Communicate effectively with cross-functional teams and stakeholders to understand data requirements and deliver solutions that meet business needs.
- Provide training and support to team members on data governance policies, procedures and required operability of Microsoft 365 tools.
- Keep abreast of new features and capabilities in Microsoft 365 related to data governance.
What You Bring
- Bachelor's/master's degree in information technology, computer science, or a related field.
- 8 to 10 years of experience in developing architectures for data governance.
- Proven experience with Microsoft 365 applications and Generative AI tools, like Copilot.
- Strong understanding of data governance principles, practices, and policies.
- Experience utilizing a variety of database management systems and data exchange formats to optimize data storage, retrieval, and interoperability.
- Knowledge of relevant industry regulations and standards.
- Proficiency in data architecture design.
- Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
- Strong problem-solving skills and the ability to work collaboratively in a dynamic team environment across the globe.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description:
As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.
Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL.
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.
Qualifications:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or similar role.
- Hands-on experience with data models and DB performance tuning in PostgreSQL.
- Experience in implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
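As an illustrative sketch only: DBT models are SQL files templated with Jinja, and the incremental-load idea mentioned in the requirements can be mimicked with plain Python string formatting. The table names and filter logic here are hypothetical, and a real DBT model would use Jinja constructs such as `is_incremental()` and `{{ this }}` rather than this stdlib stand-in.

```python
# Skeleton of an incremental model: on a full refresh the filter is empty,
# on an incremental run only rows newer than the target's high-water mark load.
INCREMENTAL_MODEL = """
select id, amount, updated_at
from {source_table}
{incremental_filter}
"""

def render_model(source_table: str, is_incremental: bool,
                 target_table: str = "analytics.orders") -> str:
    """Render the templated SQL for a full or incremental run."""
    filt = (
        f"where updated_at > (select max(updated_at) from {target_table})"
        if is_incremental else ""
    )
    return INCREMENTAL_MODEL.format(
        source_table=source_table, incremental_filter=filt
    ).strip()
```

The rendered string changes shape per run type, which is exactly the behaviour DBT's Jinja templating gives an incremental model.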
Posted 3 weeks ago
6.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Diverse Lynx is looking for a DataStage Developer to join our dynamic team and embark on a rewarding career journey.
- Analyzing business requirements and translating them into technical specifications
- Designing and implementing data integration solutions using DataStage
- Extracting, transforming, and loading data from various sources into target systems
- Developing and testing complex data integration workflows, including the use of parallel processing and data quality checks
- Collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data
- Monitoring performance and optimizing DataStage jobs to ensure they run efficiently and meet SLAs
- Troubleshooting issues and resolving problems related to data integration
- Knowledge of data warehousing, data integration, and data processing concepts
- Strong problem-solving skills and the ability to think creatively and critically
- Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders
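The data quality checks mentioned above can be sketched in plain Python as a reject-link-style filter. The field names and rules are hypothetical illustrations; in DataStage itself this logic would typically live in a Transformer stage with a reject link.

```python
def run_quality_checks(rows, required_fields=("id", "amount")):
    """Split rows into valid/rejected based on simple data-quality rules."""
    valid, rejected = [], []
    for row in rows:
        if any(row.get(f) in (None, "") for f in required_fields):
            rejected.append((row, "missing required field"))
        elif not str(row["id"]).isdigit():
            rejected.append((row, "non-numeric id"))
        else:
            valid.append(row)
    return valid, rejected
```

Routing rejects to a separate output with a reason code, rather than silently dropping them, is what makes downstream reconciliation and SLA reporting possible.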
Posted 3 weeks ago
3.0 - 7.0 years
13 - 14 Lacs
Hyderabad
Work from Office
To influence key stakeholders to achieve the best desired outcome. Responsible for translating detailed designs into robust, scalable and reusable solutions that deliver an exceptional user experience, and for communicating the design and key design decisions to related parties. Carrying out detailed technical analysis of projects, changes and implementations to production. A desire to find ways to continually improve the service delivered to customers.
You will be working in product development and production support as per project requirements, and may need to work UK shifts and occasionally weekends as per project needs. As part of production support, you need to understand and solve issues, or at least be vigilant in escalating a ticket within the timelines if you are unable to solve it or require another team's help. You should be able to identify bottlenecks in the process and automate them wherever feasible, and proactively identify and resolve issues wherever required. Being self-motivated, focused and able to work efficiently to deadlines is essential.
Requirements
To be successful in this role, you should meet the following requirements:
- Must have good knowledge of Oracle Hyperion Financial Management (HFM), Financial Data Management Enterprise Edition (FDMEE), and IT infrastructure architecture design and implementation.
- Must have good knowledge and experience of PowerShell scripting, Oracle DB, SQL, Control-M and DataStage.
- Good knowledge of DevOps tooling like Ansible, G3, and CI/CD pipelines.
- Knowledge of collaboration tools, preferably JIRA and Confluence.
- Good knowledge of HSBC internal systems like ServiceNow, ICE controls, DMOV, DUSE etc.
- Good understanding of Incident Management/Change Management/Problem Management.
- Both spoken and written communication skills, with experience of adapting your style and approach to the audience and message to be delivered.
- Good to have: Python and Django skills.
Posted 3 weeks ago
4.0 - 5.0 years
6 - 10 Lacs
Kochi, Bengaluru
Work from Office
- 4+ years' experience
- Work from office: 1st preference Kochi, 2nd preference Bangalore
- Good experience in any ETL tool
- Good knowledge of Python
- Integration experience
- Good attitude and cross-skilling ability
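As a minimal sketch of the ETL-plus-Python combination this role asks for, the three stages can be shown end to end with only the standard library. The column names and the in-memory SQLite target are illustrative assumptions, not requirements from the posting.

```python
import csv
import io
import sqlite3

def etl(csv_text: str) -> list[tuple]:
    """Extract rows from CSV text, transform them, and load into SQLite."""
    # Extract: parse the CSV into dictionaries keyed by header.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: normalise names and cast amounts to floats.
    cleaned = [(r["name"].strip().title(), float(r["amount"])) for r in rows]
    # Load: insert into an in-memory SQLite table and read it back.
    con = sqlite3.connect(":memory:")
    con.execute("create table sales (name text, amount real)")
    con.executemany("insert into sales values (?, ?)", cleaned)
    return list(con.execute("select name, amount from sales order by name"))
```

A commercial ETL tool replaces each stage with a visual component (source connector, transformer, target stage), but the extract/transform/load shape stays the same.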
Posted 3 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
This position is responsible for the design, implementation, and support of MetLife's enterprise data management and integration systems, the underlying infrastructure, and integrations with other enterprise systems and applications using AIX, Linux, or Microsoft technologies.
- Provide technical expertise in the planning, engineering, design, implementation and support of data management and integration system infrastructures and technologies, including the systems' operational procedures and processes.
- Partner with the Capacity Management, Production Management, and Application Development teams and the business to ensure customer expectations are maintained and exceeded.
- Participate in the evaluation and recommendation of new products and technologies; maintain knowledge of emerging technologies for application to the enterprise.
- Identify and resolve complex data management and integration system issues (Tier 1 support) utilizing product knowledge and structured troubleshooting tools and techniques.
- Support Disaster Recovery implementation and testing as required.
- Experience in designing and developing automation/scripting (shell, Perl, PowerShell, Python, Java).
- Begin tackling organizational impediments.
- Willing to work in rotational shifts.
- Good communication skills with the ability to communicate clearly and effectively.
Knowledge, Skills and Abilities
Education: Bachelor's degree in computer science, Information Systems, or a related field.
Experience: 3+ years of total experience and at least 2+ years of experience in Informatica application implementation and support of data management and integration system infrastructures and technologies.
Technical skills:
- Informatica PowerCenter
- Operating system knowledge (Linux/Windows/AIX)
- Azure DevOps pipeline knowledge
- Enterprise scheduling knowledge (Maestro)
- Troubleshooting
- Communications
- CP4D
- DataStage
- Mainframe z/OS knowledge
- Experience in creating and working on ServiceNow tasks/tickets
Other Requirements (licenses, certifications, specialized training - if required)
Working Relationships
- Internal contacts (and purpose of relationship): MetLife internal partners
- External contacts (and purpose of relationship), if applicable: MetLife external partners
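A small Python sketch of the kind of troubleshooting automation this role mentions: counting errors per component in a log so Tier 1 triage can be prioritised. The log line format and component names are hypothetical examples.

```python
import re

def summarise_errors(log_lines):
    """Count ERROR occurrences per component to prioritise triage."""
    counts = {}
    pattern = re.compile(r"ERROR\s+\[(?P<component>\w+)\]")
    for line in log_lines:
        m = pattern.search(line)
        if m:
            component = m.group("component")
            counts[component] = counts.get(component, 0) + 1
    # Most frequent offenders first.
    return sorted(counts.items(), key=lambda kv: -kv[1])
```

In practice a script like this would feed a ServiceNow ticket or a dashboard rather than returning a list, but the parse-count-rank shape is the common core of such automation.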
Posted 3 weeks ago
2.0 - 3.0 years
3 - 8 Lacs
Pimpri-Chinchwad, Pune
Work from Office
Role & responsibilities
- Develop, implement & fine-tune deep learning AI models for a wide range of computer vision applications including object classification, object detection, segmentation, OCR & NLP.
- Perform data annotation using automation scripts & preprocessing to prepare high-quality datasets for training our in-house deep learning models.
- Train AI models, run model inference, and evaluate model accuracy & performance metrics using frameworks like PyTorch and TensorFlow.
- Collaborate with software engineers to deploy models in production systems, ensuring scalability, reliability and efficiency.
- Develop APIs using FastAPI or similar frameworks to enable seamless integration and interaction with deployed models.
Requirements
- Experience in deep learning, computer vision, and NLP with a focus on object classification, detection, segmentation, OCR & text processing.
- Python, C++; understanding of deep learning frameworks such as TensorFlow, PyTorch or Keras.
- Experience with AI model training, hyperparameter tuning, and evaluation on large-scale datasets.
- Familiarity with data annotation & preprocessing techniques to prepare datasets for deep learning models.
- Version control systems like Git.
- Problem-solving skills, communication skills, collaboration skills, team player.
- AI model deployment & serving, including the use of APIs and frameworks like FastAPI.
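The model-evaluation work described above often starts from simple geometric metrics. As an illustrative sketch, intersection-over-union (IoU), the standard box-overlap metric used to score object detections, can be computed in plain Python:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle: clamp to zero when the boxes do not intersect.
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    return inter / union if union else 0.0
```

A detection is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5; metrics like mAP are built on top of exactly this computation.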
Posted 3 weeks ago
3 - 5 years
3 - 7 Lacs
Chennai
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Mandatory Skills: SQL Server.
Experience: 3-5 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Noida
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft SQL Server Integration Services (SSIS)
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Graduate Required
Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft SQL Server Integration Services (SSIS). Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones.
Roles & Responsibilities:
- Lead the effort to design, build, and configure applications using Microsoft SQL Server Integration Services (SSIS).
- Act as the primary point of contact for the project team, ensuring timely delivery of project milestones.
- Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications.
- Provide technical guidance and mentorship to junior team members, ensuring their professional growth and development.
- Stay updated with the latest advancements in Packaged Application Development, integrating innovative approaches for sustained competitive advantage.
Professional & Technical Skills:
- Must Have Skills: Strong experience in Microsoft SQL Server Integration Services (SSIS).
- Good To Have Skills: Experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, building, and configuring applications using Microsoft SQL Server Integration Services (SSIS).
- Strong understanding of database concepts and SQL programming.
- Experience in performance tuning and optimization of ETL processes.
- Experience in working with large datasets and complex data structures.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft SQL Server Integration Services (SSIS).
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
Qualifications: Graduate Required
Posted 1 month ago
5 - 10 years
3 - 7 Lacs
Chennai
Work from Office
Snowflake Developer
Mandatory skills: Snowflake DB developer + Python & Unix scripting + SQL queries
Location: Chennai
Notice Period: 0 to 30 days
Experience: 5 to 10 Years
Skill set: Snowflake, Python, SQL and PBI developer.
- Understand and implement data security and data modelling.
- Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake.
- Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems.
- Understand cloud architecture.
- Develop and design PBI dashboards, reports, and data visualizations.
- Communication skills.
Support responsibilities:
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, timely escalate them to TA & SES.
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.
Team and capability building:
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance
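As an illustrative sketch of the Snowflake SQL work described above: an upsert from a staging table is commonly written as a MERGE statement, and the statement text can be assembled in Python. The table and column names are hypothetical, and a real implementation would execute the result through the Snowflake connector or inside a stored procedure rather than just building the string.

```python
def build_merge_sql(target: str, staging: str, key: str, columns: list[str]) -> str:
    """Build a Snowflake-style MERGE statement for an upsert from staging."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"merge into {target} t using {staging} s on t.{key} = s.{key} "
        f"when matched then update set {set_clause} "
        f"when not matched then insert ({cols}) values ({vals})"
    )
```

Generating the statement from a column list keeps one upsert pattern reusable across many staging tables, which is typical of the templated-ETL style these postings describe.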
Posted 1 month ago
3 - 7 years
6 - 10 Lacs
Pune
Work from Office
About The Role Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as team player with other software engineers and stakeholders. Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications.1. Applies scientific methods to analyse and solve software engineering problems.2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance.3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.5. The software engineer collaborates and acts as team player with other software engineers and stakeholders. About The Role - Grade Specific Has more than a year of relevant work experience. Solid understanding of programming concepts, software design and software development principles. 
Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid-term horizon. Works co-operatively with others to achieve team goals, has a direct and positive impact on project performance, and makes decisions based on their understanding of the situation, not just the rules. Skills (competencies): Verbal Communication
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: Senior Software Engineer - ETL Developer Main location: Hyderabad / Bangalore / Chennai Employment Type: Full Time Experience: 5 to 10 yrs Role & responsibilities: Looking for a Senior ETL Developer who has: • ETL Development & Implementation: Strong experience in designing, developing, and deploying ETL solutions using Informatica Cloud Services (ICS), Informatica PowerCenter, and other data integration tools. • Data Integration & Optimization: Proficiency in extracting, transforming, and loading (ETL) data from multiple sources, optimizing performance, and ensuring data quality. • Stakeholder Collaboration: Skilled at working with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data solutions with business needs. • Scripting & Data Handling: Experience with SQL, PL/SQL, and scripting languages (e.g., Python, Shell) for data manipulation, transformation, and automation. • Tool Proficiency: Familiarity with Informatica Cloud, version control systems (e.g., Git), JIRA, Confluence, and the Microsoft Office Suite. • Agile Methodologies: Knowledge of Agile frameworks (Scrum, Kanban), with experience in managing backlogs, writing user stories, and participating in sprint planning. • Testing & Validation: Involvement in ETL testing, data validation, unit testing, and integration testing to ensure accuracy, consistency, and completeness of data. • Problem-Solving Skills: Strong analytical mindset to troubleshoot, debug, and optimize ETL workflows, data pipelines, and integration solutions effectively. • Communication & Documentation: Excellent written and verbal communication skills to document ETL processes, create technical design documents, and present data integration strategies to stakeholders. Together, as owners, let's turn meaningful insights into action. 
Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
Posted 1 month ago
3 - 8 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Ab Initio Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement software solutions to meet business requirements. Collaborate with cross-functional teams to analyze and address technical issues. Conduct code reviews and provide feedback to enhance code quality. Stay updated with industry trends and best practices in application development. Assist in troubleshooting and resolving technical issues in applications. Professional & Technical Skills: Must Have Skills: Proficiency in Ab Initio. Strong understanding of ETL processes and data integration. Experience with data warehousing concepts and methodologies. Hands-on experience in developing and optimizing ETL workflows. Knowledge of SQL and database management systems. Additional Information: The candidate should have a minimum of 3 years of experience in Ab Initio. This position is based at our Hyderabad office. A 15 years full-time education is required.
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP BusinessObjects Business Intelligence Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve collaborating with teams to create innovative solutions and contribute to key decisions. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the development and implementation of new software applications. Conduct code reviews and provide technical guidance to team members. Professional & Technical Skills: Must Have Skills: Proficiency in SAP BusinessObjects Business Intelligence. Strong understanding of data modeling and data warehousing concepts. Experience in developing and optimizing complex SQL queries. Knowledge of SAP BusinessObjects tools and technologies. Experience in integrating SAP BusinessObjects with other systems. Additional Information: The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Business Intelligence. This position is based at our Bengaluru office. A 15 years full-time education is required.
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Data Analysis & Interpretation Good to have skills : Snowflake Data Warehouse Minimum 5 year(s) of experience is required Educational Qualification : Minimum 15 years of full-time education
Key Responsibilities: 1. Familiarise themselves with the Accenture standards and policies for working in a project and client environment. 2. Work with the Project Manager and Project Lead to get their client user accounts created. 3. Lead the overall Snowflake transformation journey for the customer. 4. Design and develop the new solution in Snowflake Data Warehouse. 5. Prepare a test strategy and an implementation plan for the solution. 6. Play the role of an end-to-end Data Engineer.
Technical Experience: 1. 2 years of hands-on experience specifically in Snowflake Data Warehouse design and development projects. 2. 4 years of hands-on experience in the SQL programming language and PL/SQL. 3. 1 year of experience in JavaScript or other programming languages (Python, ReactJS, Angular). 4. Good understanding of cloud data warehouse concepts, data warehousing concepts, and dimensional modelling concepts. 5. 1 year of experience in ETL technologies (Informatica, DataStage, Talend, SAP BODS, Ab Initio, etc.).
Professional Attributes: 1. Should be fluent in English communication. 2. Should have handled direct client interactions in the past. 3. Should be clear in written communication. 4. Should have strong interpersonal skills. 5. Should be conscious of European professional etiquette.
Educational Qualification: Minimum 15 years of full-time education. Additional Info: Exposure to AWS, Amazon S3, and other Amazon cloud hosting products related to analytics or databases.
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Gurugram
Work from Office
Project Role : Technology Architect Project Role Description : Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs. Must have skills : IBM Db2 Good to have skills : Database Architecture Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education
Application Tech Support Practitioner Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge. Must have skills: IBM Db2.
Job Title: DB2 for Mainframe Database Administrator. Job Summary: We are seeking a skilled DB2 for Mainframe Database Administrator to manage and maintain our enterprise-level DB2 databases on z/OS. The ideal candidate will be responsible for ensuring the performance, availability, and security of DB2 databases while also planning and implementing database solutions for the future.
Key Responsibilities:
Database Management: Administer and maintain DB2 databases in IBM z/OS mainframe environments, including installation, configuration, upgrades, and patching.
Performance Tuning: Monitor and optimize the performance of DB2 databases, identifying bottlenecks and implementing performance tuning strategies to ensure high availability and responsiveness.
Backup & Recovery: Develop, implement, and manage backup and recovery procedures to safeguard critical data. 
Perform regular backups and ensure that recovery procedures are robust and well-documented.
Security Management: Implement and manage database security, including user access controls, encryption, and auditing, ensuring compliance with industry standards and company policies.
Problem Resolution: Troubleshoot and resolve issues related to DB2 database operations, including providing support for complex technical problems and developing root cause analyses.
Data Integrity & Availability: Ensure data integrity and availability by managing replication, data migration, and data archiving processes.
Documentation: Maintain comprehensive documentation of the database environment, including configurations, processes, and procedures.
Collaboration: Work closely with application developers, system administrators, and other stakeholders to design and implement database solutions that meet business requirements.
Disaster Recovery: Participate in the development and testing of disaster recovery plans to ensure data continuity in the event of a disaster.
Compliance & Auditing: Ensure that database systems comply with regulatory requirements and participate in regular audits. 
Qualifications:
Experience: Minimum of 5 years of experience as a DB2 Database Administrator in a mainframe environment.
Technical Proficiency: In-depth knowledge of DB2 for z/OS, SQL, SPUFI, IBM Data Studio, CA tools for DB2 (RC/Query, Migrator, Log Analyzer, etc.), BMC utilities, CA utilities, IBM utilities, File Manager for DB2, InfoSphere CDC. Strong knowledge of mainframe technologies and the z/OS operating system.
Mainframe Technologies: z/OS, JCL, REXX, CLIST, ESP scheduler.
Performance Monitoring Tools: IBM OMEGAMON, CA Thread Terminator, IBM Query Monitor, IBM Tivoli Enterprise Portal, Dynatrace (other monitoring software is owned but not implemented at the moment, e.g. Detector/Subsystem Analyzer).
Version Control Systems: ChangeMan.
Security Tools: RACF.
Operating Systems: z/OS.
DB2 System Version: V12M510 & V13M100.
Scripting Languages: REXX, JCL, CLIST.
Experience with database performance tuning and optimization. Familiarity with backup and recovery tools and processes. Understanding of database security best practices. Experience with disaster recovery planning and implementation.
Soft Skills: Strong analytical and problem-solving abilities. Excellent communication and collaboration skills. Ability to work independently and as part of a team. Attention to detail and strong organizational skills.
Preferred Qualifications: IBM Certified Database Administrator - DB2 for z/OS. Experience with data replication tools (e.g., Q Replication, InfoSphere). Knowledge of automation tools and scripting languages (e.g., REXX, JCL).
Additional Information: The candidate should have a minimum of 3 years of experience as a DB2 for Mainframe Database Administrator. This position is based at our Pune office. A 15 years full-time education is required.
Posted 1 month ago
5 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : IBM InfoSphere DataStage Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring they align with business needs and standards. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving solutions for your team and across multiple teams. With your creativity and expertise in IBM InfoSphere DataStage, you will play a crucial role in developing efficient and effective applications. Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Design, develop, and test applications using IBM InfoSphere DataStage. Collaborate with business analysts and stakeholders to gather requirements. Ensure applications meet business process and application requirements. Troubleshoot and debug applications to resolve issues. Create technical documentation for reference and reporting purposes. Professional & Technical Skills: Must Have Skills: Proficiency in IBM InfoSphere DataStage. Strong understanding of ETL concepts and data integration. Experience in designing and implementing data integration solutions. Knowledge of SQL and database concepts. Experience with data warehousing and data modeling. Good To Have Skills: Experience with IBM InfoSphere Information Server. Familiarity with other ETL tools such as Informatica or Talend. 
Additional Information: The candidate should have a minimum of 5 years of experience in IBM InfoSphere DataStage. This position is based at our Bengaluru office. A 15 years full-time education is required.
Posted 1 month ago
2 - 6 years
12 - 16 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Very good experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools. Write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance. Data Integration: Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems. Work with different data formats, including structured, semi-structured, and unstructured data. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
Who you are: A highly skilled Data Engineer specializing in Data Modeling, with experience in designing, implementing, and optimizing data structures that support the storage, retrieval, and processing of data in large-scale enterprise environments. You have expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms, and are proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Your ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively. What you'll do: As a Data Engineer – Data Modeling, you will be responsible for: Data Modeling & Architecture: Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake. Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera. Establishing data lineage and metadata management for the new data platform. Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources. Data Governance & Quality: Ensuring data integrity, consistency, and compliance with regulatory standards, including banking/regulatory guidelines. Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy. Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data. ETL & Data Pipeline Optimization: Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework. Collaborating with data engineers to define schema evolution strategies for Iceberg tables. Ensuring performance optimization for large-scale data processing on Cloudera. 
Collaboration & Documentation: Working closely with business analysts, architects, and developers to translate business requirements into scalable data models. Documenting the data dictionary, entity relationships, and mapping specifications for data migration. Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4-7 years of experience in data modeling, database design, and data engineering. Hands-on experience with ERwin Data Modeler for creating and managing data models. Strong knowledge of relational databases (PostgreSQL) and big data platforms (Cloudera, Apache Iceberg). Proficiency in SQL and NoSQL database concepts. Understanding of data governance, metadata management, and data security principles. Familiarity with ETL processes and data pipeline optimization. Strong analytical, problem-solving, and documentation skills. Preferred technical and professional experience: Experience working on Cloudera migration projects. Exposure to Denodo for data virtualization and Talend DQ for data quality management. Knowledge of Kafka, Airflow, and PySpark for data processing. Familiarity with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance. Certifications in Data Modeling, Cloudera Data Engineering, or IBM Data Solutions.
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
Who you are: A highly skilled Data Engineer specializing in Data Modeling, with experience in designing, implementing, and optimizing data structures that support the storage, retrieval, and processing of data in large-scale enterprise environments. You have expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms, and are proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Your ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively. What you'll do: As a Data Engineer – Data Modeling, you will be responsible for: Data Modeling & Architecture: Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake. Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera. Establishing data lineage and metadata management for the new data platform. Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources. Data Governance & Quality: Ensuring data integrity, consistency, and compliance with regulatory standards, including banking/regulatory guidelines. Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy. Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data. ETL & Data Pipeline Optimization: Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework. Collaborating with data engineers to define schema evolution strategies for Iceberg tables. Ensuring performance optimization for large-scale data processing on Cloudera. 
Collaboration & Documentation: Working closely with business analysts, architects, and developers to translate business requirements into scalable data models. Documenting the data dictionary, entity relationships, and mapping specifications for data migration. Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in Cloudera migration projects in the banking or financial sector. Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing. Experience with Talend DQ for data quality monitoring. Preferred technical and professional experience: Familiarity with graph databases (DGraph Enterprise) for data relationships. Experience with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance. IBM, Cloudera, or AWS/GCP certifications in Data Engineering or Data Modeling.
Posted 1 month ago
DataStage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. The demand for DataStage professionals in India has been rising due to companies' increasing reliance on data-driven decision-making across industries.
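DataStage jobs are normally built in a graphical designer rather than written as code, but the extract-transform-load pattern itself is easy to sketch in a few lines. The following is a minimal, language-agnostic illustration in Python, not DataStage-specific: the sample rows, table name, and cleansing rules are invented for the example, and an in-memory SQLite database stands in for the target warehouse.

```python
import sqlite3

# Extract: toy source rows standing in for data pulled from an operational system.
source_rows = [
    {"id": 1, "name": " Alice ", "amount": "120.50"},
    {"id": 2, "name": "Bob", "amount": "75.00"},
]

def transform(row):
    # Transform: trim whitespace from names and cast amounts to numeric.
    return (row["id"], row["name"].strip(), float(row["amount"]))

# Load: insert the cleansed rows into a warehouse table (in-memory SQLite here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [transform(r) for r in source_rows])

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 195.5
```

Real DataStage jobs add the parts this sketch omits: parallel execution, reject links for bad rows, and restartable job sequences.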
These cities are known for their vibrant tech industries and have a high demand for DataStage professionals.
The average salary range for DataStage professionals in India varies with experience: entry-level positions can expect around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
In the DataStage field, a typical career progression may look like: - Junior Developer - ETL Developer - Senior Developer - Tech Lead - Architect
As professionals gain experience and expertise in DataStage, they can move up the ladder to more senior and leadership roles.
In addition to proficiency in DataStage, employers often look for candidates with the following skills: - SQL - Data warehousing concepts - ETL tools like Informatica, Talend - Data modeling - Scripting languages like Python or Shell scripting
Having a diverse skill set can make a candidate more competitive in the job market.
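The SQL and data-warehousing items in the skills list above go hand in hand: the bread-and-butter warehouse query joins a fact table to a dimension table and aggregates. A small sketch of that pattern (the star-schema tables and figures are made up for illustration), again using SQLite so it runs anywhere:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (region_id INTEGER, amount REAL);
INSERT INTO dim_region VALUES (1, 'North'), (2, 'South');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 70.0);
""")

# A typical warehouse query: join the fact table to a dimension and aggregate.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_region AS d USING (region_id)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('North', 150.0), ('South', 70.0)]
```

Being able to explain why the schema is split into fact and dimension tables is exactly the kind of data-modeling fluency interviewers probe for.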
As you explore job opportunities in DataStage in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!