5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes.
- Experience with data integration and data warehousing.
- Knowledge of data quality and data governance principles.
- Hands-on experience with the Ab Initio GDE and EME tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Architect
Project Role Description: Provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Architect, you will provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. You will also assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the design and implementation of application solutions.
- Ensure compliance with architectural standards and guidelines.
- Identify opportunities to improve application performance and scalability.
- Mentor junior team members to enhance their skills.

Professional & Technical Skills:
- Must-have skills: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data analytics solutions.
- Experience in designing and implementing scalable data architectures.
- Knowledge of data governance and security best practices.
- Hands-on experience with data integration and ETL processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Informatica MDM, Java
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will collaborate with the team to ensure successful project delivery and contribute to key decisions. Your typical day will involve analyzing business requirements, designing application solutions, and providing recommendations for problem-solving across multiple teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design and develop application solutions based on business requirements.
- Conduct code reviews and ensure adherence to coding standards.
- Collaborate with cross-functional teams to gather requirements and provide technical guidance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Experience with data integration and data quality management.
- Strong understanding of data modeling and database concepts.
- Hands-on experience in designing and developing ETL workflows.
- Solid grasp of data governance and data management best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
2 - 7 years
4 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 2 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of projects and contribute to the overall success of the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design, develop, and test applications using Informatica MDM.
- Troubleshoot and debug application issues to ensure optimal performance.
- Implement data integration and data quality processes.
- Create and maintain technical documentation for reference and reporting purposes.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and data quality concepts.
- Experience with ETL tools and processes.
- Knowledge of relational databases and SQL.
- Experience in data modeling and data mapping.
- Good-to-have skills: Experience with Informatica PowerCenter.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
7 - 12 years
9 - 14 Lacs
Hyderabad
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems, either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must-have skills: Syniti ADM for SAP
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems, applying knowledge of technologies, methodologies, and tools to support clients or projects. You will lead a team in delivering high-quality solutions and ensuring project success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior team members.
- Drive innovation and continuous improvement within the team.

Professional & Technical Skills:
- Must-have skills: Proficiency in Syniti ADM for SAP.
- Strong understanding of data migration and data management processes.
- Experience in SAP data migration projects.
- Knowledge of SAP data governance and data quality management.
- Hands-on experience in SAP data integration and transformation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Syniti ADM for SAP.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Ahmedabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BTP Integration Suite, SAP CPI for Data Services, SAP PO/PI & APIs Development
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement SAP PO/PI & APIs for seamless data integration.
- Design and configure applications using SAP CPI for Data Services.
- Collaborate with cross-functional teams to ensure application functionality.
- Provide technical expertise and support for application development.
- Contribute to continuous improvement initiatives for application efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BTP Integration Suite, SAP PO/PI & APIs Development, and SAP CPI for Data Services.
- Strong understanding of integration concepts and best practices.
- Experience in developing and implementing data integration solutions.
- Knowledge of SAP cloud platform services and technologies.
- Hands-on experience in troubleshooting and resolving technical issues.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BTP Integration Suite.
- This position is based at our Ahmedabad office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Celonis Process Mining Platform
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are developed and implemented effectively to optimize business processes and improve efficiency.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Develop and implement software solutions using the Celonis Process Mining Platform.
- Perform unit testing and debugging to ensure the quality and functionality of the applications.

Professional & Technical Skills:
- Must-have skills: Proficiency in the Celonis Process Mining Platform.
- Strong understanding of business process analysis and optimization.
- Experience in designing and implementing applications using the Celonis Process Mining Platform.
- Knowledge of data modeling and database design principles.
- Experience with data integration and ETL processes.
- Good-to-have skills: Experience with other process mining platforms.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Celonis Process Mining Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Mumbai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must-have skills: Technical proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and best practices.
- Experience in implementing and configuring the SAP MDG Tool.
- Knowledge of data modeling and data integration concepts.
- Hands-on experience in data quality management and data cleansing techniques.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the SAP Master Data Governance (MDG) Tool.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
7 - 11 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: Any Graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SAP BusinessObjects Data Services. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SAP BusinessObjects Data Services solutions to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Provide technical support and troubleshooting for SAP BusinessObjects Data Services solutions.

Professional & Technical Skills:
- Must-have skills: Expertise in SAP BusinessObjects Data Services.
- Good-to-have skills: Experience with SAP HANA, SAP BW, and SAP ECC.
- Strong understanding of ETL processes and data integration.
- Experience with SQL and database management systems.
- Experience with data modeling and data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Hyderabad office.

Qualification: Any Graduation
Posted 1 month ago
3 - 8 years
3 - 7 Lacs
Pune, Bengaluru
Work from Office
Locations: Pune - West; Bangalore Fortune Summit | Time type: Full time | Posted: 3 Days Ago | Job requisition ID: JR-0009000

Middle Office - Analyst - Business Systems - Permanent
Location: Pune
Experience: 3 - 6 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, Capital Market knowledge, Bank Debts, Solvas

Apex Group Ltd has an immediate requirement for a Middle Office Tech Specialist. As an ETL Techno-Functional Support Specialist at Solvas, you will be the bridge between technical ETL processes and end-users, ensuring the effective functioning and support of data integration solutions. Your role involves addressing user queries, providing technical support for ETL-related issues, and collaborating with both technical and non-technical teams to ensure a seamless data integration environment. You will contribute to the development, maintenance, and enhancement of ETL processes for the Solvas application, ensuring they align with business requirements.

Work Environment:
- Highly motivated, collaborative, and results-driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Functional/Business Expertise Required:
- Serve as the primary point of contact for end-users seeking technical assistance related to Solvas applications, addressing queries related to ETL processes, data transformations, and data loads.
- Provide clear and concise explanations to non-technical users regarding ETL functionalities and troubleshoot issues.
- Integrate client trade files into the Conversant system; design, develop, implement, and test technical solutions based on client and business requirements.
- Diagnose and troubleshoot ETL-related issues reported by end-users or identified through monitoring systems.
- Work closely with business analysts and end-users to understand and document ETL requirements.
- Monitor ETL jobs and processes to ensure optimal performance and identify potential issues.
- Create user documentation and guides to facilitate self-service issue resolution.
- Hands-on experience working with any ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Good understanding of Solvas or any other loan operation system.
- Good knowledge of Solvas bank-debt workflows is mandatory.
- Intermediate knowledge of financial instruments, both listed and unlisted/OTC, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Understanding of the loan operation industry is necessary.
- Knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Proficiency in any loan operation system, preferably Solvas.
- Ability to work under pressure with changing priorities.
- Strong analytical and problem-solving skills.

Experience and Knowledge:
- 3+ years of related experience in support or technical roles with a loan operation system and accounting system (Solvas/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong knowledge of Excel and Excel functions for business support.
- Create and maintain business documentation, including user manuals and guides.
- Experience with system upgrades, migrations, and integrations.

Other Skills:
- Good team player; able to work on a local, regional, and global basis.
- Good communication and management skills.
- Good understanding of Financial Services/Capital Markets/Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position.
Apex operates a direct sourcing model, and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
Posted 1 month ago
5 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring they align with business needs and standards. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving solutions for your team and across multiple teams. With your creativity and expertise in IBM InfoSphere DataStage, you will play a crucial role in developing efficient and effective applications.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design, develop, and test applications using IBM InfoSphere DataStage.
- Collaborate with business analysts and stakeholders to gather requirements.
- Ensure applications meet business process and application requirements.
- Troubleshoot and debug applications to resolve issues.
- Create technical documentation for reference and reporting purposes.

Professional & Technical Skills:
- Must-have skills: Proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data integration.
- Experience in designing and implementing data integration solutions.
- Knowledge of SQL and database concepts.
- Experience with data warehousing and data modeling.
- Good-to-have skills: Experience with IBM InfoSphere Information Server; familiarity with other ETL tools such as Informatica or Talend.

Additional Information:
- The candidate should have a minimum of 5 years of experience in IBM InfoSphere DataStage.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
7 - 12 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are aligned with the needs of the organization and contribute to its overall success. Your typical day will involve collaborating with the team, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. You will also engage with various stakeholders and contribute to important decisions that impact the project's success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Ensure effective communication and coordination within the team.
- Identify and address any issues or challenges that arise during the project.
- Stay updated with the latest industry trends and technologies.

Professional & Technical Skills:
- Must-have skills: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data management principles and best practices.
- Experience in designing and implementing data governance solutions.
- Knowledge of SAP MDG configuration and customization.
- Hands-on experience in data modeling and data integration.
- Familiarity with SAP ERP systems and their integration with MDG.
- Good-to-have skills: Experience with SAP S/4HANA and Fiori apps; experience in data migration and data quality management.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the SAP Master Data Governance (MDG) Tool.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: B.Tech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business needs.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement security and data protection measures.
- Document technical specifications and user guides.
- Stay up-to-date with emerging technologies and industry trends.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of SQL and database concepts.
- Experience with data modeling and schema design.
- Knowledge of ETL processes and data integration techniques.
- Familiarity with cloud platforms such as Google Cloud Platform.
- Good-to-have skills: Experience with data visualization tools such as Tableau or Power BI.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- A B.Tech degree is required.

Qualifications: B.Tech
Posted 1 month ago
5 - 10 years
15 - 20 Lacs
Bengaluru
Work from Office
Locations: India, Bangalore | Time type: Full time | Posted: 30+ Days Ago | Job requisition ID: JR0034276

Job Title: Big Data Architect

About Skyhigh Security:
Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo, and more, our employees are the heart and soul of our company. Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our "Blast Talks" learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self.

Role Overview:
The Big Data Architect will be responsible for the design, implementation, and management of the organization's big data infrastructure. The ideal candidate will have a strong technical background in big data technologies, excellent problem-solving skills, and the ability to work in a fast-paced environment. The role requires a deep understanding of data architecture, data modeling, and data integration techniques.

About the Role:
- Design and implement scalable and efficient big data architecture solutions to meet business requirements.
- Develop and maintain data pipelines, ensuring the availability and quality of data.
- Collaborate with data scientists, data engineers, and other stakeholders to understand data needs and provide technical solutions.
- Lead the evaluation and selection of big data tools and technologies.
- Ensure data security and privacy compliance.
- Optimize and tune big data systems for performance and cost-efficiency.
- Document data architecture, data flows, and processes.
- Stay up-to-date with the latest industry trends and best practices in big data technologies.

About You:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 10+ years of overall experience, with 5+ years of experience in big data architecture and engineering.
- Proficiency in big data technologies such as Hadoop MapReduce, Spark (batch and streaming), Kafka, HBase, Scala, Elasticsearch, and others.
- Experience with the AWS cloud platform.
- Strong knowledge of data modeling, ETL processes, and data warehousing.
- Proficiency in programming languages and frameworks such as Java, Scala, and Spark.
- Familiarity with data visualization tools and techniques.
- Excellent communication and collaboration skills.
- Strong problem-solving abilities and attention to detail.

Company Benefits and Perks:
We work hard to embrace diversity and inclusion and encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees.
- Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement

We're serious about our commitment to diversity, which is why we prohibit discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.
Posted 1 month ago
8 - 13 years
4 - 8 Lacs
Gurugram
Work from Office
remote typeOn-site locationsGurugram, HR time typeFull time posted onPosted 5 Days Ago job requisition idREQ401285 Senior ETL Developer What this job involves Are you comfortable working independently without close supervision? We offer an exciting role where you can enhance your skills and play a crucial part in delivering consistent, high-quality administrative and support tasks for the EPM team The Senior ETL Developer/SSIS Administrator will lead the design of logical data models for JLL's EPM Landscape system. This role is responsible for implementing physical database structures and constructs, as well as developing operational data stores and data marts. The role entails developing and fine-tuning SQL procedures to enhance system performance. The individual will support functional tasks of medium-to-high technological complexity and build SSIS packages and transformations to meet business needs. This position contributes to maximizing the value of SSIS within the organization and collaborates with cross-functional teams to align data integration solutions with business objectives. Responsibilities The Senior ETL Developer will be responsible for: Gathering requirements and processing information to design data transformations that will effectively meet end-user needs. Designing, developing, and testing ETL processes for large-scale data extraction, transformation, and loading from source systems to the Data Warehouse and Data Marts. Creating SSIS packages to clean, prepare, and load data into the data warehouse and transfer data to EPM, ensuring data integrity and consistency throughout the ETL process. Monitoring and optimizing ETL performance and data quality. Creating routines for importing data using CSV files. Mapping disparate data sources - relational DBs, text files, Excel files - onto the target schema. Scheduling the packages to extract data at specific time intervals. 
Planning, coordinating, and supporting ETL processes, including architecting table structure, building ETL processes, documentation, and long-term preparedness. Extracting complex data from multiple data sources into usable and meaningful reports and analyses by implementing PL/SQL queries. Ensuring that the data architecture is scalable and maintainable. Troubleshooting data integration and data quality issues and bugs, analyzing reasons for failure, implementing optimal solutions, and revising procedures and documentation as needed. Utilizing hands-on SQL features: Stored Procedures, Indexes, Partitioning, Bulk loads, DB configuration, Security/Roles, and Maintenance. Developing queries and procedures, creating custom reports/views, and assisting in debugging. The developer will also be responsible for designing SSIS packages and ensuring their stability, reliability, and performance. Sounds like you? To apply, you need to have: 8+ years of experience in Microsoft SQL Server Management Studio administration and development. Bachelor's degree or equivalent. Competency in Microsoft Office and Smart View. Experience with Microsoft SQL databases and SSIS/SSAS development. Experience working with Microsoft SSIS to create and deploy packages for ETL processes. Experience in writing and troubleshooting SQL statements, creating stored procedures, views, and SQL functions. Experience with data analytics and development. Strong SQL coding experience, including performance optimization for data queries. Experience creating and supporting SSAS Cubes. 
Knowledge of Microsoft PowerShell and Batch scripting. Good to have: Power BI development experience. Strong critical and analytical thinking and problem-solving skills. Ability to multi-task and thrive in a fast-paced, rapidly changing, and complex environment. Good written and verbal communication skills. Ability to learn new skills quickly to make a measurable difference. Strong team player - proven success in contributing to a team-oriented environment. Excellent communication (written and oral) and interpersonal skills. Excellent troubleshooting and problem resolution skills. What we can do for you: At JLL, we make sure that you become the best version of yourself by helping you realize your full potential in an entrepreneurial and inclusive work environment. We will empower your ambitions through our dedicated Total Rewards Program, competitive pay, and benefits package. Apply today! Location: On-site, Gurugram, HR. Scheduled Weekly Hours: 40. Job Tags: . Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at . This email is only to request an accommodation. Please direct any other general recruiting inquiries to our page > I want to work for JLL.
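The CSV-import and bulk-load duties described in this posting can be roughly illustrated with Python's built-in sqlite3 standing in for SQL Server (table names, column names, and the sample data are invented for the example; a real SSIS package would do the equivalent with data-flow tasks):

```python
import csv
import io
import sqlite3

# Hypothetical CSV extract from a source system (in practice, a file on disk)
raw = "id,amount\n1, 100 \n2,250\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER PRIMARY KEY, amount INTEGER)")

# Clean/prepare step: trim whitespace and cast types before loading
rows = [(int(r["id"]), int(r["amount"].strip()))
        for r in csv.DictReader(io.StringIO(raw))]

conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)   # bulk load
conn.execute("CREATE INDEX idx_amount ON staging (amount)")   # index for query performance

print(conn.execute("SELECT SUM(amount) FROM staging").fetchone()[0])
# → 350
```

The same clean-then-bulk-insert-then-index ordering is the usual pattern for ETL loads, since building the index after the load is cheaper than maintaining it row by row.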
Posted 1 month ago
8 - 13 years
13 - 18 Lacs
Pune
Work from Office
Position Summary We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency. Job Responsibilities Technology Leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments. Solution Architecture & Review: Expertise in conceptualizing solution architecture and low-level design in a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies. Managing projects in a fast-paced agile ecosystem and ensuring quality deliverables within stringent timelines. Responsible for risk management, maintaining the risk documentation and mitigation plans. Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments. Communication & Logical Thinking: Demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders. Handle Client Relationship: Manage client relationships and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills. 
Education BE/B.Tech, Master of Computer Application. Work Experience Should have expertise and 8+ years of working experience in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend. Should have expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle. Should have strong Data Warehousing, Data Integration, and Data Modeling fundamentals such as Star Schema, Snowflake Schema, Dimension Tables, and Fact Tables. Strong experience with SQL building blocks, creating complex SQL queries and procedures. Experience in AWS or Azure cloud and its service offerings. Aware of techniques such as data modelling, performance tuning, and regression testing. Willingness to learn and take ownership of tasks. Excellent written/verbal communication and problem-solving skills. Understanding and working experience with pharma commercial data sets like IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage. Hands-on in scrum methodology (sprint planning, execution, and retrospection). Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management. Technical Competencies: Problem Solving, Life Science Knowledge, Communication, Agile, PySpark, Data Modelling, Matillion, Designing technical architecture, AWS Data Pipeline.
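The star-schema fundamentals this posting asks for (fact tables keyed to dimension tables) can be sketched with Python's built-in sqlite3; the table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes, one row per product
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: numeric measures, foreign-keyed to the dimension
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER)")

conn.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "widget"), (2, "gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 5), (1, 3), (2, 7)])

# Typical star-schema query: aggregate facts grouped by a dimension attribute
rows = conn.execute("""
    SELECT d.name, SUM(f.qty)
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
print(rows)  # → [('gadget', 7), ('widget', 8)]
```

The same join-then-aggregate shape is what BI tools generate against warehouses like Redshift or Snowflake; the star layout keeps those queries to one join per dimension.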
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Mumbai
Work from Office
Who you are: A seasoned Data Engineer with a passion for building and managing data pipelines in large-scale environments, with good experience working with big data technologies, data integration frameworks, and cloud-based data platforms, and a strong foundation in Apache Spark, PySpark, Kafka, and SQL. What you'll do: As a Data Engineer – Data Platform Services, your responsibilities include: Data Ingestion & Processing Assisting in building and optimizing data pipelines for structured and unstructured data. Working with Kafka and Apache Spark to manage real-time and batch data ingestion. Supporting data integration using IBM CDC and Universal Data Mover (UDM). Big Data & Data Lakehouse Management Managing and processing large datasets using PySpark and Iceberg tables. Assisting in migrating data workloads from IIAS to Cloudera Data Lake. Supporting data lineage tracking and metadata management for compliance. Optimization & Performance Tuning Helping to optimize PySpark jobs for efficiency and scalability. Supporting data partitioning, indexing, and caching strategies. Monitoring and troubleshooting pipeline issues and performance bottlenecks. Security & Compliance Implementing role-based access controls (RBAC) and encryption policies. Supporting data security and compliance efforts using Thales CipherTrust. Ensuring data governance best practices are followed. Collaboration & Automation Working with Data Scientists, Analysts, and DevOps teams to enable seamless data access. Assisting in automation of data workflows using Apache Airflow. Supporting Denodo-based data virtualization for efficient data access. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 4-7 years of experience in big data engineering, data integration, and distributed computing. Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP). Proficiency in Python or Scala for data processing. 
Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM). Understanding of data security, encryption, and compliance frameworks. Preferred technical and professional experience Experience in banking or financial services data platforms. Exposure to Denodo for data virtualization and DGraph for graph-based insights. Familiarity with cloud data platforms (AWS, Azure, GCP). Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
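The data-partitioning strategy mentioned in the responsibilities above can be illustrated with a stable hash partitioner in plain Python. This is a simplified sketch of what engines like Spark do internally when distributing records across workers (function names are my own):

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Stable hash: the same key always maps to the same partition, which keeps
    # per-key aggregations and joins local to a single worker
    return zlib.crc32(key.encode()) % num_partitions

def partition_records(records, num_partitions):
    # Distribute (key, value) records into num_partitions buckets
    parts = [[] for _ in range(num_partitions)]
    for key, value in records:
        parts[partition_for(key, num_partitions)].append((key, value))
    return parts

records = [("acct-1", 10), ("acct-2", 5), ("acct-1", 7)]
parts = partition_records(records, 4)
# Both "acct-1" records land in the same partition
```

Note the use of `zlib.crc32` rather than Python's built-in `hash()`, since the latter is salted per process and would scatter the same key across runs.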
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Your primary responsibilities include: Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools. Liaise with the business team and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT. 
Work with data scientists and the business analytics team to assist in data ingestion and data-related technical issues. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter. Knowledge of cloud, Power BI, and data migration on cloud. Experience in Unix shell scripting and Python. Experience with relational SQL, Big Data, etc. Preferred technical and professional experience Knowledge of MS Azure Cloud. Experience in Informatica PowerCenter. Experience in Unix shell scripting and Python.
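The "identify data quality issues" duty above usually boils down to checks like null detection and duplicate-key detection before loading. A minimal sketch in plain Python, with invented field names and sample rows:

```python
def quality_report(rows, key_field, required_fields):
    # Flag row indexes with missing required values or duplicated business keys
    missing, seen, duplicates = [], set(), []
    for i, row in enumerate(rows):
        if any(row.get(f) in (None, "") for f in required_fields):
            missing.append(i)
        key = row.get(key_field)
        if key in seen:
            duplicates.append(i)
        seen.add(key)
    return {"missing": missing, "duplicates": duplicates}

rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": ""},      # missing required value
    {"id": 1, "name": "Anna"},  # duplicate business key
]
print(quality_report(rows, "id", ["id", "name"]))
# → {'missing': [1], 'duplicates': [2]}
```

In an Informatica PowerCenter mapping the equivalent logic would live in expression and aggregator transformations that route failing rows to a reject table.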
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Mumbai
Work from Office
Who you are: A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions. What you'll do: As a Data Engineer – Data Platform Services, responsibilities include: Data Ingestion & Processing Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake. Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark). Working with IBM CDC and Universal Data Mover to manage data replication and movement. Big Data & Data Lakehouse Management Implementing Apache Iceberg tables for efficient data storage and retrieval. Managing distributed data processing with Cloudera Data Platform (CDP). Ensuring data lineage, cataloging, and governance for compliance with Bank/regulatory policies. Optimization & Performance Tuning Optimizing Spark and PySpark jobs for performance and scalability. Implementing data partitioning, indexing, and caching to enhance query performance. Monitoring and troubleshooting pipeline failures and performance bottlenecks. Security & Compliance Ensuring secure data access, encryption, and masking using Thales CipherTrust. Implementing role-based access controls (RBAC) and data governance policies. Supporting metadata management and data quality initiatives. Collaboration & Automation Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions. Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus. Supporting Denodo-based data virtualization for seamless data access. 
Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 4-7 years of experience in big data engineering, data integration, and distributed computing. Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP). Proficiency in Python or Scala for data processing. Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM). Understanding of data security, encryption, and compliance frameworks. Preferred technical and professional experience Experience in banking or financial services data platforms. Exposure to Denodo for data virtualization and DGraph for graph-based insights. Familiarity with cloud data platforms (AWS, Azure, GCP). Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
Posted 1 month ago
2 - 6 years
12 - 16 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, and provide regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Very good experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools. Write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance. Data Integration: Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems. Work with different data formats, including structured, semi-structured, and unstructured data. Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Chennai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio. Data Modeling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes. Analyze and model data to ensure optimal ETL design and performance. Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components. Preferred technical and professional experience Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality. Documentation
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Mumbai
Work from Office
Who you are: A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions. What you'll do: As a Data Engineer – Data Platform Services, responsibilities include: Data Ingestion & Processing Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake. Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark). Working with IBM CDC and Universal Data Mover to manage data replication and movement. Big Data & Data Lakehouse Management Implementing Apache Iceberg tables for efficient data storage and retrieval. Managing distributed data processing with Cloudera Data Platform (CDP). Ensuring data lineage, cataloging, and governance for compliance with Bank/regulatory policies. Optimization & Performance Tuning Optimizing Spark and PySpark jobs for performance and scalability. Implementing data partitioning, indexing, and caching to enhance query performance. Monitoring and troubleshooting pipeline failures and performance bottlenecks. Security & Compliance Ensuring secure data access, encryption, and masking using Thales CipherTrust. Implementing role-based access controls (RBAC) and data governance policies. Supporting metadata management and data quality initiatives. Collaboration & Automation Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions. Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus. Supporting Denodo-based data virtualization for seamless data access. 
Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 4-7 years of experience in big data engineering, data integration, and distributed computing. Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP). Proficiency in Python or Scala for data processing. Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM). Understanding of data security, encryption, and compliance frameworks. Preferred technical and professional experience Experience in banking or financial services data platforms. Exposure to Denodo for data virtualization and DGraph for graph-based insights. Familiarity with cloud data platforms (AWS, Azure, GCP). Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
Posted 1 month ago
5 - 10 years
6 - 10 Lacs
Hyderabad
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact. Responsibilities: Responsible for managing end-to-end feature development and resolving challenges faced in implementing the same. Learn new technologies and implement them in feature development within the time frame provided. Manage debugging, performing root cause analysis, and fixing the issues reported on the Content Management back-end software system. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Strong SAS ETL programming skills with 5+ years of relevant SAS ETL programming experience in a banking project. Strong knowledge in areas of SAS development. Preferred technical and professional experience Strong knowledge in areas of SAS development and the banking domain.
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Ab Initio Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality. Roles & Responsibilities: Expected to be an SME Collaborate and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute on key decisions Provide solutions to problems for their immediate team and across multiple teams Lead the team in implementing efficient coding practices Conduct code reviews and provide constructive feedback Stay updated on industry trends and best practices Professional & Technical Skills: Must To Have Skills: Proficiency in Ab Initio Strong understanding of ETL processes Experience with data integration and data warehousing Knowledge of data quality and data governance principles Hands-on experience in developing and maintaining applications using Ab Initio Additional Information: The candidate should have a minimum of 5 years of experience in Ab Initio This position is based at our Pune office A 15 years full-time education is required Qualification 15 years full time education
Posted 1 month ago
3 - 8 years
9 - 13 Lacs
Chennai
Work from Office
Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute to providing solutions for work-related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects. Ensure cohesive integration between systems and data models. Implement data platform components. Troubleshoot and resolve data platform issues. Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of data platform blueprint and design. Experience with data integration and data modeling. Hands-on experience with data platform components. Knowledge of data platform security and governance. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Chennai office. A 15 years full time education is required. 
Qualification 15 years full time education
Posted 1 month ago