3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions to meet business needs.
- Conduct code reviews and ensure compliance with coding standards.
- Troubleshoot and debug applications to enhance performance and functionality.
- Stay updated on industry trends and best practices to continuously improve technical skills.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and maintaining data pipelines.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
About the Role
Project Role: Integration Engineer
Project Role Description: Provide consultative business and system integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must-have skills: Salesforce Marketing Cloud
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Integration Engineer, you will provide consultative business and system integration services to assist clients in implementing effective solutions. Your typical day will involve engaging with clients to understand their needs, facilitating discussions on transformation and the customer journey, and ensuring that the technology and business solutions align with their requirements. You will work collaboratively with various stakeholders to translate customer needs into actionable business and technology strategies, driving the successful implementation of solutions that enhance client satisfaction and operational efficiency.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate workshops and meetings to gather requirements and present solutions.
- Develop and maintain documentation related to integration processes and solutions.

Professional & Technical Skills:
- Must-have skills: Proficiency in Salesforce Marketing Cloud.
- Strong understanding of integration methodologies and best practices.
- Experience with API management and data integration tools.
- Ability to analyze and troubleshoot integration issues effectively.
- Familiarity with customer relationship management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Salesforce Marketing Cloud.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About the Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure successful project delivery.
- Conduct code reviews and provide technical guidance to team members.
- Troubleshoot and resolve application issues in a timely manner.
- Stay updated on industry trends and best practices to enhance application development processes.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of ETL processes and data integration.
- Experience with data modeling and database design.
- Knowledge of SAP BusinessObjects reporting tools.
- Hands-on experience in developing and optimizing data workflows.
- Good-to-have skills: Experience with SAP HANA.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in problem-solving discussions and contribute to the overall improvement of application performance and user experience.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Good-to-have skills: Experience with ETL processes and data integration.
- Strong understanding of application development methodologies.
- Familiarity with database management systems and SQL.
- Experience in troubleshooting and debugging applications.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About the Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices within the organization.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality assessment and improvement methodologies.
- Familiarity with data governance principles and best practices.
- Ability to work with large datasets and perform data cleansing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
10.0 - 14.0 years
12 - 16 Lacs
Gurugram
Work from Office
Your primary focus is providing comprehensive support for our global Microsoft Power BI platform. You will be the initial contact for all topics related to Microsoft Power BI. You have strong Microsoft Power BI skills, ideally including DAX proficiency, data integration (data source connections, data queries), and SQL knowledge.
Posted 1 month ago
4.0 - 6.0 years
12 - 18 Lacs
Chennai, Bengaluru
Work from Office
Key Skills: Python, SQL, PySpark, Databricks, AWS, Data Pipeline, Data Integration, Airflow, Delta Lake, Redshift, S3, Data Security, Cloud Platforms, Life Sciences

Roles & Responsibilities:
- Develop and maintain robust, scalable data pipelines for ingesting, transforming, and optimizing large datasets from diverse sources.
- Integrate multi-source data into performant, query-optimized formats such as Delta Lake, Redshift, and S3.
- Tune data processing jobs and storage layers to ensure cost efficiency and high throughput.
- Automate data workflows using orchestration tools such as Airflow and the Databricks APIs for ingestion, transformation, and reporting.
- Implement data validation and quality checks to ensure reliable and accurate data.
- Manage and optimize AWS and Databricks infrastructure to support scalable data operations.
- Lead cloud platform migrations and upgrades, transitioning legacy systems to modern, cloud-native solutions.
- Enforce security best practices (e.g., IAM and data encryption) and ensure compliance with regulatory standards.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions.

Experience Requirement:
- 4-6 years of hands-on experience in data engineering with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong background in designing and building data pipelines and optimizing data storage and processing.
- Proficiency in using cloud services such as AWS (S3, Redshift, Lambda) for building scalable data solutions.
- Hands-on experience with containerized environments and orchestration tools such as Airflow for automating data workflows.
- Expertise in data migration strategies and transitioning legacy data systems to modern cloud platforms.
- Experience with performance tuning, cost optimization, and lifecycle management of cloud data solutions.
- Familiarity with regulatory compliance (GDPR, HIPAA) and security practices (IAM, encryption).
- Experience in the Life Sciences or Pharma domain is highly preferred, with an understanding of industry-specific data requirements.
- Strong problem-solving abilities with a focus on delivering high-quality data solutions that meet business needs.

Education: Any graduation.
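The "data validation and quality checks" responsibility described above can be sketched in plain Python. This is an illustrative, framework-agnostic example; the function, field names, and rules are hypothetical, not taken from the posting:

```python
# Minimal data-quality gate: split incoming rows into valid and rejected
# sets before loading them downstream. Field names and rules are invented
# for illustration only.

def validate_rows(rows, required_fields=("id", "timestamp", "value")):
    """Return (valid, rejected); a row is rejected if any required field is missing or empty."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "errors": missing})
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate_rows([
    {"id": 1, "timestamp": "2024-01-01", "value": 3.5},
    {"id": 2, "timestamp": "", "value": 1.2},  # missing timestamp -> rejected
])
```

In a real pipeline the same split would typically be expressed as a PySpark filter, with rejected rows routed to a quarantine table for review.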
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly skilled Ab Initio Developer with over 6 years of total experience and at least 4 years of relevant experience. Your primary responsibilities will include leading the design, development, and optimization of ETL processes using Ab Initio's Graphical Development Environment (GDE). It is essential to ensure data accuracy, consistency, and availability throughout the data integration workflows.

You will be tasked with building, maintaining, and optimizing data integration workflows to facilitate seamless data flow across various systems and platforms. Your expertise in designing intricate data transformations, data cleansing, and data enrichment logic within Ab Initio graphs will be critical. Utilizing Ab Initio's metadata capabilities for documenting data lineage, transformations, and data definitions is essential to ensure transparency and compliance.

Monitoring and optimizing Ab Initio ETL processes for efficiency, scalability, and performance will be part of your routine, and you must address and resolve any bottlenecks that arise. Developing robust error handling and logging mechanisms to track and manage ETL job failures and exceptions is crucial to maintaining the integrity of data processes.

Collaboration with cross-functional teams, including data engineers, data analysts, data scientists, and business stakeholders, is necessary; understanding requirements and ensuring successful delivery of data integration projects will be a key aspect of your role. You will use version control systems such as Git to manage Ab Initio code and collaborate effectively with team members, and create and maintain comprehensive documentation of Ab Initio graphs, data integration processes, best practices, and standards for the team. You will also be responsible for investigating and resolving complex ETL-related issues, providing support to team members and users, and conducting root cause analysis when problems arise.

Overall, as an Ab Initio Developer, you will be a vital part of the data engineering team, contributing to the design, development, and maintenance of data integration and ETL solutions using Ab Initio's suite of tools.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Telangana
On-site
We are seeking a leader who can bring a product mindset to take ownership of, manage, scale, and curate customer data generated in third-party applications. The primary goal is to make this data useful, relevant, and accessible, not only to our many analyst teams but also for integration with other applications. It is crucial to unlock the power within our application data by understanding how analysts and other systems will consume it. Fortunately, we have a robust framework in place to support the existing data model and can adapt to updated models as our business evolves. The individual in this role will have a significant influence on how we represent customer data in multiple systems.

As a candidate for this role, you should be empathetic to the analytic needs of business stakeholders, product owners, engineers, and analysts. You must possess the ability to translate data requirements into practical solutions that cater to those needs. Adherence to established protocols while remaining open to change is essential. You should be willing to enhance existing systems, provide recommendations, and challenge assumptions when necessary. A deep passion for accuracy and validation to ensure the correctness of transformations is crucial. The ideal candidate is action- and results-oriented, capable of leading work independently and collaboratively across teams.

Your daily responsibilities will include communicating with business teams, product owners, analysts, engineers, and other data consumers to understand data needs. You will prioritize new data and integration requests and develop processes to meet demand effectively. Taking ownership of solution integrations and advocating for them will be a key part of your role, as will engaging with users and anticipating their needs to ensure shared success and deeper insights.

Moreover, you will be expected to practice good data hygiene, implement data management controls, and adhere to implementation standards. Developing operational playbooks/runbooks to support integrations and enhance consistency, scalability, and resilience will also be part of your responsibilities. You will need to establish processes that validate, automate, and scale transformations in the reporting layer. Providing regular updates to technical and non-technical stakeholders, and managing and publishing your product roadmap for stakeholders, are also crucial aspects of the role.

To excel in this role, you should possess strong communication skills, both written and verbal, and demonstrate an ability to listen to and understand the root needs across various business units. A deep desire to ensure the success of your partners while they use your solutions is essential. You must have a demonstrated ability to document schemas, data models, data dictionaries, and the like. Well-established personal processes for managing multiple requests from different stakeholders, maintaining clear priorities, adjusting when necessary, and always being helpful and pleasant are important traits. Experience with relational and columnar database stores, data modeling and data architecture knowledge, advanced SQL skills, and at least 4 years of experience working with common SaaS web applications and similar technologies are required. A Bachelor's degree or equivalent qualification is preferred.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a highly motivated and experienced Data Engineer, you will be responsible for designing, developing, and implementing solutions that enable seamless data integration across multiple cloud platforms. Your expertise in data lake architecture, Iceberg tables, and cloud compute engines such as Snowflake, BigQuery, and Athena will ensure efficient and reliable data access for various downstream applications.

Your key responsibilities will include collaborating with stakeholders to understand data needs and define schemas, and designing and implementing data pipelines for ingesting, transforming, and storing data. You will develop data transformation logic to make Iceberg tables compatible with the data access requirements of Snowflake, BigQuery, and Athena, and design and implement solutions for seamless data transfer and synchronization across different cloud platforms. Ensuring data consistency and quality across the data lake and target cloud environments will be crucial in your role. Additionally, you will analyze data patterns and identify performance bottlenecks in data pipelines, implement data optimization techniques to improve query performance and reduce data storage costs, and monitor data lake health to proactively address potential issues. Collaboration and communication with architects, leads, and other stakeholders to ensure data quality meets specific requirements will also be an essential part of your role.

To be successful in this position, you should have a minimum of 4 years of experience as a Data Engineer, strong hands-on experience with data lake architectures and technologies, proficiency in SQL and scripting languages, and experience with data governance and security best practices. Excellent problem-solving and analytical skills, strong communication and collaboration skills, and familiarity with cloud-native data tools and services are also required. Additionally, certifications in relevant cloud technologies will be beneficial.

In return, GlobalLogic offers exciting projects in industries such as high tech, communication, media, healthcare, retail, and telecom. You will have the opportunity to collaborate with a diverse team of highly talented individuals in an open, laid-back environment. Work-life balance is prioritized with flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional development opportunities include communication skills training, stress management programs, professional certifications, and technical and soft-skill training. GlobalLogic provides competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks such as sports events, cultural activities, food at subsidized rates, corporate parties, dedicated GL Zones, rooftop decks, and discounts at popular stores and restaurants are also part of the vibrant office culture at GlobalLogic.

About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients accelerate their transition into tomorrow's digital businesses. Operating under Hitachi, Ltd., GlobalLogic contributes to driving innovation through data and technology for a sustainable society with a higher quality of life.
Posted 1 month ago
3.0 - 12.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a Cloud DB Engineer, you will be responsible for designing and developing data pipelines to collect, transform, and store data from various sources in order to support analytics and business intelligence. Your role will involve integrating data from multiple sources, including databases, APIs, and third-party tools, to ensure consistency and accuracy across all data systems.

You will also be tasked with designing, implementing, and optimizing both relational and non-relational databases to facilitate efficient storage, retrieval, and processing of data. Data modeling will be a key aspect of your responsibilities: you will develop and maintain data models that represent data relationships and flows, ensuring data is structured and accessible for analysis. In addition, you will design and implement extract, transform, load (ETL) processes to clean, enrich, and load data into data warehouses or lakes. Monitoring and optimizing the performance of data systems, including database query performance, data pipeline efficiency, and storage utilization, will be crucial to your role.

Collaboration is essential, as you will work closely with data scientists, analysts, and other stakeholders to understand data needs and ensure that the data infrastructure aligns with business objectives. Implementing data quality checks and governance processes to maintain data accuracy, completeness, and compliance with relevant regulations will also be part of your responsibilities. Furthermore, creating and maintaining comprehensive documentation for data pipelines, models, and systems will be necessary to ensure transparency and efficiency in data management processes.
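As a rough sketch of the extract-transform-load process this listing describes, here is a self-contained toy example using Python's built-in sqlite3 module as a stand-in for a warehouse. All table and column names are invented for illustration; a real pipeline would target an actual warehouse and carry far richer cleansing rules:

```python
import sqlite3

# Toy ETL pass: extract raw rows, transform (drop rows failing a basic
# quality check), and load the survivors into a target table. SQLite
# stands in for a real warehouse; every name here is hypothetical.
def run_etl(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS clean_orders (id INTEGER, amount REAL)")
    raw = conn.execute("SELECT id, amount FROM raw_orders").fetchall()
    # Transform: reject NULL or negative amounts (a simple data-quality rule)
    cleaned = [(i, a) for i, a in raw if a is not None and a >= 0]
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.0), (2, -5.0), (3, None)])
loaded = run_etl(conn)  # only the first row passes the quality check
```

The same extract/transform/load split generalizes directly to the warehouse and lake targets named in the listing.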
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be responsible for designing, developing, and maintaining applications on the AS400 platform. Your primary tasks will include writing and enhancing RPG programs to meet business requirements, utilizing SQL for database management and data manipulation, and creating CL programs for batch job processing. Collaboration with cross-functional teams to gather requirements and deliver solutions will be a key aspect of your role. Additionally, you will conduct system analysis, troubleshoot issues in existing applications, and implement data integration solutions between AS400 and other systems.

Ensuring quality code through thorough testing and debugging, documenting technical specifications and user guides, and providing constructive feedback during code reviews are essential parts of the job. You will also support production releases, stay updated with AS400 technologies and best practices, and assist in project planning and estimation activities. Mentoring junior developers on AS400 technologies, collaborating with IT support teams on system maintenance and upgrades, and adhering to quality standards in development will also be expected of you.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 3 years of experience in AS400 development. Proficiency in the RPG programming language, experience with SQL databases and querying, knowledge of CL programming and batch job processing, and familiarity with data integration tools and methods are essential qualifications. Strong problem-solving skills, attention to detail, the ability to work collaboratively in a team environment, strong analytical and organizational skills, and effective communication skills (both verbal and written) are crucial for success in this position.

Experience in Agile development methodologies, knowledge of additional programming languages, adaptability to new technologies, and a commitment to documenting technical processes are considered advantageous.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Principal Stats Programmer at Sanofi, you will be responsible for providing high-quality statistical deliverables on projects within various therapeutic areas for safety analytics purposes. Your role will involve working on multiple clinical compounds and offering expertise in safety data integration for integrated data visualization solutions.

Within the Safety Center of Excellence, you will play a key role in developing statistical deliverables for safety analytics on clinical projects. You will plan and execute statistical programming deliverables within studies, acting as the Programming Study Lead (SP) with limited direction from the programming project leader. Your responsibilities will include performing programming activities for all statistical deliverables within a study/project, ensuring quality control, reviewing study documents, and providing programming specifications for the Study Data Tabulation Model (SDTM) and Analysis Data Model (ADaM) for a study.

To excel in this role, you should have experience in programming, preferably in processing clinical trial data in the pharmaceutical industry. You should possess advanced technical skills in statistical programming, with knowledge of R (and SAS) and R Shiny, along with a thorough understanding of relational databases and CDISC data structure requirements. Strong communication, coordination, and problem-solving skills are essential, as is the ability to work in a team environment and support multiple assignments with challenging timelines. Additionally, you should have a Bachelor's or Master of Science degree, or equivalent, in Statistics, Computer Science, Mathematics, Engineering, Life Science, or a related field. Fluency in English, both written and verbal, is required for effective communication in a global environment.

By joining Sanofi's Safety Center of Excellence as a Principal Statistical Programmer, you will have the opportunity to contribute to global health improvements, drive innovation in safety analytics, and advance your career in a growth-oriented and supportive environment. Embrace the challenge, innovate, and be part of a team dedicated to transforming the business and changing millions of lives. Join us at Sanofi and be part of a team that brings the miracles of science to life, offering endless opportunities for career growth and development, a comprehensive rewards package, and a supportive work environment that values your contributions and amplifies your impact. Experience the satisfaction of working in an international biopharma company alongside a diverse team focused on safety analytics, data integration, and data visualization across multiple clinical therapeutic areas.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior Software Engineer at Clarivate, you will be an integral part of our Technology team, focusing on data engineering, ETL, and script writing to create efficient data pipelines for data cleansing and structuring processes.

To excel in this role, you should hold a Bachelor's degree in computer science or possess equivalent experience. A minimum of 3 years of programming experience with a strong grasp of SQL is required. Experience with ETL processes, APIs, data integration, and system analysis and design is highly valuable. You should be adept at implementing data validation and cleansing processes to maintain data integrity, and proficient in pattern matching, regular expressions, XML, JSON, and other textual formats.

In this position, you will analyze textual and binary patent data, using regular expressions to extract data patterns. Writing clean, efficient, and maintainable code according to coding standards, automating tests, and unit-testing all assigned tasks are key responsibilities. You will collaborate closely with the Content Analysts team to design and implement mapping rules for data extraction from various file formats, and liaise with cross-functional teams to understand data requirements and provide technical support. Ideally, you would have experience with cloud-based data storage and processing solutions such as AWS, Azure, and Google Cloud, and a strong understanding of code versioning tools such as Git.

At Clarivate, you will be part of the Data Engineering team, collaborating with multiple squads comprising data engineers, testers, leads, and release managers to process and deliver high-quality patent data from diverse input source formats. This permanent position at Clarivate offers a hybrid work model with 9 hours of work per day, including a lunch break, providing a flexible and employee-friendly work environment.
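The regular-expression extraction work described above might look roughly like this in Python. The publication-number pattern here is a simplified, hypothetical format (two-letter country code, digits, kind code), not an actual patent-office specification:

```python
import re

# Illustrative pattern-matching step: pull publication-number-like tokens
# out of free text. The pattern is a simplified, invented format and would
# need tightening against real patent-data conventions.
PUB_NUMBER = re.compile(r"\b([A-Z]{2})\s?(\d{6,12})\s?([A-B]\d)\b")

def extract_pub_numbers(text):
    """Return normalized (whitespace-stripped) publication numbers found in text."""
    return ["".join(m.groups()) for m in PUB_NUMBER.finditer(text)]

found = extract_pub_numbers("Cited: US 20200123456 A1 and EP1234567 B1.")
```

Capturing the country code, serial, and kind code as separate groups keeps the normalization step trivial while preserving the parts for downstream mapping rules.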
Clarivate is dedicated to promoting equal employment opportunities for all individuals in terms of hiring, compensation, promotion, training, and other employment privileges. We adhere to applicable laws and regulations to ensure non-discrimination in all locations.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As an experienced Oracle FCCS (Financial Consolidation and Close Cloud Service) Implementation Consultant, you will be responsible for leading the design, deployment, and optimization of Oracle FCCS solutions for financial consolidation, intercompany eliminations, currency translation, and financial close processes. Your expertise in consolidation accounting, statutory reporting, GAAP/IFRS compliance, financial close automation, and data integration with ERP systems will be crucial in ensuring the smooth consolidation and reporting cycles. Your key responsibilities will include: - Leading end-to-end implementation of Oracle FCCS for financial consolidation and close processes. - Configuring FCCS dimensions, metadata, security, and consolidation rules based on business requirements. - Developing intercompany elimination rules, ownership structures, and multi-currency translation logic. - Customizing forms, dashboards, task lists, and Smart View reports for financial users. - Working closely with finance and accounting teams to optimize month-end and quarter-end close cycles. - Ensuring GAAP, IFRS, and statutory compliance in financial reporting and consolidation. - Configuring Data Management (DM/FDMEE) for data integration from ERP systems (Oracle Cloud, SAP, Workday, etc.). - Developing and optimizing business rules, calculation scripts, and Groovy scripts for complex consolidation logic. - Conducting end-user training sessions for finance, accounting, and audit teams. - Collaborating with cross-functional teams to integrate FCCS with other EPM applications (EPBCS, ARCS, EDMCS). To be successful in this role, you should have a Bachelor's degree in Finance, Accounting, Business, Information Systems, or a related field, along with 3 to 6 years of hands-on experience in Oracle FCCS implementation and consolidation accounting. Additionally, possessing Oracle FCCS Certification, CPA, CA, or equivalent accounting certification would be advantageous. 
Your technical skills should include proficiency in Smart View, Data Management (DM/FDMEE), and Essbase cube optimization, as well as experience with REST/SOAP APIs, SQL, and ETL tools for data integration. Strong communication, problem-solving, and stakeholder management skills are essential for effective collaboration with finance and IT teams. If you are self-motivated, able to manage multiple projects in a fast-paced environment, and have exposure to project management methodologies (Agile, Scrum, or Waterfall), we encourage you to join our team. Your contributions will play a key role in delivering innovative Oracle solutions that maximize operational excellence and benefits for our clients.
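The multi-currency translation logic mentioned in this listing follows a standard consolidation convention: balance-sheet accounts translate at the period-end (closing) rate, income-statement accounts at the period-average rate. A minimal sketch in Python, where the account names and rates are illustrative assumptions rather than real FCCS metadata or API calls:

```python
# Hedged sketch of consolidation-style currency translation.
# Rates and account classifications are hypothetical examples.

RATES = {"closing": 1.10, "average": 1.05}  # local -> reporting currency

ACCOUNT_TYPES = {
    "Cash": "balance_sheet",        # translated at the closing rate
    "Revenue": "income_statement",  # translated at the average rate
}

def translate(account: str, local_amount: float) -> float:
    """Translate a local-currency amount into the reporting currency."""
    rate_type = "closing" if ACCOUNT_TYPES[account] == "balance_sheet" else "average"
    return round(local_amount * RATES[rate_type], 2)

print(translate("Cash", 1000.0))     # 1100.0
print(translate("Revenue", 1000.0))  # 1050.0
```

In a real FCCS application this logic lives in the consolidation rules and rate dimensions; the sketch only shows the accounting convention the listing refers to.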
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
Inviting applications for the role of Manager - Global Event Tech Manager. Location: Gurugram. We welcome talent that will constantly push the boundaries for newer and better ways of doing things. The Global Event Tech Manager is a pivotal role in helping Genpact take growth to the next level through global marketing planning and execution designed to drive results. In this role, you will be responsible for the management and execution of in-person events (hosted and sponsored) as well as virtual events on different platforms. You must possess strong marketing, communications, and program management skills, with the ability to work seamlessly with peers across the marketing function and collaborate with colleagues to achieve common growth ambitions. Responsibilities: Event logistics
- Work with external vendors and suppliers to ensure timely and cost-effective event execution
- Initiate and oversee the sourcing process for MSAs and SOWs, including PRs/POs (purchase requests)
- Ensure brand compliance with all events and event materials
- Assist with event administration, invoicing, and logistical planning and management
- Support the acquisition process of branded merchandise through the online store and manage global orders from briefing to delivery
Event platform management
- Build and manage the project plan to support Program Owner priorities and objectives, and own delivery management across all workstreams related to Genpact's event experience platforms.
- Oversee the day-to-day administration of Cvent, handling all queries related to platform setup, management, tracking, and measurement
- Produce regular engagement reporting from the available dashboards and other report types on request
- Support the preparation of outcome communications to various stakeholders through regular report-outs
- Manage content on the existing interface, including adding/removing content and imagery related to events, partnering with relevant teams and GStudios
- Maintain guides and templates for customizing content, and coordinate with the event team and GTM leaders on content changes in different environments
- Coordinate new deployments and work on process requirements
- Manage user access, permissions, and training to ensure optimal use of Cvent across the organization
- Develop and maintain Cvent best practices, guidelines, and SOPs
- Make recommendations on the user experience based on best practices and performance
Salesforce & cross-tech integration:
- Design, develop, and maintain integrations between Cvent and Salesforce to ensure seamless data flow and synchronization
- Work with IT or third-party developers to integrate Cvent with other systems (e.g., CRM, marketing tools)
- Troubleshoot and resolve integration issues promptly to minimize downtime and data discrepancies
- Collaborate with the IT team to ensure secure and efficient data handling practices
Project management:
- Lead projects related to Cvent implementations, upgrades, and integrations with Salesforce
- Coordinate with cross-functional teams to gather requirements, plan, and execute projects
- Manage project timelines, resources, and deliverables, ensuring projects are completed on time and within budget
- Strong organizational skills with the ability to manage multiple events simultaneously
- Excellent verbal and written communication skills for coordinating with vendors, attendees, and internal teams
- Ability to present event plans and outcomes to stakeholders
Qualifications we seek in you! Minimum qualifications/skills:
- Years of experience with Cvent and relevant event management platforms, with at least years in an administrative or managerial role
- Certifications in Cvent (Cvent Certified Event Manager) and Salesforce (e.g., Salesforce Certified Administrator, Platform App Builder)
- Proven experience with Salesforce, including integration experience, preferably with Apex, Salesforce APIs, or middleware tools
- Application integration experience with years experience
- Strong understanding of data integration principles, ETL processes, and data warehousing
- Knowledge of event management processes and best practices
- Strong analytical and problem-solving skills
- Excellent communication and executive presence to connect at C-level
- Creative, resourceful, and takes initiative
- Strong project management skills
- Demonstrated ability to drive change and to effectively influence and motivate others
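The Cvent-to-Salesforce duties in this listing amount to pushing registration records between two REST APIs while tolerating transient failures. A minimal, hypothetical retry pattern in Python, using stdlib only; the record shape and the demo endpoint are illustrative assumptions, not the real Cvent or Salesforce APIs:

```python
import time

def sync_with_retry(send, record, max_attempts=3, backoff_s=0.0):
    """Push one event-registration record via `send`, retrying transient
    failures. `send` is any callable that raises ConnectionError on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return send(record)
        except ConnectionError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error to the caller
            time.sleep(backoff_s * attempt)  # linear backoff between tries

# Demo with a fake endpoint that fails once, then succeeds.
calls = {"n": 0}
def flaky_send(record):
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient outage")
    return {"status": "created", "id": "REG-001"}

result = sync_with_retry(flaky_send, {"attendee": "a@example.com"})
print(result)  # {'status': 'created', 'id': 'REG-001'}
```

A production integration would add authentication, idempotency keys, and dead-letter handling; the sketch only shows the retry discipline that keeps data discrepancies down.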
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant - ETL Testing, DB Automation. We are seeking a highly skilled Lead Consultant with expertise in ETL Testing and DB Automation to lead a team in delivering data-driven solutions. This role requires a deep understanding of ETL processes, data transformation, and automation frameworks. As a Lead Consultant, you will be responsible for overseeing the testing of ETL workflows, ensuring data quality, and implementing efficient database automation strategies. Responsibilities:
- Lead the design, execution, and optimization of ETL testing strategies to ensure high-quality, accurate, and reliable data transformations and loads.
- Develop, implement, and maintain automated testing frameworks and scripts for database automation to streamline testing and support CI/CD processes.
- Collaborate closely with cross-functional teams, including data engineers, developers, and business stakeholders, to understand requirements and ensure alignment with project goals.
- Troubleshoot, identify, and resolve data discrepancies, transformation errors, and failures in the ETL process, ensuring data integrity.
- Mentor and guide junior team members, offering expertise and training in ETL testing methodologies and database automation best practices.
- Ensure that ETL processes are tested for performance, scalability, and data accuracy, covering all relevant scenarios and edge cases.
- Define and implement best practices for ETL testing and database automation to improve efficiency, accuracy, and overall project delivery timelines.
- Provide regular updates and reports to stakeholders on testing progress, risk assessments, and quality metrics, ensuring timely delivery of high-quality data solutions.
Qualifications we seek in you! Minimum qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience in ETL testing and database automation.
- Expertise in test automation tools and frameworks (e.g., Selenium, SQL-based testing tools).
- Strong knowledge of data integration, data transformation, and data validation processes.
- Proficiency in SQL and database technologies (e.g., Oracle, SQL Server, MySQL).
- Experience with version control systems like Git and familiarity with CI/CD tools.
Preferred qualifications:
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex issues and drive resolution in a collaborative team environment.
- Exceptional communication and interpersonal skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders.
- Leadership experience, including leading teams, managing projects, and driving technical initiatives to successful completion.
- Certifications in relevant technologies (e.g., ISTQB, certifications) are a plus.
Job: Lead Consultant | Primary Location: India-Bangalore | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Apr 1, 2025, 9:15:09 PM | Unposting Date: Ongoing | Master Skills List: Consulting | Job Category: Full Time
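A typical ETL test of the kind this role describes reconciles the target against the source after a load, checking row counts and key totals. A minimal sketch using Python's built-in sqlite3 as a stand-in for the real source and target databases (the table and column names are illustrative assumptions):

```python
import sqlite3

# In-memory stand-ins for a source system and an ETL target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

# Simulate the ETL copy from source to target.
tgt.executemany("INSERT INTO orders VALUES (?, ?)",
                src.execute("SELECT id, amount FROM orders").fetchall())

def reconcile(a, b):
    """True when row count and amount total match between two databases."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM orders"
    return a.execute(q).fetchone() == b.execute(q).fetchone()

print(reconcile(src, tgt))  # True
```

The same count-and-checksum queries port directly to Oracle, SQL Server, or MySQL; an automation framework would run them per table after each load and fail the pipeline on any mismatch.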
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Senior AJO Developer, you will play a crucial role in designing, implementing, and optimizing customer journeys using Adobe Journey Optimizer (AJO). Your extensive experience in AJO will be essential in tailoring effective solutions to meet client needs, combining technical expertise with business insight. Your responsibilities will include collaborating with cross-functional teams to gather requirements, designing and implementing customer journeys, and ensuring project success. You will focus on optimizing customer journeys for performance and user experience, troubleshooting and resolving AJO-related issues, and providing technical guidance and mentorship to junior developers. Staying current with AJO trends and best practices, you will also conduct code reviews to maintain quality and adherence to standards, while documenting technical solutions and processes. The key skills and experience required for this role include a Bachelor's degree in Computer Science, IT, or a related field, along with over 5 years of software development experience and strong expertise in AJO. Experience integrating AEP with other technologies would be advantageous, as would proficiency in developing and maintaining AEP data models and integrations. Your experience with data integration and database management, excellent problem-solving skills, and attention to detail will be critical to success in this role.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience in designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will be responsible for designing efficient data architecture diagrams, developing and maintaining data models, and ensuring data integrity, quality, and security. You will also work on data processing, data integration, and building data pipelines to support various business needs. Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, as well as mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities. We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focusing on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics is required. Strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential. Excellent problem-solving, collaboration, and communication skills are also important for this role. As part of our team, you will have the opportunity to work on exciting projects across various industries like High-Tech, communication, media, healthcare, retail, and telecom. We offer a collaborative environment where you can expand your skills by working with a diverse team of talented individuals. 
GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees. Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
Hi, greetings from Savantis! We are hiring for one of our clients. Please go through the details below and let us know if you are interested in this opportunity. Job Title: EPM Analyst. Experience: 3-6 years. Job Location: Hybrid/Remote. Job Description:
- EPM software (development/support of a financial data model).
- Helping users get the most out of their EPM and BI applications.
- Maintenance of user administration.
- Maintain and further develop data models and reports.
- Back-end BI work such as connecting different data sources and setting up integrations.
- Modeling data into usable form.
- Front-end BI tools such as Tableau or Power BI.
- Retrieving business requirements and setting up reports and dashboards based on data models.
- Adjusting schedules and reports and making sure that these adjustments go live without problems.
- Preparing the system so users can start financial processes like month-end closing or forecasting.
- Helping users by analyzing and solving functional issues.
- Advising our users on performance issues.
Experience with one or more of the subjects below:
- Financial reporting systems, such as Anaplan, Fluence, Vena, OneStream, SAP BPC or Group Reporting, CCH Tagetik, or Oracle EPM.
- Microsoft stack (Excel, SQL, SSAS, Power BI)
- SAP BI/BW and SAC
- Agile way of working
- Frameworks like ITIL, ASL, BISL.
Mandatory skills:
- EPM Software: Experience with financial reporting systems like Anaplan, Fluence, Vena, OneStream, SAP BPC, CCH Tagetik, Oracle EPM, or Group Reporting.
- BI & Data Modeling: Power BI, Tableau, SAP BI/BW, SAC.
- Microsoft Stack: Excel, SQL, SSAS, Power BI.
- Financial Data Modeling: Development and support of financial models.
- Data Integration: Connecting data sources, setting up integrations.
- System Maintenance & Support: User administration, troubleshooting, and advising on performance issues.
- Agile & IT Frameworks: ITIL, ASL, BISL methodologies.
If you are interested in this opportunity, please share your updated CV.
Thanks & Regards, Lakshmi Tulasi Kanna, HR - Recruiter, M: lakshmitulasi.kanna@savantis.com
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Location: Pune/Nagpur. Immediate joiners only. Job Description - Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies.
- Work with large datasets, ensuring their availability, quality, and performance across systems.
- Implement data models and optimize storage and query performance on Snowflake.
- Write complex SQL queries for data extraction, transformation, and reporting purposes.
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features.
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency.
- Perform data migration, integration, and processing tasks across cloud platforms.
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies.
Required Skills:
- Snowflake: Expertise in building, optimizing, and managing data warehousing solutions on Snowflake.
- SQL: Strong knowledge of SQL for querying and managing relational databases, writing complex queries, stored procedures, and performance tuning.
- Python: Proficiency in Python for scripting, automation, and integration within data pipelines.
- Experience in developing and managing ETL processes and ensuring data accuracy and performance.
- Hands-on experience with data migration and integration processes across cloud platforms.
- Familiarity with data security and governance best practices.
- Strong problem-solving skills with the ability to troubleshoot and resolve data-related issues.
(ref:hirist.tech)
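Incremental pipeline work of the kind this listing describes usually reduces to a MERGE-style upsert into the warehouse. In Snowflake itself this would be a SQL MERGE statement; the sketch below uses Python's built-in sqlite3 (with its equivalent ON CONFLICT clause) purely as a runnable stand-in, and the table and row values are illustrative assumptions:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")

# Incoming batch: one update (id 1) and one new row (id 2).
batch = [(1, "Acme Corp"), (2, "Globex")]

# SQLite's UPSERT plays the role of Snowflake's MERGE here:
# matched keys are updated, unmatched keys are inserted.
db.executemany(
    "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
    batch,
)

rows = db.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme Corp'), (2, 'Globex')]
```

The pattern keeps incremental loads idempotent: replaying the same batch leaves the table unchanged, which is what makes pipeline retries safe.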
Posted 1 month ago
10.0 - 17.0 years
0 Lacs
hyderabad, telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access needs to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving. The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, Document DB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. 
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree being preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Stay updated on emerging trends in data technology, particularly in AI/ML applications for finance. Industry: IT Services and IT Consulting
Posted 1 month ago
9.0 - 12.0 years
20 - 35 Lacs
Gurugram
Work from Office
- Experience: 9+ years in solution and technical architecture with a strong software development background
- Cloud expertise: Minimum 7 years of experience in cloud application migration, especially AWS or Azure; hands-on experience with hybrid cloud design, AWS serverless, and containers
- Integration & APIs: Strong understanding of application/data integration, including Mulesoft, Apigee, API Connect, and Informatica; familiarity with SOA, REST APIs, microservices, ESB, and BPM
- Technology stack: Exposure to Node.js, Java frameworks, databases, queues, and event processing
- DevOps & automation: Working knowledge of CI/CD pipelines, testing, and automation best practices
- Domain knowledge: BFSI experience and understanding of Open Architecture is preferred
- Nice to have: Experience in AI and Data Analytics, and presenting architecture to senior stakeholders
Please share profiles of candidates who meet the above requirements and have delivered 2-3 hybrid projects in a technology organization.
Posted 1 month ago
3.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: SnapLogic. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements. Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.
Professional & Technical Skills:
- Must Have Skills: Strong experience in SnapLogic.
- Good To Have Skills: Experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.
Qualification: 15 years of full-time education
Posted 1 month ago
10.0 - 15.0 years
12 - 18 Lacs
Bengaluru
Work from Office
Mode: Contract. As an Azure Data Architect, you will:
- Lead architectural design and migration strategies, especially from Oracle to Azure Data Lake
- Architect and build end-to-end data pipelines leveraging Databricks, Spark, and Delta Lake
- Design secure, scalable data solutions integrating ADF, SQL Data Warehouse, and on-prem/cloud systems
- Optimize cloud resource usage and pipeline performance
- Set up CI/CD pipelines with Azure DevOps
- Mentor team members and align architecture with business needs
Qualifications:
- 10-15 years in Data Engineering/Architecture roles
- Extensive hands-on experience with Databricks, Azure Data Factory, and Azure SQL Data Warehouse
- Data integration, migration, cluster configuration, and performance tuning
- Azure DevOps and cloud monitoring tools
- Excellent interpersonal and stakeholder management skills.
Posted 1 month ago