3.0 years
0 Lacs
India
Remote
Title: Azure Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
• Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
• Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
• Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For
• 3+ years of experience in data engineering or analytics engineering
• Hands-on experience with cloud data platforms and large-scale data processing
• Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
• Minimum 3 years of experience in modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
• 5 years of proven experience with SQL, schema design, and dimensional data modelling
• Solid knowledge of data warehouse best practices, development standards, and methodologies
• Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
• Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
• An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment
• Excellent communication and teamwork abilities

Nice-to-Have Skills:
• Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge
• SAP ECC/S/4 and HANA knowledge
• Intermediate knowledge of Power BI
• Azure DevOps and CI/CD deployments, cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, disability, or any other federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
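For context, the day-to-day pipeline work this posting describes (PySpark on Databricks with a lakehouse layout) often resembles the minimal sketch below: ingest raw CSV files, apply basic cleaning, and append to a Delta table. This is a hedged illustration only; all paths, column names, and the table layout are hypothetical, and writing Delta format assumes a runtime with Delta Lake available (e.g., Databricks).

```python
# Minimal sketch: ingest raw CSVs, clean them, and append to a Delta table.
# Paths and columns are hypothetical; Delta support is assumed (Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("/mnt/landing/orders/"))              # hypothetical landing zone

clean = (raw
         .dropDuplicates(["order_id"])            # basic quality rule
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .filter(F.col("amount") > 0))

(clean.write
 .format("delta")
 .mode("append")
 .partitionBy("order_date")
 .save("/mnt/lakehouse/silver/orders"))           # hypothetical silver zone
```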
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
Kindly share your resume to lakshmi.b@iclanz.com or hr@iclanz.com

Position: Lead Data Engineer - Health Care domain
Experience: 7+ Years
Location: Hyderabad | Chennai | Remote

SUMMARY: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, they will maintain existing systems/processes and develop new features, along with reviewing, presenting, and implementing performance improvements.

Duties and Responsibilities
• Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies
• Monitor active ETL jobs in production
• Build out data lineage artifacts to ensure all current and future systems are properly documented
• Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
• Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies
• Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations
• Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs
• Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence and MDM solutions, including Data Lakes/Data Vaults

Required Skills
• This job has no supervisory responsibilities
• Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work
• 5+ years' experience with strong proficiency in SQL query/development skills
• Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
• Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
• Experience working in the healthcare industry with PHI/PII
• Creative, lateral, and critical thinker; excellent communicator with well-developed interpersonal skills
• Good at prioritizing tasks and time management; ability to describe, create, and implement new solutions
• Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
• Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
• Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)

Details Required for Submission:
• Requirement Name:
• First Name / Last Name:
• Email id:
• Best Number:
• Current Organization / Previous Organization you worked at (last date):
• Currently working on a project:
• Total Experience:
• Relevant Experience (years of experience and rating out of 10): Data Engineer: | ETL: | Healthcare (PHI/PII): | Fivetran: | dbt:
• LinkedIn profile:
• Comfortable working from 3:00 pm to 12:00 am IST?
• Communication:
• Education Details – Degree & year of passing:
• Notice Period:
• Vendor Company Name: iClanz Inc
• Expected Salary:
• Current Location / Preferred Location:
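The Fivetran-plus-dbt flow named above is usually orchestrated as "trigger the connector sync, then run the downstream models." The sketch below shows one hedged way to do that from Python. The connector id, credentials, and dbt selector are hypothetical, and the Fivetran sync endpoint path is an assumption that should be checked against the current Fivetran REST API documentation; a production job would also poll for sync completion before running dbt.

```python
# Hedged orchestration sketch: trigger a Fivetran connector sync, then run
# the downstream dbt models. Connector id, credentials, and the model
# selector are hypothetical; verify the endpoint against Fivetran's docs.
import subprocess
import requests

FIVETRAN_API = "https://api.fivetran.com/v1"
CONNECTOR_ID = "my_connector_id"          # hypothetical
AUTH = ("api_key", "api_secret")          # hypothetical credentials

def trigger_sync() -> None:
    # Ask Fivetran to start a sync for one connector (assumed endpoint).
    resp = requests.post(f"{FIVETRAN_API}/connectors/{CONNECTOR_ID}/sync",
                         auth=AUTH, timeout=30)
    resp.raise_for_status()

def run_dbt_models() -> None:
    # Run the downstream dbt models once new data has landed.
    # In practice, poll the connector status before this step.
    subprocess.run(["dbt", "run", "--select", "staging+"], check=True)

if __name__ == "__main__":
    trigger_sync()
    run_dbt_models()
```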
Posted 1 week ago
9.0 - 14.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Senior Programmer Analyst (data engineering senior programmer analyst) is an intermediate-level position responsible for participating in the establishment and implementation of new or revised data platform ecosystems and programs in coordination with the Technology team. The overall objective of this role is to contribute to the data engineering scrum team implementing the business requirements.

Responsibilities:
• Build and maintain batch or real-time data pipelines in the data platform
• Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources
• Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources
• Monitor and control all phases of the development process (analysis, design, construction, testing, and implementation) and provide user and operational support on applications to business users
• Automate data workflows such as data ingestion, aggregation, and ETL processing
• Prepare raw data in Data Warehouses into consumable datasets for both technical and non-technical stakeholders
• Build, maintain, and deploy data products for analytics and data science teams on the data platform
• Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures
• Monitor data systems performance and implement optimization solutions
• Operate with a limited level of direct supervision, exercising independence of judgement and autonomy
• Act as SME to senior stakeholders and/or other team members; serve as advisor or coach to new or lower-level analysts
• Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Qualifications:
• 9 to 14 years of relevant experience in a data engineering role
• Advanced SQL/RDBMS skills and experience with relational databases and database design
• Strong proficiency in object-oriented languages: Python and PySpark are a must
• Experience working with big data: Hive/Impala/S3/HDFS
• Experience working with data ingestion tools such as Talend or Ab Initio
• Nice to have: experience with data lakehouse architectures such as AWS Cloud/Airflow/Starburst/Iceberg
• Strong proficiency in scripting languages like Bash, UNIX shell scripting
• Strong proficiency in data pipeline and workflow management tools
• Strong project management and organizational skills
• Excellent problem-solving, communication, and organizational skills
• Proven ability to work independently and with a team
• Experience in managing and implementing successful projects
• Ability to adjust priorities quickly as circumstances dictate
• Consistently demonstrates clear and concise written and verbal communication

Education:
• Bachelor's degree/University degree or equivalent experience

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
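The posting above asks for proficiency with data pipeline and workflow management tools, naming Airflow among the lakehouse stack. As a hedged sketch only (the DAG id, schedule, and task commands are invented, and the `schedule` parameter assumes Airflow 2.4+), a minimal DAG wiring an ingest-transform-load sequence might look like this:

```python
# Hedged sketch: a minimal Airflow DAG for an ingest -> transform -> load
# sequence. DAG id, schedule, and task commands are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingest",                 # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",                  # nightly at 02:00 (Airflow 2.4+)
    catchup=False,
) as dag:
    ingest = BashOperator(task_id="ingest",
                          bash_command="python ingest.py")        # placeholder
    transform = BashOperator(task_id="transform",
                             bash_command="spark-submit transform.py")
    load = BashOperator(task_id="load",
                        bash_command="python load_warehouse.py")

    ingest >> transform >> load            # linear dependency chain
```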
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer – C10/Officer (India)

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
• Developing and supporting scalable, extensible, and highly available data solutions
• Delivering on critical business priorities while ensuring alignment with the wider architectural vision
• Identifying and helping address potential risks in the data supply chain
• Following and contributing to technical standards
• Designing and developing analytical data models

Required Qualifications & Work Experience
• First Class Degree in Engineering/Technology (4-year graduate course)
• 3 to 4 years' experience implementing data-intensive solutions using agile methodologies
• Experience with relational databases and using SQL for data querying, transformation, and manipulation
• Experience modelling data for analytical consumers
• Ability to automate and streamline the build, test, and deployment of data pipelines
• Experience in cloud-native technologies and patterns
• A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
• Excellent communication and problem-solving skills

Technical Skills (Must Have)
• ETL: Hands-on experience building data pipelines; proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
• Big Data: Exposure to 'big data' platforms such as Hadoop, Hive, or Snowflake for data storage and processing
• Data Warehousing & Database Management: Understanding of data warehousing concepts and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
• Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
• Languages: Proficiency in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala
• DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management

Technical Skills (Valuable)
• Ab Initio: Experience developing Co>Op graphs and ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
• Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of the underlying architectures and trade-offs
• Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls
• Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
• File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
• Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
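The "File Formats" bullet in the posting above names Parquet and related columnar formats. Purely as an illustration (the schema, values, and file name are invented), writing and reading Parquet from Python with PyArrow looks like this:

```python
# Hedged sketch: write a small table to Parquet and read it back with
# PyArrow. Schema, values, and the file name are illustrative only.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "trade_id": [1, 2, 3],
    "symbol": ["AAPL", "MSFT", "GOOG"],
    "qty": [100, 250, 75],
})

pq.write_table(table, "trades.parquet")      # columnar, compressed on disk

round_trip = pq.read_table("trades.parquet")
print(round_trip.to_pydict())
```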
Posted 1 week ago
2.0 - 6.0 years
3 - 8 Lacs
Pune, Sangli
Work from Office
We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value.

Key Responsibilities:
• Design, build, and maintain ETL pipelines using Talend Data Integration
• Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes
• Ensure data integrity, quality, and performance tuning in ETL workflows
• Implement job scheduling, logging, and exception handling using Talend and orchestration tools
• Prepare and transform large datasets for analytics and machine learning use cases
• Build and deploy data pipelines that feed predictive models and business intelligence platforms
• Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale
• Assist in feature engineering, data labeling, and model monitoring processes

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field
• 3+ years of experience in ETL development, with at least 2 years using Talend
• Proficiency in SQL and Python (for data transformation or automation)
• Hands-on experience with data integration, data modeling, and data warehousing
• Must have strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud
• Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus
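Talend jobs are assembled in its graphical designer rather than hand-written, but the extract-transform-load flow the posting describes can be illustrated with a small, hedged Python sketch; the source file, columns, and target table below are hypothetical.

```python
# Small ETL sketch in plain Python: extract from CSV, apply transformations,
# load into a SQLite table. Source file, columns, and table are made up.
import sqlite3
import pandas as pd

# Extract: read raw records from a flat file.
raw = pd.read_csv("sales_raw.csv")            # hypothetical source

# Transform: normalize column names, drop bad rows, derive a field.
raw.columns = [c.strip().lower() for c in raw.columns]
clean = raw.dropna(subset=["order_id", "amount"])
clean = clean[clean["amount"] > 0].copy()
clean["order_date"] = pd.to_datetime(clean["order_date"]).dt.date

# Load: append into a warehouse-style table.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("fact_sales", conn, if_exists="append", index=False)
```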
Posted 1 week ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview
We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centres. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise.

Key Responsibilities
Governance Leadership
• Define and implement the enterprise BI governance strategy, policies, and operating model
• Drive consistent governance processes across sectors and global capability centers
• Set standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls
Stakeholder Management
• Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities
• Lead governance councils, working groups, and decision forums to drive adoption and compliance
Policy and Compliance
• Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control
• Implement approval and exception processes for BI development outside the COE
Demand and Intake Governance
• Lead the governance of BI demand intake and prioritization processes
• Ensure transparency and traceability of BI requests and outcomes across business units
Metrics and Continuous Improvement
• Define KPIs and dashboards to monitor BI governance maturity and compliance
• Identify areas for process optimization and lead continuous improvement efforts

Qualifications
• Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity
• Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks
• Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals
• Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility
• Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a Spark, Big Data - ETL Tech Lead for Commercial Card's Global Data Repository development team. The successful candidate will interact with the Development Project Manager; the development, testing, and production support teams; and other departments within Citigroup (such as the System Administrators, Database Administrators, Data Centre Operations, and Change Control groups) for TTS platforms. The role requires exceptional communication skills across both technology and the business and will have a high degree of visibility. The candidate will be a rigorous technical lead with a strong understanding of how to build scalable, enterprise-level global applications. The ideal candidate will be a dependable and resourceful software professional who can comfortably work in a large development team in a globally distributed, dynamic work environment that fosters diversity, teamwork, and collaboration. The ability to work in a high-pressure environment is essential.

Responsibilities:
• Lead the design and implementation of large-scale data processing pipelines using Apache Spark on a Big Data Hadoop platform
• Develop and optimize Spark applications for performance and scalability
• Provide technical leadership for multiple large-scale/complex global software solutions
• Integrate data from various sources, including Couchbase, Snowflake, and HBase, ensuring data quality and consistency
• Experience developing teams of 5 to 15 developers, including permanent employees and vendors
• Build and sustain strong relationships with the senior business leaders associated with the platform
• Design, code, test, document, and implement application release projects as part of the development team
• Work with onsite development partners to ensure design and coding best practices
• Work closely with Program Management and Quality Control teams to deliver quality software to agreed project schedules
• Proactively notify the Development Project Manager of risks, bottlenecks, problems, issues, and concerns
• Comply with Citi's System Development Lifecycle and Information Security requirements
• Oversee development scope, budgets, and timeline documents
• Monitor, update, and communicate project timelines and milestones; obtain senior management feedback; understand potential speed bumps and the client's true concerns/needs
• Stay updated with the latest trends and technologies in big data and cloud computing
• Mentor and guide junior developers, providing technical leadership and expertise

Key Challenges:
• Managing time and changing priorities in a dynamic environment
• Providing quick turnaround on software issues and management requests
• Assimilating key issues and concepts and coming up to speed quickly

Qualifications:
• Bachelor's or master's degree in Computer Science, Information Technology, or equivalent
• Minimum 10 years of proven experience in developing and managing big data solutions using Apache Spark, with a strong hold on Spark Core, Spark SQL, and Spark Streaming
• Minimum 6 years of experience successfully leading globally distributed teams
• Strong programming skills in Scala, Java, or Python
• Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, Flume, etc.
• Proficiency in SQL and experience with relational (Oracle/PL-SQL) and NoSQL databases like MongoDB
• Demonstrated people and technical management skills
• Demonstrated excellent software development skills
• Strong experience implementing complex file transformations such as positional files and XMLs
• Experience building enterprise systems with a focus on recovery, stability, reliability, scalability, and performance
• Experience working on Kafka and JMS/MQ applications
• Experience working with multiple operating systems (Unix, Linux, Windows)
• Familiarity with data warehousing concepts and ETL processes
• Experience in performance tuning of large technical solutions with significant volumes
• Knowledge of data modeling, data architecture, and data integration techniques
• Knowledge of best practices for data security, privacy, and compliance

Key Competencies:
• Excellent organization skills, attention to detail, and ability to multi-task
• Demonstrated sense of responsibility and capability to deliver quickly
• Excellent communication skills; clearly articulating and documenting technical and functional specifications is a key requirement
• Proactive problem-solver
• Relationship builder and team player
• Negotiation, difficult conversation management, and prioritization skills
• Flexibility to handle multiple complex projects and changing priorities
• Excellent verbal, written, and interpersonal communication skills
• Good analytical and business skills
• Promotes teamwork and builds strong relationships within and across global teams
• Promotes continuous process improvement, especially in code quality, testability, and reliability

Desirable Skills:
• Experience in Java, Spring, and ETL tools like Talend or Ab Initio is a plus
• Experience migrating functionality from ETL tools to Spark
• Experience/knowledge of cloud technologies such as AWS and GCP
• Experience in the financial industry
• ETL certification, project management certification
• Experience with Commercial Cards applications and processes would be advantageous
• Experience with Agile methodology

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
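As a rough, hedged illustration of the Spark Streaming plus Kafka combination the qualifications above call for (not code from the posting; the broker address, topic, and event schema are invented, and the job assumes the spark-sql-kafka package is on the classpath), a Structured Streaming aggregation might look like this:

```python
# Hedged sketch: consume JSON events from Kafka with Spark Structured
# Streaming and maintain a running per-card spend total on the console.
# Broker, topic, and schema are hypothetical; requires spark-sql-kafka.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("card-txn-stream").getOrCreate()

schema = (StructType()
          .add("card_id", StringType())
          .add("amount", DoubleType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical
          .option("subscribe", "card-transactions")           # hypothetical
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

totals = events.groupBy("card_id").agg(F.sum("amount").alias("total_spend"))

query = (totals.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```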
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
We are seeking an experienced Integration Developer with hands-on experience to join our development team. In this role you will be responsible for designing, developing, and maintaining integration solutions that connect various systems and applications within the organization. You will work with a range of technologies to ensure seamless data exchange, workflow automation, and overall system interoperability.

Responsibilities
The ideal candidate will have a solid understanding of integration patterns, API development, and cloud-based solutions, and will be able to collaborate effectively with cross-functional teams.
• Design and develop robust integration solutions using middleware technologies, APIs, and services to ensure smooth communication between various internal and external systems
• Create and maintain data flows, transformations, and mappings between different platforms, databases, and applications
• Leverage enterprise integration patterns and best practices to implement scalable, secure, and high-performing integrations
• Design and implement RESTful APIs, web services, and microservices for integrating applications across various platforms
• Work with API gateways and manage the API lifecycle from design to deployment, ensuring security, versioning, and performance optimization
• Integrate third-party APIs and services into existing systems, ensuring seamless functionality and data exchange
• Ensure data synchronization and integration between disparate systems (CRM, ERP, HRMS, cloud platforms, etc.) while maintaining data consistency and quality
• Design, develop, and implement ETL (Extract, Transform, Load) processes for efficient data migration and integration between systems
• Troubleshoot and resolve integration issues, ensuring minimal downtime and impact to business operations
• Work closely with business analysts, project managers, and other developers to gather requirements and deliver integration solutions aligned with business needs
• Collaborate with infrastructure and cloud teams to design and implement scalable integration solutions leveraging cloud platforms (e.g., AWS, Azure, GCP)
• Coordinate with QA teams to ensure thorough testing of integration components, ensuring they meet performance, security, and functional requirements
• Contribute to the development of integration best practices and guidelines to ensure consistent, high-quality solutions across the organization
• Act as a key point of contact for troubleshooting integration issues, providing timely resolution and post-mortem analysis for recurring problems
• Provide support to operational teams for maintaining the health of integration solutions in production environments

Skills
• Experience developing and implementing integration solutions using middleware technologies such as MuleSoft, Dell Boomi, Apache Camel, or IBM Integration Bus (IIB)
• Strong experience with RESTful APIs, SOAP web services, and microservices
• Proficiency in integration technologies like JMS, Kafka, and RabbitMQ for message-driven integrations
• Experience with ETL tools (e.g., Talend, Informatica) for data transformation and loading
• Strong knowledge of SQL and experience working with relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB)
• Familiarity with cloud-based integration solutions (AWS, Azure, GCP), API management platforms (e.g., Apigee, Kong, or AWS API Gateway), and containerization technologies (e.g., Docker, Kubernetes)
• Skills in programming languages such as Java, Python, JavaScript, or similar for building integration solutions
• Experience with scripting languages for automating integration tasks (e.g., Bash, PowerShell, Python)
• Experience debugging integration problems, identifying root causes, and implementing corrective actions
• Excellent verbal and written communication skills, with the ability to document integration solutions, processes, and guidelines clearly
• Good to have: Telecom domain knowledge
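Message-driven integration with RabbitMQ, one of the brokers listed above, can be sketched in a few lines of Python with the pika client. This is an illustration under stated assumptions: a broker on localhost, and a queue name and payload that are placeholders.

```python
# Hedged sketch: publish and consume a message through RabbitMQ using pika.
# Assumes a broker on localhost; queue name and payload are placeholders.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="orders", durable=True)   # idempotent declare

# Producer side: push an event onto the queue.
channel.basic_publish(
    exchange="",
    routing_key="orders",
    body=json.dumps({"order_id": 42, "status": "created"}),
)

# Consumer side: handle messages and acknowledge each one.
def on_message(ch, method, properties, body):
    print("received:", json.loads(body))
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="orders", on_message_callback=on_message)
channel.start_consuming()    # blocks; interrupt to stop in a demo
```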
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Process Manager - GCP Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings: EMEA (1pm-9pm) | Management Level: PM | Travel Requirements: NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed and proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager Roles and Responsibilities
• Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows
• Analyse business problems and propose data-driven solutions that meet stakeholder objectives
• Experience working on-premises as well as on cloud platforms (AWS/GCP/Azure)
• Extensive experience in GCP with a strong focus on BigQuery, with responsibility for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS & Azure)
• Design and implement robust data models to efficiently store, organize, and access data for diverse use cases
• Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources
• Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional)
• Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency

Technical and Functional Skills
• Bachelor's Degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery
• Good knowledge of at least one database scripting platform (Oracle preferred)
• Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-production activities
• 5+ years of work experience as an individual contributor
• Direct interaction and deep dives with VPs on deployment
• Should work with cross-functional teams/stakeholders
• Participate in backlog grooming and prioritizing tasks
• Worked with Scrum methodology
• GCP certification desired

About eClerx
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth.
With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities.

To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
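The BigQuery-centred work described above typically runs through Google's official Python client. A minimal, hedged sketch follows; the project, dataset, table, and query are invented, and credentials are assumed to come from the environment.

```python
# Hedged sketch: run a parameterized query against BigQuery with the
# official google-cloud-bigquery client. Project, dataset, and table
# names are hypothetical; credentials come from the environment.
from google.cloud import bigquery

client = bigquery.Client()   # uses GOOGLE_APPLICATION_CREDENTIALS

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.sales.orders`        -- hypothetical table
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```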
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for a Senior Developer to become a key player in our Bangalore team. If you're a seasoned development pro ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.

Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.

This position will be part of Sapiens' Digital (Data Suite) division. For more information about it, click here: https://sapiens.com/solutions/digitalsuite-customer-experience-and-engagement-software-for-insurers/

Job Description
What you'll do:
• Collaborate with business users to understand and refine ETL requirements and business rules for effective solution implementation
• Design, develop, implement, and optimize ETL processes to meet business and technical needs
• Troubleshoot and resolve ETL-related issues, ensuring system performance and reliability
• Create and execute comprehensive unit test plans based on system and validation requirements to ensure the quality of the solutions
• Provide ongoing support and consultation for the development and enhancement of technical solutions across various business functions

Primary Skills
What to have for this position:
• Strong understanding of advanced ETL concepts, as well as the administration activities required to support R&D and project needs
• Extensive experience with ETL tools and advanced transformations, particularly Talend and Java
• Ability to effectively troubleshoot and resolve complex ETL coding and administrative issues

Secondary Skills
• Experience designing and developing fully interactive dashboards, including storylines, drill-down functionality, and linked visualizations
• Ability to design and optimize tables, views, and DataMarts to support dynamic and efficient dashboards
• Proficiency in proposing and implementing data load strategies that enhance performance and improve data visualizations
• Expertise in performance tuning for SQL, ETL processes, and reports

Process Knowledge
• Experience in data validation and working with cross-functional teams (including Business Analysts and Business Users) to clarify and define business requirements
• Ability to develop ETL mappings, specifications (LLDs/HLDs), and data load strategies with minimal supervision
• Understanding of SDLC methodologies, including Agile, and familiarity with tools such as JIRA for project management and issue tracking

Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds.

Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution.
In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
We are seeking a skilled and proactive Data Engineer with a strong background in ETL development and a focus on integrating data quality frameworks. In this role, you will be responsible for designing, developing, and maintaining ETL pipelines while ensuring data quality is embedded throughout the process. You will play a crucial role in building robust and reliable data pipelines that deliver high-quality data to our data warehouse and other systems.

Responsibilities
• Design, develop, and implement ETL processes to extract data from various source systems, transform it according to business requirements, and load it into target systems (e.g., data warehouse, data lake)
• Implement data validation and error handling within ETL pipelines
• Build and maintain scalable, reliable, and efficient data pipelines
• Design and implement data quality checks, validations, and transformations within ETL processes
• Automate data quality monitoring, alerting, and reporting within ETL pipelines
• Develop and implement data quality rules and standards within ETL processes
• Integrate data from diverse sources, including databases, APIs, flat files, and cloud-based systems
• Utilize ETL tools and technologies (e.g., SnapLogic, Informatica PowerCenter, Talend, AWS Glue, Apache Airflow, Azure Data Factory, etc.)
• Write SQL queries to extract, transform, load, and validate data
• Use scripting languages (e.g., Python) to automate ETL processes, data quality checks, and data transformations
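As an illustration of the in-pipeline data quality checks this posting emphasizes (a hedged sketch only; the rule thresholds and column names are invented), a validation step in Python might look like this:

```python
# Hedged sketch: simple rule-based data quality checks applied inside an
# ETL step using pandas. Column names and thresholds are hypothetical.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; empty means all passed."""
    failures = []
    if df["customer_id"].isna().any():
        failures.append("customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        failures.append("customer_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    null_ratio = df["email"].isna().mean()
    if null_ratio > 0.05:                      # hypothetical tolerance
        failures.append(f"email null ratio {null_ratio:.1%} exceeds 5%")
    return failures

df = pd.read_csv("customers_extract.csv")      # hypothetical extract
problems = run_quality_checks(df)
if problems:
    raise ValueError("Data quality failed: " + "; ".join(problems))
```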
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description
REQUIREMENTS:
• Total experience: 5+ years
• Hands-on working experience in data engineering
• Strong working experience in SQL and Python or Scala
• Deep understanding of cloud design patterns and their implementation
• Experience working with Snowflake as a data warehouse solution
• Experience with Power BI data integration
• Design, develop, and maintain scalable data pipelines and ETL processes
• Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms)
• Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.)
• Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar
• Strong problem-solving skills and a passion for continuous improvement
• Strong communication skills and the ability to collaborate effectively with cross-functional teams

RESPONSIBILITIES:
• Writing and reviewing great quality code
• Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements
• Mapping decisions with requirements and translating the same to developers
• Identifying different solutions and narrowing down the best option that meets the clients' requirements
• Defining guidelines and benchmarks for NFR considerations during project implementation
• Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers
• Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed
• Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it
• Understanding and relating technology integration scenarios and applying these learnings in projects
• Resolving issues raised during code reviews through exhaustive, systematic analysis of the root cause, and justifying the decisions taken
• Carrying out POCs to make sure that suggested design/technologies meet the requirements

Qualifications
Bachelor's or master's degree in computer science, Information Technology, or a related field.
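Since the role pairs Snowflake with Python, a minimal connection-and-query sketch with the snowflake-connector-python package may be useful context. The account, credentials, and object names below are placeholders, not details from the posting.

```python
# Hedged sketch: query Snowflake from Python with the official connector.
# Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder
    user="etl_user",             # placeholder
    password="***",              # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT region, COUNT(*) FROM orders GROUP BY region")
    for region, n in cur.fetchall():
        print(region, n)
finally:
    conn.close()
```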
Posted 1 week ago
12.0 years
4 - 6 Lacs
Hyderābād
On-site
Job Description
About the job

Who We Are Looking For
The ideal candidate will have a deep techno-functional background working with major fund management and/or banking software applications, preferably within a front-office environment. They will also have a deep understanding of front-office terminology and operations, and proven leadership and managerial experience leading teams and liaising with cross-functional groups. This is a senior role with responsibility for managing and expanding the team based in India. The SaaS Support Manager will manage teams of Production Support Analysts at a variety of levels while holding them accountable for the successful monitoring, troubleshooting, and resolution of the issues faced by our clients. In addition, the Manager will work with multiple cross-functional teams to take ownership and accountability for the team's delivery.

Why this role is important to us
The team you will be joining is a part of Charles River Development (CRD), which became a part of State Street in 2018. CRD helps create enterprise investment management software solutions for large institutions in the areas of institutional investment, wealth management, and hedge funds. Together we have created the first open front-to-back platform, State Street Alpha, which was launched in 2019. Join us if delivering front-office software solutions and next-generation infrastructure, using emerging technologies like AI, sounds like a challenge you are up for.

What You Will Be Responsible For
Manage an existing, growing SaaS team supporting and enhancing our clients' operational use of the Charles River Investment Management System and Alpha Data Platform. The role requires strong management, communication, and leadership skills as well as hands-on client issue management, production support, product enhancements, client relations, and staff hiring/development.

Strategic Responsibilities:
• Identify opportunities to improve the service and delivery model and work with staff, peers, and management to improve and enhance the services
• Act as an escalation point for Production Support Analysts on client or internal operational issues
• Support and enhance clients' ROI from CRIMS and ADP
• Support and facilitate Charles River's business objectives

Operations:
• Escalate issues to senior management using sound judgement
• Troubleshooting and issue resolution; hands-on management of key clients
• Monitor the queue to ensure accuracy, completeness, timeliness, and proper progress
• Mentor junior team members through issues and major processes; oversee issue resolution and support for team members
• Daily operational monitoring, support, and resolution for regional clients as well as other clients as part of the global team and follow-the-sun support model
• Accountable for ensuring defined processes are followed; identify issues in existing processes
• Occasional off-hours / weekend support
• Service partner communications and coordination
• Clear and accountable reporting to senior management
• Manage the team to provide reliable, mission-critical software-based operations and support
• Provide day-to-day performance management and support for subordinates
• Work with CRD colleagues to smoothly transition new clients to production operation through the onboarding process
• Work with vendors as partners to deliver a tightly integrated, seamless service to Charles River clients
Technical Responsibilities
• Hands-on capability to undertake basic investigative activities on RDBMS, such as writing and running queries and data investigations, basic performance diagnosis, and basic administration functions
• Familiarity with common technology platforms and the ability to guide and check technical approaches and issue resolution. Platforms used include RDBMS, C#, Java, FIX, Perl, ETL, Talend, PL/Transact SQL, Web Admin, and Web Services
• Facilitate and manage issue diagnosis and resolution
• Oversee technology management and issue resolution

Skills, Experience & Qualifications Required:
• Track record of leadership in a high-growth, mission-critical software-based operations and support environment
• Ability to manage staff through multiple projects and tasks with changing deadlines
• Excellent customer service skills and expectation management
• Project and staff management and mentoring skills
• Strong interpersonal, verbal, and written communication skills
• Partner liaison (internal and external)
• Bachelor's degree
• 12+ years' experience managing and supporting mission-critical production software applications
• 5+ years' experience in a management/leadership role
• 10+ years of database software or IT technical experience
• Experience with any of the following software is desired, in order of preference: 1) Charles River IMS, Enterprise Data Management Systems; 2) non-Charles River Trade Order Management software; 3) middle- or back-office trading applications; or 4) financial applications
• Experience or familiarity with servers and networks is preferred
• Experience with the following is preferred: databases, server operating systems, servers, networks, job scheduling software, system monitoring software

About State Street
What we do. State Street is one of the largest custodian banks, asset managers and asset intelligence companies in the world. From technology to product innovation we're making our mark on the financial services industry. For more than two centuries, we've been helping our clients safeguard and steward the investments of millions of people. We provide investment servicing, data & analytics, investment research & trading and investment management to institutional clients.

Work, Live and Grow. We make all efforts to create a great work environment. Our benefits packages are competitive and comprehensive. Details vary by location, but you may expect generous medical care, insurance and savings plans, among other perks. You'll have access to a flexible Work Program to help you match your needs. And our wealth of development programs and educational support will help you reach your full potential.

Inclusion, Diversity and Social Responsibility. We truly believe our employees' diverse backgrounds, experiences and perspectives are a powerful contributor to creating an inclusive environment where everyone can thrive and reach their maximum potential while adding value to both our organization and our clients. We warmly welcome candidates of diverse origin, background, ability, age, sexual orientation, gender identity and personality. Another fundamental value at State Street is active engagement with our communities around the world, both as a partner and a leader. You will have tools to help balance your professional and personal life, paid volunteer days, a matching gift program and access to employee networks that help you stay connected to what matters to you.

State Street is an equal opportunity and affirmative action employer. Discover more at StateStreet.com/careers
Posted 1 week ago
5.0 - 8.0 years
6 - 9 Lacs
Hyderābād
On-site
Engineer, Software Engineering
Hyderabad, India | Information Technology | 311642

Job Description
About The Role: Grade Level (for internal use): 09

The Team: We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy, promoting innovation. One should feel empowered to iterate over ideas and experiment without fear of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities:
• We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Marketplace product
• The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of web applications, data pipelines, big data, machine learning, and multi-cloud
• The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing
• Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests
• Past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible, and Prometheus or related cloud technologies
• Good understanding of single-, hybrid-, and multi-cloud architecture, preferably with hands-on experience
• Active participation in all scrum ceremonies; follow Agile best practices effectively
• Play a key role in the development team to build high-quality, high-performance, scalable code
• Produce technical design documents and conduct technical walkthroughs
• Document and demonstrate solutions using technical design docs, diagrams, and stubbed code
• Collaborate effectively with technical and non-technical stakeholders
• Respond to and resolve production issues

What we are looking for:
• Minimum of 5-8 years of significant experience in application development
• Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development
• Experience working with high-volume data and computationally intensive systems
• Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must
• Proficiency in the development environment, including IDE, web and application servers, GIT, continuous integration, unit-testing tools, and defect management tools
• Domain knowledge in the financial industry and capital markets is a plus
• Excellent communication skills are essential, with strong verbal and writing proficiencies
• Mentor teams, innovate and experiment, give face to business ideas, and present to key stakeholders

Required technical skills:
• Build data pipelines
• Utilize platforms like Snowflake, Talend, Databricks, etc.
• Utilize cloud managed services like AWS Step Functions, AWS Lambda, AWS DynamoDB
• Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow
• Develop federated data services to provide scalable and performant data APIs: REST, GraphQL, OData
• Write infrastructure as code to develop sandbox environments
• Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
• Feed data at scale to clients that are geographically distributed
• Experience building sophisticated and highly automated infrastructure
• Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc.
• Demonstrated ability to adapt to new technologies and learn quickly

Desirable technical skills: Java, Spring Boot, React, HTML/CSS, API development, microservices pattern, cloud technologies and managed services (preferably AWS), Big Data and Analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
• Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings (Strategic Workforce Planning)

Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
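To ground the pipeline skills this posting lists (Airflow, AWS managed services, data pipelines), here is a minimal, hedged sketch of a daily Airflow DAG. It assumes Airflow 2.4+ for the `schedule` parameter; the DAG id, table name, and bucket are invented for illustration and the extract step is stubbed.

```python
# A minimal Airflow DAG sketch: extract one day's data and land it in object storage.
# Names (marketplace_prices, s3://example-bucket) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_land(**context):
    """Pull one day's records and write them to object storage (stubbed)."""
    ds = context["ds"]  # Airflow-provided execution date, e.g. "2025-06-02"
    # A real task would call the source API here and write the result to S3.
    print(f"extracting marketplace_prices for {ds} -> s3://example-bucket/{ds}/")


with DAG(
    dag_id="marketplace_prices_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    land = PythonOperator(task_id="extract_and_land", python_callable=extract_and_land)
```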
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad
Remote
Company Profile: At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: Microsoft Power BI
Position: LA
Experience: 7+ years
Category: Software Development/Engineering
Main location: Hyderabad
Position ID: J0425-0317
Employment Type: Full Time

Job Description: Microsoft Power BI
This role involves migrating existing Tableau dashboards and datasets to Power BI. The individual will be responsible for the technical design, development, testing, and deployment of Power BI solutions. They will work closely with stakeholders to ensure a smooth transition and maintain data accuracy and consistency throughout the migration process.

Key Responsibilities:
Prepare detailed technical specification documents for each dashboard.
Convert Tableau datasets to Power BI datasets, including data modeling and optimization.
Develop visuals in Power BI and apply filters.
Test for data accuracy against existing Tableau reports.
Develop navigation within dashboards and ensure a uniform UI experience across sheets.
Publish and test dashboards on the Power BI Service.
Provide UAT (User Acceptance Testing) support.
Update documentation as needed.
Manage the release process, including moving to production.
Provide post-production support and hypercare.
Collaborate with Talend and BigQuery support teams.
Participate in project management activities, including status reporting and daily stand-ups.

Required Skills and Experience:
7+ years of overall experience, with 5+ years relevant to Power BI.
Strong experience with Power BI development and data modeling.
Familiarity with Tableau and its functionalities.
Understanding of data warehousing concepts and STAR schema modeling.
Experience with data validation and testing.
Ability to create technical documentation.
Excellent communication and collaboration skills.
Behavioural Competencies:
Proven experience of delivering process efficiencies and improvements.
Clear and fluent English (both verbal and written).
Ability to build and maintain efficient working relationships with remote teams.
Demonstrated ability to take ownership of and accountability for relevant products and services.
Ability to plan, prioritise and complete your own work, whilst remaining a team player.
Willingness to engage with and work in other technologies.

Note: This job description is a general outline of the responsibilities and qualifications typically associated with the Power BI role. Actual duties and qualifications may vary based on the specific needs of the organization.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodations for people with disabilities in accordance with provincial legislation. Please let us know if you require a reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because:
You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.
Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team, one of the largest IT and business consulting services firms in the world.
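To illustrate the "test for data accuracy against existing Tableau reports" responsibility above, here is a minimal sketch comparing aggregate exports from the two tools with pandas. The file names and the region/revenue columns are placeholders invented for the example.

```python
# Compare a Tableau export against a Power BI export on row counts and key aggregates.
# File names and columns (region, revenue) are hypothetical placeholders.
import pandas as pd

tableau = pd.read_csv("tableau_export.csv")
powerbi = pd.read_csv("powerbi_export.csv")

# Row-count parity is the cheapest first check.
assert len(tableau) == len(powerbi), "row counts differ between the two reports"

# Compare grouped sums within a small tolerance to absorb float rounding.
t_agg = tableau.groupby("region")["revenue"].sum().sort_index()
p_agg = powerbi.groupby("region")["revenue"].sum().sort_index()
diff = (t_agg - p_agg).abs()
mismatches = diff[diff > 0.01]
if mismatches.empty:
    print("aggregates match within tolerance")
else:
    print("mismatched regions:\n", mismatches)
```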
Posted 1 week ago
5.0 years
5 - 10 Lacs
Bengaluru
On-site
Country/Region: IN
Requisition ID: 26145
Location: INDIA - BENGALURU - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engineering

Area(s) of responsibility:
Job Title: Denodo Developer
No. of Open Positions: 1
Experience: 5-9 years
Location: Bangalore, Noida, Chennai, Mumbai, Hyderabad, Pune
Shift Time: CET (12:30 to 9:30 IST)

Job Description: We are seeking a highly skilled and experienced Denodo Developer with a strong background in ETL processes and deep knowledge of the Life Sciences domain. The ideal candidate will be responsible for developing data virtualization solutions, integrating complex datasets from multiple sources, and enabling real-time data access for analytics and operational reporting. This role requires close collaboration with data architects, data engineers, and business stakeholders in a regulated environment.

Key Proficiencies & Responsibilities:
Design, develop, and optimize data virtualization solutions using the Denodo Platform.
Integrate structured and unstructured data sources into Denodo views and services.
Develop custom views, VQL scripts, and data services (REST/SOAP).
Build and optimize ETL/ELT pipelines to support data ingestion and transformation.
Work closely with Life Sciences business teams to translate domain-specific requirements into data solutions.
Implement data governance, security, and compliance practices adhering to GxP and FDA regulations.
Provide support for data access, lineage, metadata management, and user training.
Collaborate with cross-functional teams in an Agile development environment.
Optimize workflows for performance and scalability.
Develop and maintain data documentation, including workflow descriptions and data dictionaries.
Strong knowledge of data preparation, ETL concepts, and data warehousing.
Excellent analytical, problem-solving, and communication skills.
Proficient in VQL, JDBC, ODBC, and web services integration.
Strong expertise in ETL tools (e.g., Informatica, Talend, DataStage, or Azure Data Factory).
Deep understanding of the Life Sciences domain: clinical trials, regulatory data, pharmacovigilance, or research & development.

Preferred Qualifications:
B.Tech. or MCA from a recognized university.
Minimum 5+ years of relevant experience as a Denodo Developer.
Strong SQL and database skills (Oracle, SQL Server, PostgreSQL, etc.).
Knowledge of data modelling, data warehousing, and virtual data layers.
Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
Experience working in Agile/Scrum environments.
Exposure to cloud platforms such as AWS, Azure, or GCP.
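For flavor, a rough sketch of consuming a Denodo-published REST data service from Python. The host, virtual database, view name, and credentials are placeholders, and the exact URL layout and response shape depend on how the data service is published, so treat this as an assumption-laden outline rather than the vendor's exact contract.

```python
# Query a (hypothetical) Denodo REST data service and print the returned rows.
import requests

# Placeholder URL: server / virtual database / published view.
BASE = "https://denodo.example.com/server/sales_vdb/customer_view/views/customer_view"
auth = ("report_user", "secret")  # placeholder credentials

resp = requests.get(BASE, params={"$format": "json", "$top": 100}, auth=auth, timeout=30)
resp.raise_for_status()
# "elements" as the row container is an assumption about the response shape.
for row in resp.json().get("elements", []):
    print(row)
```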
Posted 1 week ago
5.0 years
28 - 32 Lacs
Belgaum
On-site
Company Description
BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, is recognized with industry awards, and maintains a net promoter score that is 2x the industry average. BETSOL's open source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance. BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers. BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India. We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, 401K, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities. Learn more at betsol.com.

Job Description
Position Overview: We are seeking a highly skilled and experienced Data Architect with expertise in cloud-based solutions. The ideal candidate will design, implement, and optimize our data architecture to meet the organization's current and future needs. This role requires a strong background in data modeling, transformation, and governance, along with hands-on experience with modern cloud platforms and tools such as Snowflake, Spark, data lakes, and data warehouses. The successful candidate will also establish and enforce standards and guidelines across data platforms to ensure consistency, scalability, and best practices. Exceptional communication skills are essential to collaborate across cross-functional teams and stakeholders.

Key Responsibilities
Design and Implementation: Architect and implement scalable, secure, and high-performance cloud data platforms, integrating data lakes, data warehouses, and databases. Develop comprehensive data models to support analytics, reporting, and operational needs.
Data Integration and Transformation: Lead the design and execution of ETL/ELT pipelines using tools like Talend/Matillion, SQL, Big Data, Hadoop, AWS EMR, and Apache Spark to process and transform data efficiently. Integrate diverse data sources into cohesive and reusable datasets for business intelligence and machine learning purposes.
Standards and Guidelines: Establish, document, and enforce standards and guidelines for data architecture, data modeling, transformation, and governance across all data platforms. Ensure consistency and best practices in data storage, integration, and security throughout the organization.
Data Governance: Establish and enforce data governance standards, ensuring data quality, security, and compliance with regulatory requirements. Implement processes and tools to manage metadata, lineage, and data access controls.
Cloud Expertise: Utilize Snowflake for advanced analytics and data storage needs, ensuring optimized performance and cost efficiency. Leverage modern cloud platforms to manage data lakes and ensure seamless integration with other services.
Collaboration and Communication: Partner with business stakeholders, data engineers, and analysts to gather requirements and translate them into technical designs. Clearly communicate architectural decisions, trade-offs, and progress to both technical and non-technical audiences.
Continuous Improvement: Stay updated on emerging trends in cloud and data technologies, recommending innovations to enhance the organization's data capabilities. Optimize existing architectures to improve scalability, performance, and maintainability.

Qualifications
Technical Skills:
Strong expertise in data modeling (conceptual, logical, physical) and data architecture design principles.
Proficiency in Talend/Matillion, SQL, Big Data, Hadoop, AWS EMR, Apache Spark, Snowflake, and cloud-based data platforms.
Experience with data lakes, data warehouses, and relational and NoSQL databases.
Experience with relational (PostgreSQL/Oracle) and NoSQL (Couchbase/Cassandra) databases.
Solid understanding of data transformation techniques and ETL/ELT pipelines.
Proficiency in DevOps/DataOps/MLOps tools.
Standards and Governance:
Experience establishing and enforcing data platform standards, guidelines, and governance frameworks.
Proven ability to align data practices with business goals and regulatory compliance.
Communication: Exceptional written and verbal communication skills to interact effectively with technical teams and business stakeholders.
Experience: 5+ years of experience in data architecture, with a focus on cloud technologies, and a proven track record of delivering scalable, cloud-based data solutions.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications
Certification in Snowflake, AWS data services, any RDBMS/NoSQL, AI/ML, or Data Governance.
Familiarity with machine learning workflows and data pipelines.
Experience working in Agile development environments.

Job Type: Full-time
Pay: ₹2,852,815.46 - ₹3,289,062.57 per year
Benefits: Health insurance
Schedule: Day shift
Work Location: In person
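As a small illustration of the Spark-based transform-and-load work this role describes, here is a minimal PySpark sketch. The lake paths and columns (event_ts, amount, event_id) are invented for the example.

```python
# Read raw events, apply a simple transformation, and write a curated table.
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_events").getOrCreate()

raw = spark.read.parquet("s3://example-lake/raw/events/")
curated = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("amount") > 0)        # drop obviously bad rows
       .dropDuplicates(["event_id"])       # keep re-runs idempotent
)
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-lake/curated/events/"
)
```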
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in.

Job Description
REQUIREMENTS:
Total experience 5+ years.
Hands-on working experience in data engineering.
Strong working experience in SQL, Python or Scala.
Deep understanding of Cloud Design Patterns and their implementation.
Experience working with Snowflake as a data warehouse solution.
Experience with Power BI data integration.
Design, develop, and maintain scalable data pipelines and ETL processes.
Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms).
Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.).
Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar.
Strong problem-solving skills and a passion for continuous improvement.
Strong communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES:
Writing and reviewing great quality code.
Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
Mapping decisions with requirements and translating the same to developers.
Identifying different solutions and narrowing down the best option that meets the clients' requirements.
Defining guidelines and benchmarks for NFR considerations during project implementation.
Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
Understanding and relating technology integration scenarios and applying these learnings in projects.
Resolving issues raised during code review through exhaustive systematic analysis of the root cause, and being able to justify the decisions taken.
Carrying out POCs to make sure that suggested designs/technologies meet the requirements.

Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
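To make the Snowflake requirement above concrete, here is a minimal sketch using the official snowflake-connector-python package. The account locator, credentials, stage, and table names are placeholders; a production job would pull credentials from a secrets manager rather than hard-coding them.

```python
# Load staged files into a Snowflake table and check the resulting row count.
# Connection parameters and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls files already uploaded to the named internal stage.
    cur.execute(
        "COPY INTO ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("SELECT COUNT(*) FROM ORDERS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```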
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary: We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal incumbent will be responsible for collecting, processing, and analyzing large datasets to uncover insights that drive strategic decision-making. You will work closely with cross-functional teams to identify trends, create visualizations, and deliver actionable recommendations that support business goals.

Key Responsibilities
Drive business excellence by identifying opportunities for process optimization, automation, and standardization through data insights.
Design, develop, and maintain robust ETL pipelines and SQL queries to ingest, transform, and load data from diverse sources.
Build and maintain Excel-based dashboards, models, and reports; automate repetitive tasks using Excel macros, Power Query, or scripting tools.
Ensure data quality, integrity, and consistency through profiling, cleansing, validation, and regular monitoring.
Translate business questions into analytical problems and deliver actionable insights using statistical techniques and data visualization tools.
Collaborate with cross-functional teams (e.g., marketing, finance, operations) to define data requirements and address business challenges.
Develop and implement efficient data collection strategies and systems to optimize accuracy and performance.
Monitor and troubleshoot data workflows, resolving issues and ensuring compliance with data privacy and security regulations.
Document data processes, definitions, and business rules to support transparency, reuse, and continuous improvement.
Support continuous improvement initiatives by providing data-driven recommendations that enhance operational efficiency and decision-making.
Contribute to the development and implementation of best practices in data management, reporting, and analytics aligned with business goals.

Person Profile
Qualification: Bachelor's/Master's degree in Computer Science, Information Systems, Statistics, or a related field.
Experience: 2-5 years.

Desired Certification & Must-Haves:
3-5 years of experience in data analysis, preferably in the pharmaceutical industry.
Advanced proficiency in SQL (joins, CTEs, window functions, optimization) and expert-level Excel skills (pivot tables, advanced formulas, VBA/macros).
Strong understanding of data warehousing, relational databases, and ETL tools (e.g., SSIS, Talend, Informatica).
Proficiency in data visualization tools (e.g., Power BI, Tableau) and statistical analysis techniques.
Solid analytical and problem-solving skills with attention to detail and the ability to manage complex data sets and multiple priorities.
Excellent communication and documentation skills to convey insights to technical and non-technical stakeholders.
Familiarity with data modelling, database management, and large-scale data manipulation and cleansing.
Demonstrated ability to work collaboratively in Agile/Scrum environments and adapt to evolving business needs.
Strong focus on process optimization, continuous improvement, and operational efficiency.
Experience in implementing best practices for data governance, quality assurance, and compliance.
Ability to identify and drive initiatives that enhance business performance through data-driven decision-making.
Exposure to business domains such as finance, operations, or marketing analytics with a strategic mindset.
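To make the SQL expectations above (CTEs, window functions) concrete, here is a small self-contained sketch run through Python's built-in sqlite3 module. It assumes a Python build bundling SQLite 3.25+ (needed for window functions); the table and sample rows are invented.

```python
# Rank each region's orders by revenue using a CTE and a window function.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, revenue REAL);
    INSERT INTO orders VALUES (1,'north',120.0),(2,'north',80.0),(3,'south',200.0);
""")
rows = conn.execute("""
    WITH ranked AS (
        SELECT region, revenue,
               RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
        FROM orders
    )
    SELECT region, revenue FROM ranked WHERE rnk = 1
""").fetchall()
print(rows)  # top order per region, e.g. [('north', 120.0), ('south', 200.0)]
```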
Posted 1 week ago
3.0 - 7.0 years
0 - 1 Lacs
Pune, Ahmedabad, Bengaluru
Work from Office
Job Title: Reltio MDM Developer
Location: Remote
Experience Required: 2+ years

Key Responsibilities:
Design, configure, and implement Reltio MDM solutions based on business and technical requirements.
Develop and enhance Reltio data models including entities, attributes, relationships, and match/merge rules.
Configure survivorship rules, reference data, workflows, and validation rules within the platform.
Build seamless integrations between Reltio and external systems using REST APIs, ETL tools (e.g., Informatica, Talend), or middleware solutions (e.g., MuleSoft).
Monitor, troubleshoot, and optimize data load and synchronization processes.
Support data governance initiatives, including data quality profiling, standardization, and issue resolution.
Collaborate with business stakeholders, data stewards, and analysts to refine requirements and address data integrity concerns.
Ensure performance tuning and adherence to Reltio best practices for configuration and deployment.

Required Skills:
Minimum 2+ years of hands-on experience working with the Reltio Cloud MDM platform.
Strong grasp of MDM principles, data modeling concepts, and entity relationship management.
Experience configuring Reltio L3, match/merge logic, and survivorship strategies.
Proficiency with REST APIs, JSON, and XML for integration and data exchange.
Working experience with integration tools like Talend, Informatica, or MuleSoft.
Solid debugging and troubleshooting skills related to data quality, transformations, and API communication.
Familiarity with data governance frameworks and compliance standards.

Nice to Have:
Experience in implementing Reltio UI configurations or custom UI components.
Exposure to data analytics and reporting tools.
Knowledge of cloud platforms (e.g., AWS, Azure) for hosting or extending MDM functionality.
Familiarity with Agile methodologies and tools like JIRA or Confluence.
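For context on the REST/JSON integration skills above, here is a rough sketch of pushing an entity to an MDM tenant's entities endpoint of the general shape Reltio exposes. The tenant URL, token handling, entity type, and payload are simplified placeholders, not the vendor's exact contract; consult the platform's API documentation for the real details.

```python
# Post a customer entity to a (hypothetical) MDM tenant's entities endpoint.
import json

import requests

TENANT_URL = "https://example.reltio.com/reltio/api/myTenant"  # placeholder
headers = {
    "Authorization": "Bearer <access-token>",  # token acquisition omitted
    "Content-Type": "application/json",
}
# Payload shape (type + attribute value arrays) is an assumption for illustration.
entity = [{
    "type": "configuration/entityTypes/Customer",
    "attributes": {"FirstName": [{"value": "Asha"}], "LastName": [{"value": "Rao"}]},
}]
resp = requests.post(
    f"{TENANT_URL}/entities", headers=headers, data=json.dumps(entity), timeout=30
)
resp.raise_for_status()
print(resp.json())
```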
Posted 1 week ago
30.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Position Overview

ABOUT APOLLO
Apollo is a high-growth, global alternative asset manager. In our asset management business, we seek to provide our clients excess return at every point along the risk-reward spectrum from investment grade to private equity, with a focus on three investing strategies: yield, hybrid, and equity. For more than three decades, our investing expertise across our fully integrated platform has served the financial return needs of our clients and provided businesses with innovative capital solutions for growth. Through Athene, our retirement services business, we specialize in helping clients achieve financial security by providing a suite of retirement savings products and acting as a solutions provider to institutions. Our patient, creative, and knowledgeable approach to investing aligns our clients, businesses we invest in, our employees, and the communities we impact, to expand opportunity and achieve positive outcomes.

OUR PURPOSE AND CORE VALUES
Our clients rely on our investment acumen to help secure their future. We must never lose our focus and determination to be the best investors and most trusted partners on their behalf. We strive to be:
The leading provider of retirement income solutions to institutions, companies, and individuals.
The leading provider of capital solutions to companies. Our breadth and scale enable us to deliver capital for even the largest projects, and our small-firm mindset ensures we will be a thoughtful and dedicated partner to these organizations. We are committed to helping them build stronger businesses.
A leading contributor to addressing some of the biggest issues facing the world today, such as energy transition, accelerating the adoption of new technologies, and social impact, where innovative approaches to investing can make a positive difference.

We are building a unique firm of extraordinary colleagues who:
Outperform expectations
Challenge convention
Champion opportunity
Lead responsibly
Drive collaboration
As One Apollo team, we believe that doing great work and having fun go hand in hand, and we are proud of what we can achieve together.

Our Benefits
Apollo relies on its people to keep it a leader in alternative investment management, and the firm's benefit programs are crafted to offer meaningful coverage for both you and your family. Please reach out to your Human Capital Business Partner for more detailed information on specific benefits.

Position Overview
At Apollo, we're a global team of alternative investment managers passionate about delivering uncommon value to our investors and shareholders. With over 30 years of proven expertise across Private Equity, Credit and Real Estate, regions and industries, we're known for our integrated businesses, our strong investment performance, our value-oriented philosophy, and our people.

The Client and Innovation Engineering team is responsible for designing and delivering digital products to our institutional and wealth management clients and sales team. We are a product-driven and developer-focused team; our goal is to simplify our engineering process and meet our business objectives. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

Primary Responsibilities
Apollo is seeking a hands-on, business-oriented Lead Data Engineer to lead the technology efforts focused on supporting data-driven distribution processes.
The ideal candidate will bring strong experience in data engineering within asset and/or wealth management, combined with excellent technical acumen and a passion for building scalable, secure, and high-performance data solutions. This role will partner closely with Distribution Data Enablement, Sales & Marketing, Operations, and Finance teams to execute key initiatives aligned with Apollo's target operating model. You will play a critical role in building and evolving our data products and infrastructure. You will learn new technologies, constantly upgrading your skill set and the products you work on to stay at par with the best in the industry. You will innovate and solve technical challenges with a long-term vision.

Design, build, and maintain scalable and efficient cloud-based data pipelines and integration workflows using Azure Data Factory (ADF), dbt, Snowflake, Fivetran, and related tools.
Collaborate closely with business stakeholders to understand data needs and translate them into effective technical solutions, including developing relational and dimensional data models.
Implement and optimize end-to-end ETL/ELT processes to support enterprise data needs.
Design and implement pipeline controls, conduct data quality assessments, and enforce data governance best practices to ensure accuracy and integrity.
Monitor, troubleshoot, and resolve issues across data pipelines to ensure stability, reliability, and performance.
Partner with cross-functional teams to support analytics, reporting, and operational data needs.
Stay current with industry trends and emerging technologies to continuously improve our data architecture.
Support master data management (MDM) initiatives and contribute to overall data strategy and architecture.

Qualifications & Experience
8+ years of professional experience in data engineering or a related field, ideally within financial services or asset/wealth management.
Proven expertise in Azure-based data engineering tools including ADF, dbt, Snowflake, and Fivetran.
Programming skills in Python (or Scala/Java) for data transformation and automation.
Solid understanding of modern data modeling (relational, dimensional, and star schema).
Experience with MDM platforms and frameworks is highly desirable.
Familiarity with additional ETL/ELT tools (e.g., Talend, Informatica, SSIS) is a plus.
Comfortable working in a fast-paced, agile environment with rapidly changing priorities.
Strong communication skills, with the ability to translate complex technical topics into business-friendly language.
A degree in Computer Science, Engineering, or a related field is preferred.
A strong analytical mindset with a passion for solving complex problems.
A team player who is proactive, accountable, and detail-oriented.
A leader who sets high standards and delivers high-quality outcomes.
An innovator who keeps up with industry trends and continually seeks opportunities to improve.
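As a flavor of the "pipeline controls and data quality assessments" called out above, here is a minimal post-load reconciliation sketch. The table names and counts are invented; in practice the counts would come from queries against the source and target systems.

```python
# Post-load control: compare source and target row counts and fail loudly on drift.
from dataclasses import dataclass


@dataclass
class LoadCheck:
    table: str
    source_rows: int
    target_rows: int

    @property
    def ok(self) -> bool:
        return self.source_rows == self.target_rows


def reconcile(checks: list[LoadCheck]) -> None:
    """Print any drifting tables, then raise so the orchestrator marks the run failed."""
    failures = [c for c in checks if not c.ok]
    for c in failures:
        print(f"DRIFT {c.table}: source={c.source_rows} target={c.target_rows}")
    if failures:
        raise RuntimeError(f"{len(failures)} table(s) failed reconciliation")


# Placeholder counts; real values would come from source/target queries.
reconcile([LoadCheck("orders", 10_000, 10_000), LoadCheck("accounts", 5_120, 5_120)])
```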
Posted 1 week ago
3.0 years
0 Lacs
Andhra Pradesh, India
On-site
Key Responsibilities
Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends.
Work closely with Data Engineering teams to onboard new pipelines and ensure observability best practices.
Integrate Datadog with tools.
Conduct root cause analysis of ETL failures and performance bottlenecks.
Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
Document incident handling procedures and contribute to improving overall ETL monitoring maturity.
Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
Familiarity with SQL and querying large datasets.
Experience working with Python, shell scripting, or Bash for automation and log parsing.
Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
Experience with distributed tracing and APM in Datadog.
Prior experience monitoring Spark, Kafka, or streaming pipelines.
Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
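To ground the ETL-observability work this posting describes, here is a small sketch emitting custom job metrics with the official datadog Python client (datadogpy). The API/app keys, metric names, and tags are placeholders; real monitors would then alert on these series.

```python
# Report an ETL job's duration and outcome to Datadog as custom metrics.
# Keys, metric names, and tags are placeholders for the example.
import time

from datadog import initialize, api

initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")

start = time.time()
job_ok = True  # stand-in for the real job's result
duration = time.time() - start

now = int(time.time())
api.Metric.send(
    metric="etl.orders_load.duration_seconds",
    points=[(now, duration)],
    tags=["pipeline:orders", "env:prod"],
)
api.Metric.send(
    metric="etl.orders_load.success",
    points=[(now, 1 if job_ok else 0)],
    tags=["pipeline:orders", "env:prod"],
)
```

A Datadog monitor on `etl.orders_load.success` dropping below 1, or on duration breaching a threshold, then covers the "job status and job duration" alerting described above.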
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.
Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
Work in close partnership with the business leadership team to execute the analytics agenda.
Identify and incubate best-in-class external partners to drive delivery on strategic projects.
Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance.
Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver.

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
Using data analysis to make recommendations to senior leaders.
Technical experience in roles in best-in-class analytics practices.
Experience deploying new analytical approaches in a complex and highly matrixed organization.
Savvy in usage of analytics techniques to create business impacts.

Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It Possible.
As the Data & Analytics Manager for the Consumer tower, you will be involved in driving the Data & Analytics strategic vision & roadmap, building momentum by rallying the rest of the organization, implementing identified data & analytics priorities to deliver strong business value across all levels of the organization at the right cost structure, and leading the development of a cutting-edge solution that aims to bring BI and AI to the marketing space.

How You Will Contribute
You will:
Be responsible for analytics engagement with the Marketing function at MDLZ and support the Consumer D&A Lead in driving the D&A agenda and roadmap for the market by collaborating closely with senior business stakeholders to deliver strong business value across functions in the organization.
Consult, influence, and collaborate with business stakeholders to craft analytics methodologies and solutions relevant to their needs and use-cases, applying techno-functional expertise in data and AI as part of the solution-forming process.
Oversee the day-to-day technical development of the solution, creating strong alignment in the working team to achieve goals.
Collaborate across Business and MDS functions to build and develop demands by maximizing our D&A resources across Data Management, Analytics Products and Data Science. This requires strong collaboration and influencing skills to drive adoption, relevancy, and business impact with speed. Minimizing complexity, establishing the right ways of working to accelerate the path to value by being choiceful and creative, as well as having a growth mindset every day, will be essential.
Validate the weekly progress of the technical teams and lead business user tests, ensuring product quality before the product gets into the hands of the internal customer.
Create value from business-driven data and analytics initiatives at scale. An important task for the Data & Analytics Manager is to support the business and other stakeholders in solving their business problems via relevant data & analytics. This means they will support the business during various stages. In the inspiration phase they help the business identify the right use cases and support prioritization based on feasibility and value.
In the ideation phase they help with the development of a minimum viable product and business case. During the implementation phase they make sure the service or product is adopted (by the employees), embedded in the workflow (process) and measured for impact.
Help uncover and translate business requirements and stakeholder needs. This translation needs to be done in such a way that the technical specialists in the D&A team can understand (Data Management, Analytics Products, Partners and/or Data Science resources). This requires an understanding of both the business objectives, goals, and domain expertise, as well as data, analytics and technology concepts, methods, and techniques. It also requires strong soft skills with a focus on communication.
The role will lead analytics delivery for Marketing initiatives & BI development (reports, dashboards, visualization) and/or data governance (stewardship best practices).

Data & Analytics Skills
Must have a good understanding of the concepts, methods and techniques used:
Analytics, for example diagnostic, descriptive, predictive and prescriptive.
AI, for example machine learning, natural language processing.
Data management, for example data integration (ETL) or metadata.
Data architecture, for example the difference between a data warehouse, data lake or data hub.
Data modelling, for creation of the right reusable data assets.
Data governance, for example MDM, data quality and data stewardship practices.
Statistical skills, for example understanding the difference between correlation and causation.

Technology Skills
Good understanding of the tools and technologies in the D&A team:
Programming languages like SQL, Python or R and notebooks like RStudio or Jupyter.
Data integration tools like Informatica or Talend.
Analytics and Business Intelligence tools like Microsoft Power BI or Tableau.

Soft Skills
Leadership with a high level of self-initiative and drive, for example leading the discussions on the D&A agenda in the BU and building a combined vision across multiple stakeholders.
Communication, for example conveying information to diverse audiences in a way that is easily understood and actionable.
Facilitation and conflict resolution, for example hosting sessions to elicit ideas from others, understand their issues and encourage group participation.
Creative thinking and being comfortable with unknown or unchartered territories, for example framing new concepts for business teams and brainstorming with business users about future products and services.
Teamwork, for example working with both business domain teams as well as D&A teams and MDS stakeholders.
Collaboration, for example fostering group problem solving and solution creation with business and technical team members.
Relationship management, for example creating relationships and building trust with internal and external stakeholders quickly.
Storytelling, for example by creating a consistent, clear storyline for better understanding.
Influencing, for example by asserting ideas and persuading others to gain support across an organization or to adopt new behaviors.

Domain Skills
Must have a good understanding of the business process and associated data:
Business acumen, for example understanding business concepts, practices and business domain language to engage in problem-solving sessions and discuss business issues in stakeholder language.
Relevant experience in Data and Analytics; CPG or FMCG experience is preferred.
Business process transformation, for example the ability to understand how D&A can help redesign the way work is done.
Business data, for example Nielsen/Circana or other EPOS/retail sales data sources; Kantar/GfK or other household panel sources.

Other Skills
Agility and a growth mindset will be crucial.
Project management capabilities, including the ability to manage risks, for example understanding of project management concepts to organize their own work and the ability to collaborate with project managers to align business expectations with the D&A team's delivery capabilities.
Vendor negotiation and effort estimation skills, for example to manage the right partner skills at the right cost based on the complexity and importance of the initiatives to be delivered or supported.
Business case development, for example to help develop support for experimenting with selected use cases or to measure the impact/business value created. For this the role can collaborate with business analysts in sales, marketing, RGM, finance or supply chain.
Decision modelling, for example supporting decision makers and improving complicated decisions that involve trade-offs among alternative courses of action by using decision-problem models.
UX/design, for example creating products and visualizations that are easy to work with and support the activities required by the end users.

Within Country Relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
Posted 1 week ago
8.0 - 10.0 years
30 - 40 Lacs
Noida
Work from Office
Role & Responsibilities
Collaborate with customers' Business and IT teams to define and gather solution requirements for custom development, B2B/ETL/EAI, and cloud integration initiatives using the Adeptia Integration Platform.
Analyze, interpret, and translate customer business needs into scalable and maintainable technical solution designs, aligned with best practices and the capabilities of the Adeptia platform.
Storyboard and present solutions to customers and prospects, ensuring a clear understanding of proposed designs and technical workflows.
Provide end-to-end project leadership, including planning, tracking deliverables, and coordinating efforts with offshore development teams as required.
Review implementation designs and provide architectural guidance and best practices to the implementation team to ensure high-quality execution.
Actively assist and mentor customers in configuring and implementing the Adeptia platform, ensuring alignment with technical and business objectives.
Offer expert recommendations on design and configuration to ensure successful deployment and long-term maintainability of customer solutions.
Define clear project requirements, create work breakdown structures, and establish realistic delivery timelines.
Delegate tasks effectively, and manage progress against daily, weekly, and monthly targets, ensuring the team remains focused and productive.
Serve as a liaison among customers, internal stakeholders, and offshore teams to maintain alignment, track progress, and ensure delivery meets both quality and timeline expectations.
Monitor project baselines, identify and mitigate risks, and lead participation in all Agile ceremonies, including sprint grooming, planning, reviews, and retrospectives.
Maintain a hands-on technical role, contributing to development activities and conducting detailed code reviews to ensure technical soundness and optimal performance.
Take full ownership of assigned projects, driving them to successful, on-time delivery with high quality standards.

Preferred Candidate Profile

Technical
Proven experience in designing and developing integration solutions involving Cloud/SaaS applications, APIs, SDKs, and legacy systems.
Skilled in implementing SOA/EAI principles and integration patterns in B2B, ETL, EAI, and cloud integration using platforms such as Adeptia, Talend, MuleSoft or similar tools.
Good hands-on experience with Core Java (version 8+) and widely used Java frameworks including Spring (version 6+) and Hibernate (version 6+).
Proficient in SOA, RESTful and SOAP web services and related technologies including JMS, SAAJ, JAXP, and XML technologies (XSD, XPath, XSLT, parsing).
Strong command of SQL and RDBMS (e.g., Oracle, MySQL, PostgreSQL).
Solid understanding of Enterprise Service Bus (ESB) concepts and messaging technologies such as Kafka and RabbitMQ.
Familiar with transport protocols including HTTPS, Secure FTP, POP/IMAP/SMTP, and JDBC.
Skilled in working with Windows and Linux operating systems, and experienced with application servers such as JBoss, Jetty, and Tomcat.
Solid understanding of security best practices, including authentication, authorization, data encryption, and compliance frameworks relevant to enterprise integrations.
Basic understanding of modern JavaScript frameworks such as React, with the ability to collaborate effectively on front-end and full-stack development scenarios.

Non-Technical
Strong communication and interpersonal skills with over 5 years of direct client-facing experience. Adept at gathering, clarifying, and understanding business requirements and translating them into actionable technical specifications. Skilled in aligning cross-functional teams to ensure effective execution and stakeholder satisfaction.
Over 4 years of hands-on experience with Agile methodologies, including participation in Daily Standups, Product Backlog Refinement, Sprint Planning, Retrospectives, Storyboarding, and User Story writing. Proven ability to deliver high-quality results in fast-paced and iterative environments.
More than 4 years of experience as a Team Lead, currently managing a team. Responsible for task delegation, delivery tracking, performance feedback, and fostering team collaboration across time zones.
Experienced in creating comprehensive Requirement Documents, Architecture Design Documents, and both High-Level and Low-Level Design specifications to support technical teams and business stakeholders.
Demonstrated success in remote/distributed client environments, delivering under tight timelines and high-pressure scenarios. Known for a proactive mindset and solution-oriented approach to project challenges.
Highly disciplined and committed to maintaining quality standards, with a focus on continuous testing, adherence to best practices, and application of standard design patterns.
Exceptional attention to detail, paired with strong analytical, troubleshooting, and problem-solving skills that contribute to robust and reliable solution delivery.
Fluent in English with the ability to clearly articulate thoughts, present ideas, and create well-structured documentation and visual design artifacts.
Experienced in creating work schedules, providing performance feedback, and mentoring junior team members. Skilled in training new resources and managing resource allocation to optimize team productivity and performance.

Good-to-Have Skills

Technical
Familiarity with widely used integration standards such as EDI X12, EDIFACT, ebXML, iDoc, BAPI, etc., for B2B and enterprise system communication.
Experience in implementing security features including API security, authentication and authorization mechanisms, and message- and transport-level encryption.
Understanding and experience with various Adeptia security models, including Native, LDAP, SAML, OAuth, IDP, Multiple IDP, Kerberos, encryption/decryption techniques, EAR packaging, KeyTools, digital signatures, and encoding/decoding mechanisms.
Hands-on experience with design and prototyping tools such as Mockups, Draw.io, and Enterprise Architect to create process diagrams, solution designs, and documentation.
Working knowledge of DevOps tools and practices, including Continuous Integration and Continuous Deployment (CI/CD) pipelines. Familiar with tools such as Git/GitHub, Maven, and Jenkins for version control, dependency management, and automated builds.
In-depth experience with various deployment models, including clustered and non-clustered environments, multi-zone deployments, disaster recovery (DR) setups, DMZ configurations, load balancers, and cloud-based architectures.

Non-Technical
Strong understanding of creating and maintaining development and quality assurance processes to ensure high standards in project delivery.
Knowledge of implementing high-availability solutions, including the design and management of real-time and batch processing systems.
Self-driven, adaptable, and highly responsible, with a strong focus on delivering quality outcomes while managing shifting priorities.
Skilled at working closely with cross-functional teams to drive project success, ensuring clear communication and effective prioritization under tight deadlines.
Adept at balancing multiple tasks and working effectively under pressure, with the ability to prioritize tasks and deliver results within strict timelines.
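Although the stack above is Java-centric, the B2B/REST delivery pattern it describes is easy to sketch. Here a minimal Python example posts an order document to a hypothetical partner endpoint with simple retry and backoff, standing in for the kind of flow an integration platform would orchestrate; the URL and payload are invented.

```python
# Post a B2B document to a (hypothetical) partner REST endpoint with basic retry.
import time

import requests

ENDPOINT = "https://partner.example.com/api/orders"  # placeholder URL
payload = {"orderId": "PO-1001", "lines": [{"sku": "ABC", "qty": 5}]}

for attempt in range(3):
    try:
        resp = requests.post(ENDPOINT, json=payload, timeout=10)
        resp.raise_for_status()
        print("accepted:", resp.json())
        break
    except requests.RequestException as exc:
        print(f"attempt {attempt + 1} failed: {exc}")
        time.sleep(2 ** attempt)  # simple exponential backoff
else:
    raise RuntimeError("delivery failed after 3 attempts")
```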
Posted 1 week ago
Upload Resume
Drag or click to upload
Your data is secure with us, protected by advanced encryption.
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
Talend roles are typically concentrated in India's major IT hubs, cities with a high concentration of IT companies and organizations that frequently hire for these roles.
The average salary range for Talend professionals in India varies based on experience level:
Entry-level: INR 4-6 lakhs per annum
Mid-level: INR 8-12 lakhs per annum
Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
Junior Developer
Developer
Senior Developer
Tech Lead
Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas (a small sketch of these fundamentals follows this list):
Data Warehousing
ETL (Extract, Transform, Load) processes
SQL
Big Data technologies (e.g., Hadoop, Spark)
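As a hedged illustration of the ETL and SQL fundamentals listed above, here is a tiny self-contained extract-transform-load example using only Python's standard library; the inline data and table are invented for the example.

```python
# Minimal ETL: extract rows from a CSV source, transform, load into SQLite.
import csv
import io
import sqlite3

RAW = "id,city,amount\n1,Pune,100\n2,Chennai,-5\n3,Pune,40\n"  # stand-in source

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, city TEXT, amount REAL)")

# Extract + transform: parse the CSV and skip invalid rows (negative amounts).
rows = [
    (int(r["id"]), r["city"], float(r["amount"]))
    for r in csv.DictReader(io.StringIO(RAW))
    if float(r["amount"]) >= 0
]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

print(conn.execute("SELECT city, SUM(amount) FROM sales GROUP BY city").fetchall())
# -> [('Pune', 140.0)]  (the Chennai row was filtered out during transform)
```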
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!