4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary

Pfizer’s purpose is to deliver breakthroughs that change patients’ lives. Research and Development is at the heart of fulfilling Pfizer’s purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy, or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world.

Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes.

Role Responsibilities

Lead data modeling and engineering efforts within advanced data platform teams to achieve digital outcomes; provide guidance and lead or co-lead moderately complex projects.
Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes.
Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs.
Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise.
Collaborate effectively with contractors to deliver technical enhancements.
Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency.
Conduct root cause analysis and address production data issues.
Lead the design, development, and implementation of AI models and algorithms to support sophisticated data analytics and supply chain initiatives.
Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects.
Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives.
Document and present findings, methodologies, and project outcomes to various stakeholders.
Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery.
Work with large and complex datasets, including data cleaning, preprocessing, and feature selection.

Basic Qualifications

A bachelor's or master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline.
Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations.
Over 2 years of experience in AI, machine learning, and large language model (LLM) development and deployment.
A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
Strong understanding of data structures, algorithms, and software design principles.
Programming Languages: Proficiency in Python and SQL, and familiarity with Java or Scala.
AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect (a minimal pipeline sketch appears after this listing).
Ability to use GenAI or agents to augment data engineering practices.

Preferred Qualifications

Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing.
Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Containerization: Understanding of Docker and Kubernetes for containerization and orchestration.
Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files.
Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune.
Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets.
Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch.
Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship.

Non-standard Work Schedule, Travel or Environment Requirements: Occasional travel required.

Work Location Assignment: Hybrid

The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 12.5% of the base salary, and is eligible to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution; paid vacation, holiday, and personal days; paid caregiver/parental and medical leave; and health benefits that include medical, prescription drug, dental, and vision coverage. Learn more at the Pfizer Candidate Site – U.S. Benefits (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.

Sunshine Act

Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address, and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address, and the amount of payments made will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative.
EEO & Employment Eligibility

Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States.

Information & Business Tech
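To make the pipeline-automation requirement named in the qualifications above concrete, here is a minimal sketch of a daily extract-transform-load job expressed as an Apache Airflow DAG. The DAG name, task logic, and data are hypothetical placeholders, not anything specified by the posting; it assumes an Airflow 2.x installation.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical source: in practice this might pull from an API or a database.
    return [{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 5}]


def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    # Keep only rows with a positive quantity.
    return [r for r in rows if r["qty"] > 0]


def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows into the warehouse here.")


with DAG(
    dag_id="supply_chain_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```

The `>>` operators declare task ordering, which is the core of what such orchestration tools add over a plain script: retries, scheduling, and monitoring per task.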
Posted 1 month ago
3.0 years
0 Lacs
India
On-site
Experience: 3 to 5 years of experience in ETL development and data engineering
Job Location: Noida, India

Position Overview: We are looking for a Data Engineer specializing in ETL processes to design and implement data pipelines that support data integration and analytics initiatives.

Key Responsibilities:

Design, develop, and maintain ETL pipelines to extract, transform, and load data from various sources (a minimal sketch appears after this listing).
Collaborate with data analysts and stakeholders to understand data requirements.
Ensure data quality and integrity through validation and cleansing processes.
Optimize ETL processes for performance and scalability.
Document data workflows and maintain metadata repositories.

Qualifications:

Bachelor’s degree in Computer Science, Information Systems, or a related field.
3+ years of experience in ETL development and data engineering.
Proficiency in SQL and experience with ETL tools like Informatica, Talend, or SSIS.
Familiarity with data warehousing concepts and cloud platforms such as AWS or Azure.
Strong problem-solving skills and attention to detail.

If you’re passionate about customer success and eager to make an impact in a dynamic organization, we’d love to hear from you! We are an equal opportunity employer.

Demand AI was formed through the strategic merger of Consilior AI and B2B Matters, bringing together advanced AI-driven data solutions and high-performance demand generation expertise under one unified brand. At the core of Demand AI is a uniquely skilled team with deep roots in the lead generation industry. Before joining forces at Consilior AI and B2B Matters, our founding team worked together at Selling Simplified Group, where they built enduring client relationships and consistently delivered measurable growth. This shared history gives us a deep understanding of what clients need and how to deliver it with precision. Our solutions are powered by a proprietary, first-party global database of over 73 million contacts, enabling us to provide smarter, faster, and more connected customer acquisition for modern sales and marketing teams. We believe in fostering a diverse, inclusive, and collaborative workplace. If you’re passionate about transforming how businesses grow through intelligent data and AI, we’d love to hear from you. Learn more at https://demandai.net

How to Apply: Please send your resume, portfolio (if available), and a brief cover letter outlining your experience to jobs-dev@demandai.co or apply through our careers page at https://demandai.net/career/
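As an illustration of the extract-transform-load responsibility described above, here is a minimal, self-contained sketch using only the Python standard library. The CSV file name, table name, and cleansing rule are hypothetical.

```python
import csv
import sqlite3


def extract(path):
    # Read raw rows from a source CSV file (hypothetical layout: id, amount).
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows):
    # Cleansing step: drop rows with a missing or non-numeric amount.
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), float(row["amount"])))
        except (KeyError, ValueError):
            continue  # in a real pipeline, route rejects to an error table
    return clean


def load(rows, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS fact_amounts (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO fact_amounts VALUES (?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract("source.csv")))
```

In production the same three stages would typically target a warehouse rather than SQLite and be wrapped in an orchestrator for scheduling and retries.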
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
I am thrilled to share an exciting opportunity with one of our esteemed clients! 🚀 Join me in exploring new horizons and unlocking potential, if you're ready for a challenge and for growth.

Experience: 7 years
Location: Chennai/Hyderabad/Coimbatore
Notice period: Immediate to 30 days

Job Description:

Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
4+ years of in-depth data engineering experience, with at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
Strong experience with cloud platforms (preference for Azure) and their data services.
Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
Hands-on experience with scripting languages like Python for data processing.
Snowflake SnowPro certification; preference for the engineering course path.
Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
Familiarity with BI and visualization tools such as Power BI.

Regards,
R Usha
usha@livecjobs.com
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities

We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.

Preferred Education: Master's Degree

Required Technical and Professional Expertise

4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics.
Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web.
Proven experience in building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts).
Proficiency in Python programming for automation, data processing, and system integration.
Experience with REST APIs, JSON/XML data formats, and data extraction from external sources (a minimal sketch appears after this listing).

Preferred Technical and Professional Experience

Strong SQL knowledge and ability to work with relational databases.
Familiarity with Agile methodologies and version control systems (e.g., Git).
Excellent analytical, problem-solving, and communication skills.
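To illustrate the kind of REST-based data extraction this role calls for, here is a minimal sketch using the requests library. The endpoint URL, authentication scheme, and JSON payload shape are hypothetical placeholders, not the actual TM1/Planning Analytics API.

```python
import csv

import requests  # third-party: pip install requests


def extract_to_csv(url, token, out_path):
    # Pull JSON rows from a (hypothetical) REST endpoint...
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    rows = resp.json()["value"]  # hypothetical payload shape

    # ...and land them as CSV for a downstream TI process to pick up.
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted(rows[0]))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    extract_to_csv("https://example.com/api/v1/sales", "TOKEN", "sales.csv")
```

Landing extracts as flat files that a TurboIntegrator process then imports is one common integration pattern; direct loads through an API client are another.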
Posted 1 month ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Lead Data Engineer – C12 / Assistant Vice President (India)

The Role

The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities

Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models

Required Qualifications & Work Experience

First Class Degree in Engineering/Technology (4-year graduate course)
8 to 12 years’ experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)

ETL: Hands-on experience of building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica (a minimal Spark sketch appears after this listing)
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Expertise around data warehousing concepts, and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance

Technical Skills (Valuable)

Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Experience of using a job scheduler, e.g., Autosys.
Exposure to Business Intelligence tools, e.g., Tableau, Power BI. Certification on any one or more of the above topics would be an advantage.

------------------------------------------------------

Job Family Group: Technology

------------------------------------------------------

Job Family: Digital Software Engineering

------------------------------------------------------

Time Type: Full time

------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
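For illustration of the Spark-based pipeline building named in the must-have skills above, here is a minimal PySpark sketch. The input path, column names, and filter rule are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades_pipeline").getOrCreate()

# Read raw trade records (hypothetical Parquet landing zone).
raw = spark.read.parquet("/data/landing/trades")

# Basic transformation: keep valid rows and derive a business date column.
clean = (
    raw.filter(F.col("notional") > 0)
       .withColumn("trade_date", F.to_date("trade_ts"))
)

# Write an analytical model partitioned for downstream consumers.
clean.write.mode("overwrite").partitionBy("trade_date").parquet("/data/curated/trades")

spark.stop()
```

Partitioning the curated output by a query-friendly column such as the business date is a common design choice, since it lets downstream readers prune files by date.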
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology -> Life Sciences -> LIMS

Experience in developing instrument drivers using SDMS/Talend/Java is good to have. At least 5 years of experience in the software development life cycle. At least 5 years of experience in project life cycle activities on development and maintenance projects. At least 5 years of experience in design and architecture review. Good understanding of the sample management domain and exposure to life sciences projects. Ability to work in a team in a diverse, multi-stakeholder environment. Analytical skills. Very good communication skills.

At least 4-8 years of experience in LIMS (LV/LW) implementation, configuration, and customization using Java and JavaScript, including integration with lab applications; should have implemented at least 2-3 projects in a role involving development on the LabVantage platform and the Jasper/iReport/Java reporting tool. Interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation and translation into functional and/or design documentation for the LabVantage LIMS solution, application architecture definition and design, development, validation and release.

Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.
Posted 1 month ago
0.0 - 2.0 years
0 Lacs
Pune
On-site
The Applications Development Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:

Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
Identify and analyze issues, make recommendations, and implement solutions
Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
Analyze information and make evaluative judgements to recommend solutions and improvements
Conduct testing and debugging, utilize script tools, and write basic code for design specifications
Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:

0-2 years of relevant experience
Experience in programming/debugging used in business applications
Working knowledge of industry practice and standards
Comprehensive knowledge of the specific business area for application development
Working knowledge of program languages
Consistently demonstrates clear and concise written and verbal communication

Education:

Bachelor’s degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Essential:

Experience primarily with SAP Business Objects and Talend ETL
Experience with any one other Business Intelligence tool like Tableau/Cognos, ETL tools such as Ab Initio/Spark, and Unix shell scripting
Experience in RDBMS, preferably Oracle, with SQL query-writing skills
Good understanding of data-warehousing concepts like schemas and facts/dimensions
Should be able to understand and modify complex universe queries, and design and use the functionalities of the Web Intelligence tool
Familiarity with identification and resolution of data quality issues
Strong and effective interpersonal and communication skills and the ability to interact professionally with business users
Great team player with a passion for collaborating with colleagues
Knowledge of any application server (WebLogic, WAS, Tomcat, etc.)

Adjacent Skills:

Apache Spark with Java
Good understanding of big data and the Hadoop ecosystem
Good understanding of Hive and Impala
Testing frameworks (test-driven development)
Good communication skills
Knowledge of Maven; Python scripting skills
Good problem-solving skills

Beneficial: EMS, Kafka, domain knowledge

- Job Family Group: Technology
- Job Family: Applications Development
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
- Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
0 years
2 - 7 Lacs
Bengaluru
On-site
QA Tester to ensure the accuracy and functionality of ETL jobs migrated from Talend to Python and of SQL queries converted to Snowflake. The role involves validating data integrity, performance, and the seamless transition of ETL processes, working closely with developers skilled in Python and SQL (a minimal reconciliation sketch appears after this listing).

Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly. Perform all testing activities for initiatives across one or more assigned projects, utilizing processes, methods, metrics and software that ensure quality, reliability, and systems safety and security, including Hoverfly component and contract testing and embedded Cassandra component testing on multiple repositories.

Test strategy formulation will include decomposing the business and technical requirements into test case scenarios, defining test data requirements, managing test case creation, devising contingency plans and other preparation activities. Development of the test case execution plan, test case execution, managing issues, and status metrics. Working with a global team and responsible for directing/reviewing the test planning and execution work efforts of an offshore team. Communicating effectively with business units, IT Development, Project Management and other support staff on testing timelines, deliverables, status and other information. Assisting in the project quality reviews for your assigned applications. Assessing risk to the project based on the execution and validation and making appropriate recommendations. Ability to interpret quality audits, drive improvements and change, and facilitate test methodology discussions across the business unit. Providing project implementation support on an as-needed basis and assisting with application training of new resources. Ability to create and manage project plans and activity timelines. Investigating, monitoring, reporting and driving solutions to issues. Acting as a liaison between the Line of Business testing resources and the development team. Identifying and creating risk mitigation activities, and developing and implementing process improvements.

About Virtusa

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
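As an illustration of the source-to-target validation work this role describes, here is a minimal sketch that reconciles a legacy table against its migrated counterpart by row count and a row-level checksum. It uses sqlite3 so it is self-contained; against Snowflake the same idea would run through the Snowflake connector. Table and column names are hypothetical.

```python
import hashlib
import sqlite3


def table_fingerprint(con, table, cols):
    # Order rows deterministically, then hash them so two tables can be compared.
    rows = con.execute(
        f"SELECT {', '.join(cols)} FROM {table} ORDER BY {cols[0]}"
    ).fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest


def reconcile(con, source, target, cols):
    src_count, src_hash = table_fingerprint(con, source, cols)
    tgt_count, tgt_hash = table_fingerprint(con, target, cols)
    assert src_count == tgt_count, f"row counts differ: {src_count} vs {tgt_count}"
    assert src_hash == tgt_hash, "row contents differ"
    print(f"{source} and {target} match ({src_count} rows)")


if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE legacy_orders (id INTEGER, amount REAL)")
    con.execute("CREATE TABLE migrated_orders (id INTEGER, amount REAL)")
    data = [(1, 10.0), (2, 25.5)]
    con.executemany("INSERT INTO legacy_orders VALUES (?, ?)", data)
    con.executemany("INSERT INTO migrated_orders VALUES (?, ?)", data)
    reconcile(con, "legacy_orders", "migrated_orders", ["id", "amount"])
```

Count and checksum checks catch gross migration defects cheaply; column-level profiling and sampled row comparisons would typically follow for finer-grained validation.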
Posted 1 month ago
10.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
Remote
Our Company

We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole.

Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

Preferred job location: Bengaluru, Hyderabad, Pune, New Delhi or Remote

The team

Hitachi Digital is a leader in digital transformation, leveraging advanced AI and data technologies to drive innovation and efficiency across various operational companies (OpCos) and departments. We are seeking a highly experienced Lead Data Integration Architect to join our dynamic team and contribute to the development of robust data integration solutions.

The role

Lead the design, development, and implementation of data integration solutions using SnapLogic, MuleSoft, or Pentaho.
Develop and optimize data integration workflows and pipelines.
Collaborate with cross-functional teams to integrate data solutions into existing systems and workflows.
Implement and integrate VectorAI and Agent Workspace for Google Gemini into data solutions.
Conduct research and stay updated on the latest advancements in data integration technologies.
Troubleshoot and resolve complex issues related to data integration systems and applications.
Document development processes, methodologies, and best practices.
Mentor junior developers and participate in code reviews, providing constructive feedback to team members.
Provide strategic direction and leadership in data integration and technology adoption.

What you’ll bring

Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
10+ years of experience in data integration, preferably in the Banking or Finance industry.
Extensive experience with SnapLogic, MuleSoft, or Pentaho (at least one is a must).
Experience with Talend and Alation is a plus.
Strong programming skills in languages such as Python, Java, or SQL.
Technical proficiency in data integration tools and platforms.
Knowledge of cloud platforms, particularly Google Cloud Platform (GCP).
Experience with VectorAI and Agent Workspace for Google Gemini.
Comprehensive knowledge of financial products, regulatory reporting, credit risk, and counterparty risk.
Prior strategy consulting experience with a focus on change management and program delivery preferred.
Excellent problem-solving skills and the ability to work independently and as part of a team.
Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
Proven leadership skills and experience in guiding development projects from conception to deployment.

Preferred Qualifications

Familiarity with data engineering tools and techniques.
Previous experience in a similar role within a tech-driven company.

About Us

We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.

Championing diversity, equity, and inclusion

Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You

We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with.

We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
Posted 1 month ago
7.0 - 15.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics (a minimal star-schema sketch appears after this listing).
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Requirements

Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, FiveTran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.
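To make the data modeling responsibility concrete, here is a minimal sketch of a dimensional (star schema) model, executed through sqlite3 so it runs anywhere. The tables and columns are hypothetical illustrations, not a prescribed design.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# One fact table surrounded by dimension tables: the classic star schema.
con.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date TEXT,
    month INTEGER,
    year INTEGER
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    quantity INTEGER,
    amount REAL
);
""")

# An analytical query then joins the fact to its dimensions.
query = """
SELECT d.year, d.month, c.region, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d     ON d.date_key     = f.date_key
GROUP BY d.year, d.month, c.region
"""
print(con.execute(query).fetchall())
```

The design choice here is the usual dimensional trade-off: a denormalized fact table keyed by small dimension tables keeps analytical joins shallow and aggregations fast.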
Posted 1 month ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
Overview

Join a global organization with 82,000+ employees around the world, in an ETL Databricks Developer role based in IQVIA Bangalore. You will be part of IQVIA’s world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred. A candidate with Databricks certification is preferred.

Key Responsibilities:

Design, develop, and maintain data integration solutions using Databricks (a minimal sketch appears after this listing).
Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes.
Optimize and troubleshoot Databricks workflows and performance issues.
Ensure data quality and integrity throughout the data lifecycle.
Provide technical guidance and mentorship to junior developers.
Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:

Bachelor’s degree in computer science or equivalent.
Minimum of 5 years of hands-on experience with Databricks.
Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
Well-versed in data warehousing and data lake concepts.
Proficient in SQL and Python for data manipulation and analysis.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
Excellent problem-solving skills.
Strong communication and collaboration skills.

Preferred Qualifications:

Certified Databricks Engineer.
Experience in the life sciences domain.
Familiarity with Reltio or similar MDM (Master Data Management) tools.
Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
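As a sketch of the kind of Databricks data-integration work described above, here is a minimal PySpark job that ingests a CSV, applies a simple quality rule, and writes a Delta table. Paths, table names, and column names are hypothetical; the Delta write assumes a Databricks (or Delta-enabled Spark) runtime.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("patient_ingest").getOrCreate()

# Ingest a raw CSV extract (hypothetical landing path and schema).
raw = spark.read.option("header", "true").csv("/mnt/landing/patients.csv")

# Quality rules: de-duplicate on the business key and drop rows missing it.
clean = (
    raw.dropDuplicates(["patient_id"])
       .filter(F.col("patient_id").isNotNull())
       .withColumn("ingest_ts", F.current_timestamp())
)

# Land the result as a managed Delta table for downstream consumers.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.patients")
```

Stamping each load with an ingest timestamp is a small but common practice that makes lineage questions ("when did this row arrive?") answerable later.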
Posted 1 month ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Job Description

Job Title: ETL Testing – Python & SQL
Candidate Specification: 5+ years; open to the 1 PM to 10 PM shift. ETL (Python) – all 5 days WFO; ETL (SQL) – hybrid.
Location: Chennai.

Experience in ETL testing or data warehouse testing.
Strong in SQL Server, MySQL, or Snowflake.
Strong in scripting languages such as Python.
Strong understanding of data warehousing concepts, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling.
Proficient in writing SQL queries for data validation and reconciliation (a minimal sketch appears after this listing).
Experience with testing tools such as HP ALM, JIRA, TestRail, or similar.
Excellent problem-solving skills and attention to detail.

Skills Required

Role: ETL Testing
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent
Key Skills: ETL, Python, SQL

Other Information

Job Code: GO/JC/185/2025
Recruiter Name: Sheena Rakesh
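To illustrate the validation-and-reconciliation querying this role centres on, here is a minimal sketch of two typical checks (a row-count comparison and a set difference) written in SQL and driven from Python via sqlite3 so it is self-contained. Table names are hypothetical; against Snowflake the same SQL would run through its connector.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE stage_orders  (id INTEGER, amount REAL);
CREATE TABLE target_orders (id INTEGER, amount REAL);
INSERT INTO stage_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.0);
INSERT INTO target_orders VALUES (1, 10.0), (2, 25.5);
""")

# Check 1: row counts should match between stage and target.
counts = con.execute("""
    SELECT (SELECT COUNT(*) FROM stage_orders)  AS stage_rows,
           (SELECT COUNT(*) FROM target_orders) AS target_rows
""").fetchone()
print("counts:", counts)

# Check 2: rows present in stage but missing from target (set difference).
missing = con.execute("""
    SELECT id, amount FROM stage_orders
    EXCEPT
    SELECT id, amount FROM target_orders
""").fetchall()
print("missing from target:", missing)
```

Running this prints counts of (3, 2) and flags row (3, 7.0) as missing, which is exactly the kind of discrepancy an ETL tester would raise as a defect.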
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

How You Will Contribute

In this role of Kinaxis Analyst you will work closely with IT and business users globally to manage the end-to-end supply chain planning process using Kinaxis Rapid Response. You will have the ability to contribute and make a difference in a challenging and exciting hi-tech environment.

Collaborate with users to understand business requirements and configure solutions in Rapid Response.
Design, build, test and deploy Rapid Response resources including workbooks, alerts, forms, scripts, automation chains, widgets, dashboards, etc.
Demonstrate understanding of control tables, the data model, and mappings that can be leveraged in solving business problems.
As a core member of the Kinaxis COE team, apply best practices in Rapid Response configuration to meet business needs.
Assist end users with interpreting the planning results.
Facilitate, plan and support new functionality releases, including periodic Kinaxis service updates.

The Must Haves

Strong understanding of MRP and RR analytics
Deep understanding of RR data integration using Talend or similar ETL tools
Excellent analytical and troubleshooting skills
Exceptional interpersonal skills and ability to communicate effectively, both verbally and in writing
Ability to manage multiple priorities and perform well in a fast-paced environment
Strong work ethic and high attention to detail
Bachelor's in Science, Technology, Engineering, Mathematics or a related field
Minimum 4-6 years of firsthand experience with Kinaxis Rapid Response or a similar planning tool

Assets

Kinaxis certification (level 2 and above) preferred. APICS certification a plus.

At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Role

The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities

Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models

Required Qualifications & Work Experience

First Class Degree in Engineering/Technology (4-year graduate course)
9 to 11 years’ experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)

ETL: Hands-on experience of building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Expertise around data warehousing concepts, and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance

Technical Skills (Valuable)

Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Experience of using a job scheduler, e.g., Autosys. Exposure to Business Intelligence tools, e.g., Tableau, Power BI. Certification on any one or more of the above topics would be an advantage.
------------------------------------------------------

Job Family Group: Technology

------------------------------------------------------

Job Family: Digital Software Engineering

------------------------------------------------------

Time Type: Full time

------------------------------------------------------

Most Relevant Skills: Please see the requirements listed above.

------------------------------------------------------

Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course)
9 to 11 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud-native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience of big data platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Expertise in data warehousing concepts and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control and automated quality control management
Data Governance: A strong grasp of principles and practice, including data quality, security, privacy and compliance

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs and tuning them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks and BigQuery; demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls (see the validation sketch after this listing)
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
Others: Experience of using a job scheduler, e.g., Autosys; exposure to business intelligence tools, e.g., Tableau, Power BI
Certification on any one or more of the above topics would be an advantage.
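As a complement to the pipeline example earlier, the data validation and controls exposure this posting asks for can be sketched in plain Python. The rule set and record fields below are hypothetical, chosen only to show the pattern of rule-per-check controls.

# A minimal data-validation pass: each rule returns the IDs of failing records.
def check_not_null(records, field):
    return [r["id"] for r in records if r.get(field) in (None, "")]

def check_positive(records, field):
    return [r["id"] for r in records
            if not isinstance(r.get(field), (int, float)) or r[field] <= 0]

def run_controls(records):
    failures = {
        "missing_account": check_not_null(records, "account"),
        "non_positive_amount": check_positive(records, "amount"),
    }
    return {rule: ids for rule, ids in failures.items() if ids}

sample = [
    {"id": 1, "account": "ACC-1", "amount": 250.0},
    {"id": 2, "account": "", "amount": 100.0},
    {"id": 3, "account": "ACC-3", "amount": -5.0},
]
print(run_controls(sample))  # {'missing_account': [2], 'non_positive_amount': [3]}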
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
Job Description
Snowflake and Talend
Strong experience in SQL development and query optimization (see the sketch after this listing)
Understanding of database indexing, partitioning, and performance tuning
Data warehousing concepts, including star schema and dimensional modeling
Dimension tables for analytics and reporting
Data partitioning and performance-tuning techniques in DWH environments
TeamCity, Jenkins, Git; ETL scripts and DB objects experience; job schedulers (execution, monitoring); branching, merging, and automated deployment for ETL processes
Secondary: Excellent cloud exposure
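To illustrate the Snowflake-plus-SQL skill set listed here, a minimal sketch using the snowflake-connector-python package. The connection parameters, table names, and columns are all hypothetical placeholders, not details from this posting.

import snowflake.connector

# Connection parameters are placeholders, not real credentials.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="DWH", schema="SALES",
)
cur = conn.cursor()

# A star-schema query: fact table joined to a dimension, aggregated for reporting.
cur.execute("""
    SELECT d.region, SUM(f.sales_amount) AS total_sales
    FROM fact_sales f
    JOIN dim_store d ON f.store_key = d.store_key
    WHERE f.sale_date >= '2024-01-01'   -- date predicate enables partition pruning
    GROUP BY d.region
""")
for region, total in cur.fetchall():
    print(region, total)

cur.close()
conn.close()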
Posted 1 month ago
8.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics - Manager - Guidewire Data Manager
EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The Opportunity
We are seeking an experienced Guidewire Manager with a strong background in the insurance domain and extensive knowledge of traditional ETL tools. The ideal candidate will have a robust understanding of data warehousing architecture and hands-on experience with various ETL tools, including Informatica PowerCenter, SSIS, SAP BODS, and Talend.

Your Key Responsibilities
Lead and manage Guidewire implementation projects, ensuring alignment with business objectives and technical requirements.
Oversee the design, development, and maintenance of data warehousing solutions.
Collaborate with cross-functional teams to gather and analyze business requirements.
Develop and implement ETL processes using tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend (see the dimension-load sketch after this listing).
Ensure data quality, integrity, and security across all data warehousing and ETL processes.
Provide technical guidance and mentorship to team members.
Stay updated with industry trends and best practices in data warehousing and ETL.

Skills And Attributes For Success
Bachelor's degree in Computer Science, Information Technology, or a related field.
8-11 years of experience in data warehousing and ETL processes.
Strong background in the insurance domain.
Hands-on experience with ETL tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend.
Excellent understanding of data warehousing architecture and best practices.
Proven leadership and project management skills.
Strong analytical and problem-solving abilities.
Excellent communication and interpersonal skills.

To qualify for the role, you must have
Experience with Guidewire implementation projects.
Knowledge of additional ETL tools and technologies.
Certification in relevant ETL tools or data warehousing technologies.

Why EY?
At EY, we offer a dynamic and inclusive work environment where you can grow your career and make a meaningful impact. Join us to work on challenging projects, collaborate with talented professionals, and contribute to innovative solutions in the insurance domain.

Ideally, you'll also have
Good exposure to any ETL tools.
Knowledge of life insurance.
Understanding of Business Intelligence, Data Warehousing and Data Modelling.
Must have led a team of at least 4 members.
Experience in the insurance domain.
Prior client-facing skills; self-motivated and collaborative.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment.
An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
Opportunities to work with EY Consulting practices globally, with leading businesses across a range of industries.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
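Since this role centres on warehouse ETL, a worked example may help: a minimal Type 2 slowly-changing-dimension load expressed as two SQL statements held in Python strings, runnable through any DB-API connection. The table and column names (dim_policy, stg_policy, premium) are hypothetical, and the statements are a generic sketch, not tool-specific Informatica or Talend logic.

# Step 1: close out current rows whose tracked attribute changed.
EXPIRE_CHANGED = """
UPDATE dim_policy d
SET end_date = CURRENT_DATE, is_current = FALSE
WHERE d.is_current
  AND EXISTS (SELECT 1 FROM stg_policy s
              WHERE s.policy_id = d.policy_id
                AND s.premium <> d.premium)
"""

# Step 2: insert a fresh current row for new and changed policies alike
# (after step 1, changed policies no longer have a current row).
INSERT_CURRENT = """
INSERT INTO dim_policy (policy_id, premium, start_date, end_date, is_current)
SELECT s.policy_id, s.premium, CURRENT_DATE, NULL, TRUE
FROM stg_policy s
LEFT JOIN dim_policy d
  ON d.policy_id = s.policy_id AND d.is_current
WHERE d.policy_id IS NULL
"""

def apply_scd2(conn):
    """Run both steps in one transaction on any DB-API connection."""
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED)
    cur.execute(INSERT_CURRENT)
    conn.commit()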
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements (see the sketch after this listing)
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred Technical And Professional Experience
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
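For the enterprise-search responsibility named above, a minimal sketch using the official elasticsearch Python client (8.x style). The index name, document shape, and localhost endpoint are assumptions made for illustration.

from elasticsearch import Elasticsearch

# Connect to a local cluster; in practice the URL and auth come from config.
es = Elasticsearch("http://localhost:9200")

# Index a document (the index is created on first write with dynamic mapping).
es.index(index="support_tickets", id="T-1001", document={
    "title": "Nightly load failed",
    "body": "Talend job aborted during the Snowflake COPY step",
    "severity": "high",
})

# Make the document visible to search immediately (demo only).
es.indices.refresh(index="support_tickets")

# Full-text search over the body field.
resp = es.search(index="support_tickets", query={"match": {"body": "snowflake"}})
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"]["title"])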
Posted 1 month ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
About This Role
Aladdin Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers. We are looking for data engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.

Responsibilities
Data Pipeline Engineers are expected to be involved from the inception of projects: understanding requirements, then architecting, developing, deploying, and maintaining data pipelines (ETL/ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), which involves partnering with program and product managers to expand the product offering based on business demands. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

An ideal candidate would have
4+ years' experience as a data engineer
Experience in SQL, Sybase and Linux (a must)
Experience coding in at least two of these languages for server-side/data processing: Java, Python, C++
2+ years' experience using a modern data stack (Spark, Snowflake, BigQuery, etc.) on cloud platforms (Azure, GCP, AWS)
Experience building ETL/ELT pipelines for complex data engineering projects (using Airflow, dbt and Great Expectations would be a plus; see the DAG sketch after this listing)
Experience with database modeling and normalization techniques
Experience with object-oriented design patterns
Experience with DevOps tools like Git, Maven, Jenkins, GitLab CI, Azure DevOps
Experience with Agile development concepts and related tools
Ability to troubleshoot and fix performance issues across the codebase and database queries
Excellent written and verbal communication skills
Ability to operate in a fast-paced environment
Strong interpersonal skills with a can-do attitude under challenging circumstances
BA/BS or equivalent practical experience

Skills That Would Be a Plus
Perl and ETL tools (Informatica, Talend, dbt, etc.)
Experience with Snowflake or other cloud data warehousing products
Exposure to workflow management tools such as Airflow
Exposure to messaging platforms such as Kafka
Exposure to NoSQL platforms such as Cassandra and MongoDB
Building and delivering REST APIs

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
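Since the posting calls out Airflow-style orchestration, here is a minimal DAG sketch in Airflow 2.x style. The DAG id, schedule, and task bodies are hypothetical placeholders, not anything specific to this role.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")          # placeholder for a real extract step

def transform():
    print("apply business rules")      # placeholder for a real transform step

def load():
    print("write to the warehouse")    # placeholder for a real load step

with DAG(
    dag_id="positions_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword; earlier versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load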
Posted 1 month ago
4.0 years
0 Lacs
India
Remote
We're Hiring: Integration Consultant (Kinaxis Maestro)
Work from Anywhere | Remote Opportunity
Full-Time | 2–4 Years Experience

Are you a Kinaxis Maestro pro with a passion for building seamless integrations that drive supply chain transformation? Join Simbus Tech, a trusted Kinaxis partner, and work with global clients to enable smarter, faster supply chain decisions.

What You'll Do
Design and implement robust integration solutions for Kinaxis RapidResponse using Maestro.
Develop and maintain data pipelines using APIs, ETL tools, and middleware (see the REST sketch after this listing).
Collaborate with functional and technical teams to align integration strategies with business needs.
Troubleshoot integration issues and ensure data consistency across systems.
Contribute to continuous improvement and documentation of integration best practices.

What We're Looking For
Bachelor's degree in Computer Science, Information Technology, or a related field.
2–4 years of experience as a Kinaxis Integration Consultant or in a similar role.
Strong hands-on expertise in Kinaxis Maestro and integration frameworks.
Proficiency in APIs, ETL processes, and tools like Talend, MuleSoft, or Snowflake.
Relevant certifications in Kinaxis Maestro or integration technologies (preferred).
Strong problem-solving skills, communication, and a collaborative mindset.

Why Join Simbus Tech?
100% Remote | Work from Anywhere
Exposure to cutting-edge supply chain technologies
Collaborate with top-tier global clients
Growth-driven, people-first culture
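Integration work of this kind usually reduces to moving data through REST endpoints. The sketch below uses the requests library against an entirely hypothetical middleware endpoint; it is not the actual Kinaxis Maestro API, whose details are not documented here.

import requests

BASE_URL = "https://integration.example.com/api/v1"   # hypothetical middleware endpoint
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

def push_supply_records(records):
    """Send a batch of records downstream; fail loudly on HTTP errors."""
    resp = requests.post(f"{BASE_URL}/supply-records",
                         json={"records": records}, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def pull_planning_output(plan_id):
    """Fetch the output of a completed planning run."""
    resp = requests.get(f"{BASE_URL}/plans/{plan_id}/output",
                        headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()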
Posted 1 month ago
0.0 - 2.0 years
0 Lacs
India
On-site
We're Hiring: Data Engineer

About The Job
Duration: 12 months
Location: PAN India
Timings: Full time (as per company timings)
Notice Period: Within 15 days, or immediate joiner
Experience: 0–2 years

Responsibilities
Design, develop and maintain reliable automated data solutions based on the identification, collection and evaluation of business requirements, including but not limited to data models, database objects, stored procedures and views.
Develop new and enhance existing data processing components (data ingest, data transformation, data store, data management, data quality).
Support and troubleshoot the data environment (including periodic on-call duty).
Document technical artifacts for developed solutions.
Good interpersonal skills; comfort and competence in dealing with different teams within the organization. Requires an ability to interface with multiple constituent groups and build sustainable relationships.
Versatile, creative temperament; ability to think out of the box while defining sound and practical solutions. Ability to master new skills.
Proactive approach to problem solving, with effective influencing skills.
Familiar with Agile practices and methodologies.

Education And Experience Requirements
Four-year degree in Information Systems, Finance/Mathematics, Computer Science or similar
0–2 years of experience in data engineering

Required Knowledge, Skills Or Abilities
Advanced SQL: queries, scripts, stored procedures, materialized views, and views
Focus on ELT: load data into the database and perform transformations in-database (see the sketch after this listing)
Ability to use analytical SQL functions
Snowflake experience a plus
Cloud data warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modelling, analysis, programming
Experience with DevOps models utilizing a CI/CD tool
Hands-on work in a cloud environment on the Azure cloud platform (ADLS, Blob)
Talend, Apache Airflow, Azure Data Factory, and BI tools like Tableau preferred
Ability to analyse data models

We are looking for a Data Engineer for the Enterprise Data Organization to build and manage data pipelines (data ingest, data transformation, data distribution, quality rules, data storage, etc.) for an Azure cloud-based data platform. The candidate must possess strong technical, analytical, programming and critical thinking skills.
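The ELT emphasis here (load raw data first, transform inside the database) can be shown end to end with Python's standard-library sqlite3 module. The table and the window-function transform are illustrative stand-ins for warehouse equivalents.

import sqlite3

conn = sqlite3.connect(":memory:")

# Load: land raw rows in the database first, untransformed.
conn.execute("CREATE TABLE raw_orders (region TEXT, order_day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("east", "2024-01-01", 100.0), ("east", "2024-01-02", 150.0),
     ("west", "2024-01-01", 80.0), ("west", "2024-01-02", 120.0)],
)

# Transform: in-database analytical SQL (a running total per region).
rows = conn.execute("""
    SELECT region, order_day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY order_day) AS running_total
    FROM raw_orders
""").fetchall()

for r in rows:
    print(r)   # ('east', '2024-01-01', 100.0, 100.0) ...

conn.close()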
Posted 1 month ago
10.0 - 15.0 years
30 - 35 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Data Strategy & Data Governance Manager
Join our team in Technology Strategy for an exciting career opportunity to enable our most strategic clients to realize exceptional business value from technology.

Practice: Technology Strategy & Advisory, Capability Network
Areas of Work: Data Strategy
Level: Manager | Location: Bangalore / Gurgaon / Mumbai / Pune / Chennai / Hyderabad / Kolkata | Years of Exp: 10 to 15 years

Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch
The Technology Strategy & Advisory Practice focuses on our clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster.

Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform business
Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies
Formulation of Guiding Principles and Components: Assessing impact on clients' technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components
Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions

Bring your best skills forward to excel in the role:
Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
Design and guide development of an enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata and master data strategy.
Establish a framework for effective data governance across multi-speed implementations. Define data ownership, standards, policies and associated processes (see the catalog-audit sketch after this listing).
Define a Data & Analytics operating model to manage data across the organization. Establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards.
Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions.
Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas.
Develop and drive Data Capability Maturity Assessment, Data & Analytics Operating Model and Data Governance exercises for clients.
A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and DW-on-cloud solutions.
Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure or AWS.
Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions.
Demonstrate strong understanding of a specific industry, client or technology and function as an expert to advise senior leadership.
Manage budgeting and forecasting activities and build financial proposals.

Qualification
Your experience counts!
MBA from a tier-1 institute
5–7 years of strategy consulting experience at a consulting firm
3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy
At least 2 years of experience architecting or designing solutions for any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog
Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms like AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica and Palantir
Deep understanding of the data supply chain and building value-realization frameworks for data transformations
3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as POCs
Foundational understanding of data privacy is desired
Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends, e.g. mobility, cloud, digital, collaboration
A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains
CDMP certification from DAMA preferred
Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential
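Governance deliverables like ownership and classification standards are often enforced as simple checks over a data catalog export. Below is a minimal plain-Python sketch of that idea; the catalog record shape and required fields are hypothetical, not drawn from any governance tool named above.

REQUIRED_FIELDS = ("owner", "steward", "classification")

def audit_catalog(datasets):
    """Return, per dataset, the governance fields that are missing or empty."""
    findings = {}
    for ds in datasets:
        missing = [f for f in REQUIRED_FIELDS if not ds.get(f)]
        if missing:
            findings[ds["name"]] = missing
    return findings

catalog = [
    {"name": "sales.orders", "owner": "sales-it", "steward": "a.jones",
     "classification": "internal"},
    {"name": "hr.payroll", "owner": "", "steward": "b.singh",
     "classification": None},
]
print(audit_catalog(catalog))   # {'hr.payroll': ['owner', 'classification']}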
Posted 1 month ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Company Overview
Viraaj HR Solutions is a dynamic HR consulting firm dedicated to optimizing human resource management and development. Our mission is to bridge the gap between talent and opportunity, driving growth and success for both our clients and candidates. We foster a culture of collaboration, innovation, and integrity, consistently striving to deliver exceptional service in the evolving landscape of human resources.

Role Responsibilities
Design, develop, and implement ETL processes using Talend.
Collaborate with data analysts and stakeholders to understand data requirements.
Perform data cleansing and transformation tasks (see the sketch after this listing).
Optimize and automate existing data integration workflows.
Monitor ETL jobs and troubleshoot issues as they arise.
Conduct performance tuning of Talend jobs for efficiency.
Document ETL processes and maintain technical documentation.
Work closely with cross-functional teams to support data needs.
Ensure data integrity and accuracy throughout the ETL process.
Stay updated with Talend best practices and upcoming features.
Assist in the migration of data from legacy systems to new platforms.
Participate in code reviews to ensure code quality and adherence to standards.
Engage in user training and support as necessary.
Provide post-implementation support for deployed solutions.
Evaluate and implement new data tools and technologies.

Qualifications
3+ years of experience as a Talend Developer.
Strong understanding of ETL principles and practices.
Proficiency in Talend Open Studio.
Hands-on experience with SQL and database management.
Familiarity with data warehousing concepts.
Experience using Java for Talend scripting.
Knowledge of APIs and web services.
Effective problem-solving skills.
Strong communication and collaboration abilities.
Ability to work independently and as part of a team.
Attention to detail and accuracy in data handling.
Experience with job scheduling tools.
Ability to manage multiple priorities and deadlines.
Knowledge of data modeling concepts.
Experience in documentation and process mapping.

Skills: data cleansing, data warehousing, job scheduling tools, problem solving, team collaboration, SQL, documentation, Talend Open Studio, Talend, data transformation, data modeling, performance tuning, web services, API development, Java, APIs, data integration, ETL processes
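The cleansing and transformation tasks described above are typically done inside Talend via tMap expressions and Java routines; the same rules can be sketched language-agnostically. A minimal Python illustration follows, with hypothetical field rules (trim text, standardise nulls, validate email).

import re

def cleanse(record):
    """Normalise one raw record: trim text, standardise nulls, validate email."""
    out = {k: (v.strip() if isinstance(v, str) else v) for k, v in record.items()}
    for k, v in out.items():
        if v in ("", "N/A", "NULL"):
            out[k] = None   # unify the common null spellings
    email = out.get("email")
    out["email_valid"] = bool(email and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email))
    return out

print(cleanse({"name": "  Asha  ", "email": "asha@example.com", "phone": "N/A"}))
# {'name': 'Asha', 'email': 'asha@example.com', 'phone': None, 'email_valid': True}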
Posted 1 month ago