Jobs
Interviews

157 ETL Tools Jobs - Page 2

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The healthcare industry presents a significant opportunity for software development, and Health Catalyst stands out as a leading company in this domain. By joining our team, you have the chance to contribute to solving critical healthcare challenges at a national level, impacting the lives of millions. At Health Catalyst, we value individuals who are intelligent, hardworking, and humble, and we are committed to developing innovative tools to enhance healthcare performance, cost-efficiency, and quality.

As a Data Engineer at Health Catalyst, your primary focus will be on acquiring data from various sources within a health system's ecosystem. Leveraging Catalyst's Data Operating System, you will work closely with both the technical and business aspects of the source systems, using multiple technologies to extract the necessary data.

Key Responsibilities:
- Proficiency in Structured Query Language (SQL) and experience with EMR/EHR systems
- Leading the design, development, and maintenance of scalable data pipelines and ETL processes
- Strong expertise in ETL tools and database principles
- Excellent analytical and troubleshooting skills, with a strong customer service orientation
- Mentoring and guiding a team of data engineers to foster continuous learning and improvement
- Monitoring and resolving data infrastructure issues to ensure high availability and performance
- Ensuring data quality, integrity, and security across all data platforms
- Implementing best practices for data governance, lineage, and compliance

Desired Skills:
- Experience with RDBMS (SQL Server, Oracle, etc.) and stored procedures/T-SQL/SSIS
- Familiarity with processing HL7 messages, CCD documents, and EDI X12 claims files
- Knowledge of Agile development methodologies and the ability to work with technologies related to data acquisition
- Proficiency in Hadoop and other big data technologies
- Experience with Microsoft Azure cloud solutions, architecture, and related technologies

Education & Experience:
- Bachelor's degree in technology, business, or a healthcare-related field
- Minimum of 5 years of experience in data engineering, with at least 2 years in a leadership role
- 2+ years of experience in the healthcare/technology industry

If you are passionate about leveraging your expertise in data engineering to make a meaningful impact in the healthcare sector, we encourage you to apply and be a part of our dynamic and innovative team at Health Catalyst.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Digital Finance function within Novartis is responsible for driving the data and tools, AI, and ML strategy for finance across the organization. We are seeking a highly skilled and detail-oriented Senior Manager, Data Analytics. Your primary responsibility in this role is to effectively manage and analyze large volumes of data in order to extract valuable insights. You should possess exceptional analytical skills and the ability to manipulate and interpret complex datasets using various technologies, including programming languages. Additionally, you should be able to build strong relationships with global stakeholders and effectively communicate business analyst needs to data engineers or software engineers.

Key Requirements:
- **Data Analysis:** Utilize statistical techniques and data mining methods to analyze large datasets. Identify trends, patterns, and correlations to uncover valuable insights and address critical business questions.
- **Data Validation, Cleaning, and Preprocessing:** Thoroughly clean, validate, and organize data to ensure accuracy and consistency. Identify and resolve data quality issues, such as missing values and outliers, to maintain data integrity.
- **Data Visualization:** Present findings clearly through charts, graphs, and visualizations. Enable users to understand and interpret data effectively through visual representations.
- **Ad-hoc Data Analysis:** Conduct ad-hoc data analysis to provide actionable recommendations. Proactively address emerging business needs by performing on-demand data analysis to support decision-making processes.
- **Collaboration:** Work closely with cross-functional teams, including business analysts, data engineers, and data scientists. Understand their requirements and provide data-driven solutions, fostering effective teamwork and communication.
- **Storytelling:** Generate insights from data and communicate them to key stakeholders, both technical and non-technical. Understand and explain the implications of the findings for processes, products, or the business.
- **Documentation:** Develop and maintain documentation related to data analysis processes. Ensure clear and comprehensive documentation to facilitate knowledge sharing and replication of analysis methods.
- **Continuous Improvement:** Stay up to date with the latest trends and techniques in data analytics. Continuously explore and implement new tools and technologies to enhance data analysis processes, aligning with industry best practices.

Essential Requirements:
- Strong analytical skills with the ability to manipulate and interpret large datasets; knowledge of statistical techniques and data mining algorithms.
- Proficiency in programming languages (e.g., Python) and strong SQL, with an aptitude for learning other analytics tools.
- Experience with BI tools like Power BI or Qlik and ETL tools like Dataiku/Alteryx.
- Effective communication skills to present findings to both technical and non-technical stakeholders.

Desirable Requirements:
- 5+ years of extensive experience in data and/or business analytics; a background in financial data analytics is an added advantage.
- Bachelor's or Master's degree in a relevant field such as mathematics, statistics, economics, or computer science.
- Strong written and verbal communication skills.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you, collaborating, supporting, and inspiring each other, combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together?

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up.

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Telangana

On-site

You will provide analytics support to Novartis internal customers (CPOs & regional marketing and sales teams) on various low- to medium-complexity analytical reports, supporting and facilitating data-enabled decision-making by providing and communicating qualitative and quantitative analytics. Additionally, you will support the GBS - GCO business in building the practice through various initiatives such as knowledge sharing, onboarding and training support, supporting the team lead in all business-related tasks/activities, and building process documentation and knowledge repositories. You will also be an integral part of a comprehensive design team responsible for designing promotional marketing materials.

As an Analyst at Novartis, your key responsibilities will include:
- Creating and delivering Field Excellence insights as per agreed SLAs
- Designing, developing, and/or maintaining ETL-based solutions that optimize field excellence activities
- Delivering services through an Agile project management approach
- Maintaining standard operating procedures (SOPs) and quality checklists
- Developing and maintaining knowledge repositories that collect qualitative and quantitative data on field excellence trends across Novartis operating markets

Essential requirements for this role include 2 years of experience in SQL and Excel, learning agility, the ability to manage multiple stakeholders, experience with pharma datasets, and experience in Python or another scripting language. Desirable requirements include a university/advanced degree, ideally a Master's degree or equivalent experience in fields such as business administration, finance, computer science, or a technical field. At least 3 years of experience using ETL tools (Alteryx, Dataiku, Matillion, etc.) and hands-on experience with cloud-based platforms like Snowflake are mandatory.

Novartis's purpose is to reimagine medicine to improve and extend people's lives, with a vision to become the most valued and trusted medicines company in the world. By joining Novartis, you will be part of a mission-driven organization where associates drive the company to reach its ambitions. If you are passionate about making a difference in patients' lives and want to be part of a community of smart and dedicated individuals, consider joining Novartis. For more information about benefits and rewards at Novartis, refer to the Novartis Life Handbook at https://www.novartis.com/careers/benefits-rewards. If you are interested in staying connected with Novartis and learning about future career opportunities, you can join the Novartis Network here: https://talentnetwork.novartis.com/network.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines to ensure data quality and reliability. Your main tasks will include data cleansing, imputation, mapping to standard data models, transforming data to meet business rules and statistical computations, and validating data content. You will develop, modify, and maintain Python and Unix scripts, as well as complex SQL queries; performance tuning of existing code to enhance efficiency and avoid bottlenecks will be a key part of your role.

Building end-to-end data flows from various sources to curated and enhanced datasets will be crucial. Additionally, you will develop automated Python jobs for data ingestion and provide technical expertise in architecture, design, and implementation. Collaborating with team members, you will create insightful reports and dashboards to improve processes and add value. Writing SQL queries for data validation and designing ETL processes for data extraction, transformation, and loading will also be part of your responsibilities. You will work closely with data architects, analysts, and stakeholders to understand data requirements and ensure data quality. Optimizing and tuning ETL processes for performance and scalability will be essential, as will maintaining documentation for ETL processes, data flows, and mappings, and monitoring and troubleshooting ETL processes to ensure data accuracy and availability. Implementing data validation and error-handling mechanisms to maintain data integrity and consistency will also be part of your role.

Required Skills:
- Python
- ETL tools like Informatica, Talend, SSIS, or similar
- SQL, MySQL
- Expertise in Oracle, SQL Server, and Teradata
- DevOps, GitLab
- Experience with AWS Glue or Azure Data Factory

If you are passionate about data engineering and have the skills mentioned above, we would love to have you on our team!
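The cleansing, imputation, and validation duties described in this listing can be sketched in plain Python. This is a minimal illustration only; the column names (`region`, `amount`) and the non-negative-amount rule are assumptions for the example, not details from the posting.

```python
# Minimal sketch of an ETL cleansing/validation step: impute missing
# values, apply a business rule, and split valid rows from rejects.
# Field names and rules are illustrative assumptions.

def clean_and_validate(rows, default_region="UNKNOWN"):
    """Impute missing fields and validate each row against a business rule."""
    valid, rejected = [], []
    for row in rows:
        cleaned = dict(row)
        # Imputation: fill a missing region with a sentinel value
        if not cleaned.get("region"):
            cleaned["region"] = default_region
        # Business rule: amounts must be non-negative numbers
        amount = cleaned.get("amount")
        if isinstance(amount, (int, float)) and amount >= 0:
            valid.append(cleaned)
        else:
            rejected.append(cleaned)
    return valid, rejected

if __name__ == "__main__":
    source = [
        {"id": 1, "region": "south", "amount": 120.0},
        {"id": 2, "region": None, "amount": 45.5},
        {"id": 3, "region": "west", "amount": -10},  # fails the rule
    ]
    ok, bad = clean_and_validate(source)
    print(len(ok), len(bad))  # 2 valid rows, 1 rejected
```

In a production pipeline the rejected rows would typically be routed to an error table or quarantine area rather than silently dropped, which is the kind of error-handling mechanism the posting mentions.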

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At EY, you will have the opportunity to shape a career that aligns with your unique qualities, supported by a global network, an inclusive environment, and cutting-edge technology that empowers you to reach your full potential. Your distinctive voice and perspective are valued as we look to enhance EY's performance further. By joining us, you will craft a remarkable journey for yourself while contributing to the creation of a more efficient working world for all.

As a Salesforce Junior Developer, with 2+ years of total experience and 2+ years of relevant experience, your responsibilities will include:
- Engaging in application design, configuration, testing, and deployment processes
- Customizing and configuring the Salesforce.com platform
- Collaborating on testing, training, and documentation efforts
- Supporting the sales cycle when required, from solution definition to project planning
- Generating tangible deliverables and providing technical support for bug fixes and enhancements
- Establishing a development environment and potentially guiding a team of junior developers

In terms of knowledge and skills, you should possess:
- 2+ years of experience working on Salesforce platforms, with Salesforce Platform Developer I certification
- Previous involvement in CRM projects for mid-market and enterprise-level companies
- Proficiency in Force.com platform tools such as Apex, LWC, SOQL, and unit testing
- Familiarity with core web technologies like HTML5, JavaScript, and jQuery
- Experience in relational databases, data modeling, and ETL tools
- Exposure to web services (REST & SOAP, JSON & XML, etc.)
- Knowledge of Agile development methodologies like Scrum
- Strong verbal and written communication abilities

EY is dedicated to building a better working world by delivering long-term value for clients, employees, and society while fostering trust in the capital markets. With the support of data and technology, diverse teams across 150+ countries offer assurance and aid clients in growth, transformation, and operation. Operating in assurance, consulting, law, strategy, tax, and transactions, EY teams tackle complex global challenges by asking insightful questions and offering innovative solutions.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are seeking an experienced Boomi Architect to lead the design, development, and deployment of integration solutions using the Dell Boomi AtomSphere platform. As a Boomi Architect, you will be responsible for architecting scalable, secure, and high-performance integration solutions that connect cloud and on-premise applications and data.

The ideal candidate should possess 8+ years of IT experience, with at least 3 years in Boomi architecture and development. Your expertise in Boomi AtomSphere, including process building, API management, EDI, and Master Data Hub, will be crucial for this role. You should have a strong understanding of integration patterns, REST/SOAP web services, APIs, and cloud platforms. Experience with middleware platforms, ETL tools, and enterprise applications such as Salesforce, NetSuite, and SAP is required. Proficiency in data formats like JSON, XML, and CSV, and in data mapping/transformation, is essential. Additionally, you should have experience with error handling, logging, retry mechanisms, and monitoring tools. Strong knowledge of authentication mechanisms like OAuth2, JWT, and Basic Auth is also necessary. Experience with CI/CD practices for integration deployments is preferred. Problem-solving skills, communication skills, and the ability to work with internal teams and external vendors are important aspects of this role.

As a Boomi Architect, your responsibilities will include leading the design and architecture of end-to-end integration solutions using Dell Boomi, translating business requirements into technical specifications, defining and implementing integration best practices, developing integration architecture documents, overseeing Boomi process development, and collaborating with internal teams and external vendors to ensure alignment with enterprise architecture.

At GlobalLogic, we prioritize a culture of caring and offer continuous learning and development opportunities. You will have the chance to work on interesting and meaningful projects that have a real impact. We believe in balance and flexibility, offering various work arrangements to help you achieve a healthy work-life balance. Join us in a high-trust organization where integrity is key and trust is a cornerstone of our values.
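The "retry mechanisms" requirement above refers to a standard integration pattern; in Boomi itself retries are usually configured in the platform rather than coded. As a language-neutral illustration of the pattern, here is a hedged Python sketch of retry with exponential backoff; the transient-error type and delays are assumptions for the example.

```python
# Sketch of retry-with-exponential-backoff for integration error handling.
# ConnectionError stands in for any transient failure; real systems would
# also log each attempt and cap the total elapsed time.
import time

def call_with_retry(operation, max_attempts=3, base_delay=0.01):
    """Retry a flaky operation, doubling the delay after each failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the error to monitoring
            time.sleep(base_delay * (2 ** (attempt - 1)))  # 0.01s, 0.02s, ...

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient")
        return "ok"
    print(call_with_retry(flaky))  # succeeds on the third attempt
```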

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a talented professional with 2+ years of directly relevant experience, you will be responsible for utilizing your expertise in Python, SQL, ETL tools, machine learning, dashboarding (Power BI, Tableau), market research, and credit analysis to drive innovative solutions for our clients. Your high competence in stakeholder management, time management, and project management will be essential to successful project delivery. Additionally, your strong written and verbal communication skills will play a crucial role in collaborating effectively with team members and clients, and your excellent analytical and problem-solving skills will be instrumental in identifying and addressing business challenges.

To excel in this role, you must be a graduate with a passion for leveraging technology and analytics to transform business processes. The ability to work night shifts or rotational shifts is required to meet the demands of the role and align with our operational excellence standards. If you are ready to co-create and execute the future vision of our clients in collaboration with a diverse team of 44,000+ employees, we invite you to join us in re-imagining the digital future of businesses across various industries.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a System Analyst (Data Warehouse) at our company, you will be responsible for collaborating with stakeholders to understand business requirements and translate them into data warehouse design specifications. Your role will involve developing and maintaining the data warehouse architecture, including data models, ETL processes, and data integration strategies.

You will create, optimize, and manage ETL processes to extract, transform, and load data from various source systems into the data warehouse, ensuring data quality and accuracy by implementing data cleansing and validation procedures. Designing and maintaining data models, schemas, and hierarchies to support efficient data retrieval and reporting will be crucial; you will implement best practices for data modeling, including star schemas, snowflake schemas, and dimension tables. Integrating data from multiple sources, both structured and unstructured, into the data warehouse will be part of your daily tasks, working with API endpoints, databases, and flat files to collect and process data efficiently.

Monitoring and optimizing the performance of the data warehouse, and identifying and resolving bottlenecks and performance issues, will be essential; you will implement indexing, partitioning, and caching strategies for improved query performance. Enforcing data governance policies and security measures to protect sensitive data within the data warehouse will be a priority, ensuring compliance with data privacy regulations such as GDPR or HIPAA. Collaborating with business intelligence teams to support reporting and analytics initiatives will also be part of your role, including assisting in the creation of data marts and dashboards for end users. Maintaining comprehensive documentation of data warehouse processes, data models, and ETL workflows will be crucial. Additionally, you will train and mentor junior data analysts and team members.

To qualify for this role, you should have a Bachelor's degree in computer science, information technology, or a related field, and a minimum of 3 years of experience as a Data Warehouse Systems Analyst. Strong expertise in data warehousing concepts, methodologies, and tools, as well as proficiency in SQL, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling techniques, is essential. Knowledge of data governance, data security, and compliance best practices is necessary, along with excellent problem-solving and analytical skills and strong communication and interpersonal skills for effective collaboration with cross-functional teams. Immediate joiners are preferred. If you meet the qualifications and are looking to join a dynamic team in Mumbai, we encourage you to apply.
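To make the star-schema modeling mentioned in this listing concrete, here is a small self-contained sketch using Python's standard-library `sqlite3`: one fact table joined to two dimension tables, with a typical aggregate query. All table and column names are illustrative assumptions, not details from the posting.

```python
# Minimal star schema: fact_sales joined to dim_product and dim_date.
# Dimensions hold descriptive attributes; the fact table holds measures
# plus foreign keys to each dimension.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
conn.execute("INSERT INTO dim_date VALUES (20240101, 2024)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 20240101, 10.0), (1, 20240101, 5.0), (2, 20240101, 7.5)])

# Typical star-schema query: aggregate the facts, grouped by dimension
# attributes reached through the foreign-key joins.
rows = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.name, d.year ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 2024, 7.5), ('widget', 2024, 15.0)]
```

A snowflake schema would further normalize the dimensions (e.g., splitting product category into its own table), trading some query simplicity for reduced redundancy.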

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be providing analytics support to Novartis US sales and marketing teams on various business intelligence reporting and data visualization projects. Your role will involve supporting and facilitating data-enabled decision-making for Novartis internal customers by communicating qualitative and quantitative analytics and generating insights faster through data visualization.

Your key responsibilities will include providing business intelligence, analytics, and insights support to drive Field, Home Office, and Enterprise reporting and data visualization for Novartis US. You will create and deliver field excellence reports and insights according to agreed SLAs, ensuring timeliness, accuracy, and quality to drive excellent customer service. Additionally, you will design, develop, and/or maintain Power BI-based dashboard solutions that optimize field excellence activities based on country commercial excellence needs, across a variety of evolving infrastructure landscapes. To excel in this role, you must have a growth mindset and be open to enhancing your skill set by learning new data modeling tools as business needs arise. You will deliver services through a structured project management approach, ensuring appropriate documentation and communication throughout the delivery of reporting services. Creating and maintaining standard operating procedures (SOPs) and quality checklists to enable excellent-quality outputs for all deliverables within the function is also a crucial aspect of this role.

An essential requirement for this position is a minimum of 5 years of hands-on experience in data visualization with Power BI. Exposure to creating reporting/visualization products for field users and commercial leadership teams and a solid foundation in the pharma domain (commercial analytics) are also necessary. Basic to intermediate knowledge of Microsoft Excel, PowerPoint, SQL, ETL tools, and other visualization tools like Qlik Sense is preferred.

You will be responsible for enabling the standardization of processes through process documentation and the timely maintenance of knowledge repositories. Your role will involve facilitating data-enabled decision-making and execution for Novartis internal stakeholders by providing techno-functional expertise in short-term and long-term sales operations and strategy. You will contribute to stakeholder teams by engaging in various initiatives such as knowledge sharing, onboarding, and training support. Project management skills and the ability to deliver independently with little oversight are essential, and you should be capable of guiding analysts/senior analysts in the team to ensure successful delivery. Strong presentation, interpersonal, and communication skills are required for this position.

Desirable requirements for this role include a Bachelor's or Master's degree or other advanced degree in life sciences or pharmaceutical sciences and an MBA. Previous knowledge and experience in the pharma/life sciences industry are preferred, as are learning agility and the ability to manage multiple stakeholders.

If you are passionate about helping people with diseases and their families and thrive in a community of smart, passionate individuals, this role at Novartis offers you the opportunity to collaborate, support, and inspire each other to achieve breakthroughs that change patients' lives. Join the Novartis Network to stay connected and learn about suitable career opportunities as they arise. Visit our talent community to explore the benefits and rewards we offer to help you thrive personally and professionally.

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Punjab

On-site

You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role requires deep technical knowledge of ETL tools, strong data modeling skills, and the ability to lead complex data engineering projects from inception to implementation.

Key Skills:
- More than 4 years of experience with ETL tools such as SSIS, Informatica, DataStage, or Talend
- Proficiency in relational databases like SQL Server and MySQL
- Comprehensive understanding of Data Mart/EDW methodologies
- Designing star schemas, snowflake schemas, and fact and dimension tables
- Experience with Snowflake or BigQuery
- Familiarity with reporting and analytics tools like Tableau and Power BI
- Proficiency in scripting and programming with Python
- Knowledge of cloud platforms like AWS or Azure
- Leading recruitment, estimation, and project execution
- Exposure to Sales and Marketing data domains
- Working with cross-functional and geographically distributed teams
- Translating complex data issues into actionable insights
- Strong communication and client management abilities
- Initiative-driven, with a collaborative approach and problem-solving mindset

Roles & Responsibilities:
- Creating high-level and low-level design documents for middleware and ETL architecture
- Designing and reviewing data integration components while ensuring compliance with standards and best practices
- Ensuring delivery quality and timeliness for one or more complex projects
- Providing functional and non-functional assessments for global data implementations
- Offering technical guidance and support to junior team members for problem-solving
- Leading QA processes for deliverables and validating progress against project timelines
- Managing issue escalation, status tracking, and continuous improvement initiatives
- Supporting planning, estimation, and resourcing for data engineering efforts

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have experience in using ETL tools, database management, scripting (primarily Python), API consumption, source-to-target mapping, and advanced SQL queries. You will be responsible for designing, building, and maintaining scalable data pipelines and architectures on Microsoft Azure cloud platforms; cloud experience is preferred. Beyond technical skills, you should possess excellent communication skills and the ability to work autonomously with minimal direction. Your role will involve developing and optimizing complex ETL processes, monitoring system performance, and troubleshooting data-related issues in production environments.
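Source-to-target mapping, one of the skills listed above, is often expressed as a declarative table that drives field renaming and light transformation. A minimal Python sketch follows; every field name and transform here is an illustrative assumption, not taken from the posting.

```python
# Source-to-target mapping sketch: each source field maps to a
# (target field, transform) pair, and map_record applies the mapping
# to one source record. Fields absent from the source are skipped.

MAPPING = {
    "cust_nm": ("customer_name", str.strip),   # trim stray whitespace
    "ord_amt": ("order_amount", float),        # cast string to number
    "ord_dt":  ("order_date", lambda s: s),    # pass-through
}

def map_record(source):
    """Apply the source-to-target mapping to one source record."""
    target = {}
    for src_field, (tgt_field, transform) in MAPPING.items():
        if src_field in source:
            target[tgt_field] = transform(source[src_field])
    return target

if __name__ == "__main__":
    rec = {"cust_nm": "  Acme Corp ", "ord_amt": "199.99", "ord_dt": "2024-05-01"}
    print(map_record(rec))
    # {'customer_name': 'Acme Corp', 'order_amount': 199.99, 'order_date': '2024-05-01'}
```

In an ETL tool the same mapping would live in a transformation step or mapping document; keeping it declarative, as here, makes the Source-to-Target specification easy to review and test.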

Posted 1 week ago

Apply

11.0 - 15.0 years

0 Lacs

Karnataka

On-site

We are seeking an experienced candidate with a background as a Solutions Architect specializing in Salesforce Platform implementation. The designation for this role is CRM-SALESFORCE- Salesforce Solutions Architect. The ideal candidate should have 11 to 14 years of experience, with a corresponding CTC range of 40-42 LPA. This position is based in multiple locations in India, with the expectation of working from the office 5 days a week.

In this role, you will be responsible for leading the design and architecture of end-to-end solutions based on business requirements and technical specifications. This includes creating solution blueprints, architecture diagrams, data models, process automation, and system integrations while ensuring scalability, security, and smooth integration with existing systems and platforms. Your expertise in Salesforce features such as Sales Cloud, Service Cloud, Marketing Cloud, Experience Cloud, and Platform capabilities will be crucial in evaluating new features and AppExchange products for organizational suitability.

Collaboration with stakeholders to align solution architecture with business objectives and technology strategies is a key aspect of this role. You will provide guidance on the adoption of emerging technologies and best practices, define architectural standards, and ensure alignment with organizational goals. Understanding requirements from business stakeholders and translating them into technical solutions is a core responsibility, along with serving as a trusted advisor on Salesforce capabilities. You will work closely with developers, business analysts, and project managers to ensure successful implementation of solutions. Technical discussions, architecture reviews, and mentoring development teams throughout the project lifecycle are also part of this role.

Overseeing system integration, performance monitoring, data integrity, and reliability across integrated systems will be essential, along with designing strategies for system scalability, reliability, and disaster recovery. Documentation and governance play a crucial role in this position, involving the creation and maintenance of detailed technical documentation, adherence to architecture governance processes, and enforcement of coding, security, and data management standards for Salesforce development. Risk management, security, training, and enablement are additional components of this role, requiring the identification and mitigation of architectural risks, ensuring security and compliance, mentoring team members, and delivering technical documentation.

The ideal candidate should have over 10 years of experience in software development, solution design, or a similar role. Proven expertise in designing and implementing large-scale solutions, particularly in Salesforce Sales Cloud, Service Cloud, Experience Cloud, and Platform tools, is required. Strong skills in architectural frameworks, design patterns, programming languages, databases, Apex, Visualforce, Lightning Web Components (LWC), Salesforce APIs, integration tools, data modeling, and problem-solving are essential. Additionally, fluency in Spanish and English, Salesforce certifications, experience with DevOps practices, exposure to sectors like finance, healthcare, retail, or technology, familiarity with Salesforce CPQ and Agile methodologies, and additional relevant experience are highly desirable.

If this opportunity aligns with your expertise and interests, please reach out to Ankur Sharma by sharing your resume at ankur.sharma@ubconsulting.in or contact him at 9001513258 (Sr. HR - Talent Acquisition) at Unlock Business Consulting India.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Catalogue Technical Specialist/Analyst, you will support the technical integration of Alation into other enterprise systems that may hold critical data lineage or data quality information. Your daily tasks will include providing operational support for the integration catalog, managing Active Directory roles, maintaining integrations, and enhancing capabilities. Collaborating with integration owners, you will identify, define, and capture integration and other key data within the integration catalog, optimize integration descriptions, keywords, and categories for effective search and discovery, and resolve integration catalog issues promptly.

Maintaining the data catalog so that metadata for all data assets stays accurate and up to date will be a crucial part of your responsibilities. You will establish and enforce data quality standards and guidelines across the organization, conducting regular data quality assessments and audits to address any data issues that arise. Additionally, you will act as a point of contact for data catalog-related inquiries, providing timely resolutions, and will generate reports and dashboards that offer insight into data catalog usage, data quality, and metadata completeness. You will also monitor and analyze data quality metrics to identify and resolve anomalies or discrepancies, and analyze metadata to identify trends, patterns, and areas for improvement.

You should be a self-directed individual with over 3 years of experience in integration, data and analytics, or related roles. Experience in integration design and development on iPaaS platforms, and with a DevOps and CI/CD approach to integration deployment, is preferred. Hands-on experience with catalog tools such as Alation (or similar), and with integrating into the ServiceNow platform, will be advantageous.

Moreover, proven experience implementing and managing data lineage, catalog, or other such solutions in complex enterprise environments, along with expertise in databases, business intelligence tools, and ETL tools, will be valuable. A strong understanding of data catalogs and their capabilities, including data dictionaries, business glossaries, business lineage, technical lineage, and data management workflows, is essential for this role, as is an understanding of multiple system integrations, data flow, and data schema changes.

In summary, as a Catalogue Technical Specialist/Analyst, you will play a pivotal role in ensuring the smooth integration of Alation into various enterprise systems, maintaining data quality standards, and supporting data catalog activities that strengthen organizational data management practices.
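The metadata-completeness reporting this role describes can be reduced to a small scoring function. The sketch below is illustrative only: the required fields ("description", "owner", "steward", "tags") are assumptions for the example, not Alation's actual schema.

```python
# Hypothetical sketch: score metadata completeness per catalog asset.
# REQUIRED_FIELDS is an assumption for illustration, not Alation's schema.
REQUIRED_FIELDS = ["description", "owner", "steward", "tags"]

def completeness(asset: dict) -> float:
    """Fraction of required metadata fields that are populated (non-empty)."""
    filled = sum(1 for f in REQUIRED_FIELDS if asset.get(f))
    return filled / len(REQUIRED_FIELDS)

assets = [
    {"name": "sales_orders", "description": "Daily orders", "owner": "BI",
     "steward": "a.kumar", "tags": ["sales"]},
    {"name": "legacy_dump", "description": "", "owner": "ETL"},
]

# One row per asset, ready to feed a completeness dashboard.
report = {a["name"]: completeness(a) for a in assets}
# sales_orders -> 1.0, legacy_dump -> 0.25
```

A real implementation would pull assets from the catalog's API rather than an in-memory list, but the scoring logic stays the same.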

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

The ideal candidate will possess the ability to lead in both technical and business-oriented conversations, delivering various Salesforce Einstein Analytics solutions for clients in the US. This role involves the delivery of strategic analytics data solutions and reporting on Salesforce Einstein, requiring monitoring, analyzing data and trends, and effectively communicating business results to clients. Additionally, the candidate will collaborate on strategic initiatives to ensure optimal outcomes.

Key Competencies:
- Excellent communication skills for client interaction in the US.
- 5+ years of consulting or analytics implementation experience.
- 5+ years of Salesforce platform experience.
- 2+ full life cycle projects with expertise in Salesforce Einstein implementations.
- 2+ years of Salesforce Einstein Analytics experience.
- Proficiency in SQL, SAQL, SOQL, and JSON.
- Familiarity with BI tools (Domo, Qlik, MicroStrategy, Tableau, etc.) and ETL tools (Informatica, etc.).
- Einstein Analytics and Discovery Consultant certification.
- Bachelor's degree in technology, engineering, or a related discipline.

Key Responsibilities:
- Collaborate with agile, matrixed teams at US clients to build Salesforce Einstein solutions supporting business requirements.
- Provide expertise in the Einstein Analytics application, dataset management, security, and visualization/dashboard creation.
- Deliver strategic reporting and AI solutions on Salesforce Einstein.
- Stay updated on new features of the tool, recommend enhancements, and develop Einstein Analytics applications.
- Offer guidance on implementation best practices and perform custom development and integrations.
- Assist with technical design and documentation, and drive new technologies like AI and Smart Data Discovery.
- Develop knowledge artifacts on Salesforce Einstein and maintain effective communication within the team.
- Participate actively in technical discussions and promote a culture of observability for platform performance and stability.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

Citrin Cooperman offers a dynamic work environment that fosters professional growth and collaboration. We are continuously seeking talented individuals who bring fresh perspectives, a problem-solving mindset, and sharp technical expertise. Our team of collaborative, innovative professionals is ready to support your professional development. At Citrin Cooperman, we offer competitive compensation and benefits, along with the flexibility to manage your personal and professional life to focus on what matters most to you! As a Financial System Data Integration Senior at Citrin Cooperman, you will play a vital role in supporting the design and development of integrations for clients within Workiva's cloud-based information management platform. Working closely with Citrin Cooperman's Integration Manager, you will be responsible for driving project execution, translating strategic target architecture and business needs into executable designs, and technical system solutions. Your contributions will shape the future of how our clients utilize Workiva's platform to achieve success. 
Key responsibilities of the role include:
- Analyzing requirements to identify optimal use of existing software functionalities for automation solutions
- Crafting scalable, flexible, and resilient architectures to address clients' business problems
- Supporting end-to-end projects to ensure alignment with the original design and objectives
- Creating data tables, queries (SQL), ETL logic, and API connections between client source systems and the software platform
- Developing technical documentation and identifying technical risks associated with application development
- Acting as a visionary in data integration and driving connected data solutions for clients
- Providing architectural guidance and recommendations to promote successful technology partner engagements
- Mentoring and training colleagues and clients
- Communicating extensively with clients to manage expectations and report on project status

Required Qualifications:
- Bachelor's degree in Computer Science, IT, Management IS, or similar with a minimum of 4 years of experience, OR at least 7 years of experience without a degree
- Proven ability to lead enterprise-level integration strategy discussions
- Expertise with API connectors in ERP solutions such as SAP, Oracle, NetSuite, etc.
- Intermediate proficiency with Python, SQL, JSON, and/or REST
- Professional experience with database design, ETL tools, multidimensional reporting software, data warehousing, dashboards, and Excel
- Experience identifying obstacles, managing multiple work streams, and communicating effectively with technical and non-technical stakeholders

Preferred Qualifications:
- Experience with Workiva's platform
- Understanding of accounting activities
- Project management experience and leadership skills
- Participation in business development activities
- Experience mentoring and training others

At Citrin Cooperman, we are committed to providing exceptional service to clients and acting as positive brand ambassadors. Join us in driving innovation, shaping the future of data integration, and making a meaningful impact on our clients' success.
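The "API connections between client source systems and the software platform" work typically starts with mapping source fields to the target payload shape. The sketch below shows that mapping step in plain Python; the ERP column names, target field names, and payload shape are all illustrative assumptions, not Workiva's or any ERP's actual schema.

```python
import json

# Hypothetical mapping of an ERP trial-balance row into a JSON payload for a
# target platform API. FIELD_MAP and all field names are assumptions for
# illustration only.
FIELD_MAP = {"ACCT_NO": "accountId", "ACCT_DESC": "accountName", "BAL": "balance"}

def to_payload(erp_row: dict) -> str:
    # Keep only mapped columns; unmapped source fields are dropped.
    body = {FIELD_MAP[k]: v for k, v in erp_row.items() if k in FIELD_MAP}
    body["balance"] = round(float(body["balance"]), 2)
    return json.dumps(body, sort_keys=True)

payload = to_payload(
    {"ACCT_NO": "1000", "ACCT_DESC": "Cash", "BAL": "12500.456", "SRC": "SAP"}
)
# {"accountId": "1000", "accountName": "Cash", "balance": 12500.46}
```

In a real integration, this payload would be posted to the platform's documented endpoint with proper authentication; keeping the mapping in one table makes it easy to review with the client.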

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The Data Migration Developer is responsible for executing and managing data migration projects within Salesforce environments. This role requires expertise in data extraction, transformation, and loading (ETL) processes, with a strong focus on leveraging Informatica tools. The specialist will ensure the accurate, secure, and efficient migration of data while customizing Salesforce to align with business processes and objectives.

You should possess 3-4+ years of experience in database migration, with a focus on Salesforce applications and handling sensitive data. Proficiency with ETL tools such as Informatica (PowerCenter, Informatica Cloud), Boomi, or similar is required, as is experience with Salesforce data import/export tools, SQL, ETL processes, and data integration methodologies. Expertise in data migration tools and techniques, along with familiarity with Salesforce APIs and integration methods, is crucial.

You will be responsible for migrating and integrating data from different platforms into Salesforce, preparing data migration plans, handling kickouts/fallouts, and developing procedures and scripts for data migration. Additionally, you will develop, implement, and optimize stored procedures and functions using T-SQL, and perform SQL database partitioning and indexing to handle heavy traffic loads. A solid understanding of Salesforce architecture and objects such as accounts, contacts, cases, custom objects, fields, and restrictions is necessary, as is the ability to create fast, efficient database queries (including multi-table joins) and a good knowledge of SQL optimization techniques.

Experience in designing, creating, and maintaining databases, familiarity with MuleSoft, Boomi, or similar integration platforms, and experience automating processes within Salesforce are desirable qualifications. Preferred qualifications include Salesforce Certified Administrator, Salesforce Certified Platform Developer I or II, and relevant certifications in data management, migration, or related areas.
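The "handling kickouts/fallouts" step mentioned above usually means validating each record before load and quarantining failures with a reason, so they can be fixed and replayed. A minimal sketch, assuming illustrative field names rather than any real Salesforce object schema:

```python
# Illustrative kickout/fallout handling for a migration batch: records missing
# required fields are quarantined with a reason instead of being loaded.
# The REQUIRED field names are assumptions for this example.
REQUIRED = ("external_id", "last_name", "email")

def split_batch(records):
    loadable, fallouts = [], []
    for rec in records:
        missing = [f for f in REQUIRED if not rec.get(f)]
        if missing:
            fallouts.append({"record": rec, "reason": f"missing: {', '.join(missing)}"})
        else:
            loadable.append(rec)
    return loadable, fallouts

batch = [
    {"external_id": "C-1", "last_name": "Rao", "email": "rao@example.com"},
    {"external_id": "C-2", "last_name": ""},
]
ok, failed = split_batch(batch)
# 1 loadable record; 1 fallout with reason "missing: last_name, email"
```

The same pattern extends naturally to post-load fallouts: errors returned by the target's bulk load are appended to the quarantine list with the target's error message as the reason.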

Posted 1 week ago

Apply

2.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

As an experienced IT professional with a passion for data and technology, you will ensure that data accurately reflects business requirements and targets. Collaborating closely with the Procurement & Logistics department and external providers in an agile environment, you will leverage your deep understanding of the technology stack to facilitate engagements and remove impediments, delivering data use cases that drive business value and contribute to the vision of becoming a data-driven company.

You will play a crucial role in the energy transformation within the Siemens Energy ABP Procurement team, working alongside a diverse team of innovative and hardworking data enthusiasts and AI professionals. Your responsibilities include service operation and end-to-end delivery management; interacting with business users and key collaborators; developing and maintaining data architecture and governance standards; designing optimized data architecture frameworks; providing guidance to developers; ensuring data quality; and collaborating with various functions to translate user requirements into technical specifications.

To excel in this role, you should bring 8 to 10 years of IT experience with a focus on ETL tools and platforms, and proficiency in Snowflake SQL scripting, JavaScript, PL/SQL, and data modeling for relational databases. Experience in data warehousing, data migration, building data pipelines, and working with AWS, Azure, and GCP data services is essential. Familiarity with Qlik and Power BI, and a degree in computer science or IT, are preferred. Strong English skills, intercultural communication abilities, and a background in international collaboration are also key requirements.

Joining the Value Center ERP team at Siemens Energy, you will be part of a dynamic group dedicated to driving digital transformation in manufacturing and contributing to the achievement of Siemens Energy's objectives. This role offers the opportunity to work on innovative projects that have a substantial impact on the business and industry, enabling you to be part of the energy transition and the future of sustainable energy solutions.

Siemens Energy is a global leader in energy technology, with a commitment to sustainability and innovation. With a diverse team of over 100,000 employees worldwide, the company is dedicated to meeting the energy demands of the future in a reliable and sustainable manner. By joining Siemens Energy, you will contribute to the development of energy systems that drive the energy transition and shape the future of electricity generation. Diversity and inclusion are at the core of Siemens Energy's values, celebrating uniqueness and creativity across more than 130 nationalities. The company provides employees with benefits such as medical insurance and meal card options, supporting a healthy work-life balance and overall well-being. If you are ready to make a difference in the energy sector and be part of a global team committed to sustainable energy solutions, Siemens Energy offers a rewarding and impactful career opportunity.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, and candidates with Databricks certification are preferred.

Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes.
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
- Bachelor's degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. IQVIA creates intelligent connections to accelerate the development and commercialization of innovative medical treatments, helping improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com.
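The MDM work this posting touches on (Reltio or similar) centers on collapsing duplicate source records into one "golden record". A minimal survivorship-style sketch, where the match key (`npi`) and field names are illustrative assumptions, not Reltio's actual model:

```python
from datetime import date

# Minimal survivorship-style merge in the spirit of MDM tools: duplicates
# sharing a match key collapse into one golden record, preferring the most
# recently updated non-empty value for each field.
def golden_record(duplicates):
    merged = {}
    for rec in sorted(duplicates, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if value not in (None, ""):
                merged[field] = value  # newer records overwrite older values
    return merged

dupes = [
    {"npi": "123", "name": "Dr. A. Shah", "city": "Pune", "updated": date(2023, 1, 5)},
    {"npi": "123", "name": "Anita Shah", "city": "", "updated": date(2024, 6, 1)},
]
g = golden_record(dupes)
# name comes from the newer record; city survives from the older one
```

Real MDM platforms add configurable match rules and per-attribute survivorship policies, but "most recent non-empty wins" is a common default to reason from.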

Posted 1 week ago

Apply

5.0 - 15.0 years

0 Lacs

karnataka

On-site

The role of Talend Developer and Architect at our company involves designing, developing, testing, and deploying integration processes using Talend. Your responsibilities will include collaborating with team members to understand requirements, coding, debugging, and optimizing code for performance, as well as maintaining documentation for processes and contributing to technology-improvement initiatives. As a Talend Developer and Architect, you will design and develop robust data integration solutions using Talend Studio to meet business requirements. You will also implement data governance frameworks and policies, configure Talend Data Catalog, manage metadata repositories, data quality rules, and data dictionaries, and optimize data pipelines for performance and scalability. To excel in this role, you should have a background in computer science, proficiency in back-end web development and software development, strong programming skills with an emphasis on object-oriented programming (OOP), and experience with ETL tools, particularly Talend. Excellent analytical and problem-solving skills, along with good communication and teamwork abilities, are essential. A Bachelor's degree in Computer Science, Information Technology, or a related field is required. You will work closely with data stewards, business analysts, data engineers, data scientists, and business stakeholders to understand and fulfill data integration requirements. If you are looking for a challenging opportunity to showcase your skills and contribute to the success of our organization, this role is perfect for you.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

The role requires you to understand the business functionalities and the technical/database landscape of the applications under i360. You will collaborate with database engineers and data analysts to understand requirements and testing needs. Building and maintaining a test automation framework will be a crucial part of your responsibilities, including creating robust, maintainable test cases covering various ETL processes, database systems, and data analyses. Implementing quality engineering strategies, best practices, and guidelines to ensure scalability, reusability, and maintainability is an essential aspect of the role.

As part of the position, you will be expected to identify, replicate, and report defects, as well as verify defect fixes. Data accuracy, completeness, and consistency will be validated using ETL tools and SQL queries. Being proactive, adaptable to change, and a strong communicator (both verbal and written) are key attributes for success in this role.

Expertise in database-related testing and ETL testing is necessary, along with strong Python programming skills and proficiency in SQL and ETL-testing libraries such as pandas and Great Expectations. Experience working with databases such as Redshift, Elasticsearch, OpenSearch, Postgres, and Snowflake is required. Additionally, familiarity with analyzing population data and demographics, version control using GitLab, pipeline integration, and working under pressure with strong attention to detail are essential qualities for this position. The role also calls for motivation to contribute, good verbal and written communication skills, mentorship and knowledge sharing, experience with Jira, knowledge of Agile methodology, and hands-on DevOps experience such as GitLab CI/CD pipelines. If you possess strong analytical, problem-solving, and troubleshooting skills and stay updated on current market trends, this position might be suitable for you.

This is a contractual/temporary job with a day-shift schedule and an in-person work location. To apply for this position, please send your resume to gopi@nithminds.com.
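Validating accuracy, completeness, and consistency with SQL queries, as described above, usually boils down to a handful of assertions run against the target database. A minimal sketch using Python's built-in `sqlite3` as a stand-in for Redshift/Postgres/Snowflake; the table and column names are illustrative:

```python
import sqlite3

# Sketch of SQL-based ETL test checks: row counts, nulls in a required
# column, and duplicate keys. Table/column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE voters (voter_id TEXT, state TEXT);
    INSERT INTO voters VALUES ('V1', 'TN'), ('V2', NULL), ('V2', 'KA');
""")

def one(sql):
    """Run a scalar query and return its single value."""
    return conn.execute(sql).fetchone()[0]

row_count = one("SELECT COUNT(*) FROM voters")
null_states = one("SELECT COUNT(*) FROM voters WHERE state IS NULL")
dupe_keys = one("""SELECT COUNT(*) FROM
    (SELECT voter_id FROM voters GROUP BY voter_id HAVING COUNT(*) > 1)""")
# row_count=3, null_states=1, dupe_keys=1 -> the last two would fail the build
```

In an automated framework these values become pytest assertions (or Great Expectations expectations) so a non-zero null or duplicate count fails the pipeline.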

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

This is a full-time Data Engineer position with D Square Consulting Services Pvt Ltd, based pan-India with a hybrid work model. You should have at least 5 years of experience and be able to join immediately. As a Data Engineer, you will design, build, and scale data pipelines and backend services supporting analytics and business intelligence platforms. A strong technical foundation, Python expertise, API development experience, and familiarity with containerized, CI/CD-driven workflows are essential for this role.

Your key responsibilities will include designing, implementing, and optimizing data pipelines and ETL workflows using Python tools; building RESTful and/or GraphQL APIs; collaborating with cross-functional teams; containerizing data services with Docker and managing deployments with Kubernetes; developing CI/CD pipelines using GitHub Actions; ensuring code quality; and optimizing data access and transformation.

The required skills and qualifications include a Bachelor's or Master's degree in Computer Science or a related field; 5+ years of hands-on experience in data engineering or backend development; expert-level Python skills; experience building APIs with frameworks like FastAPI, Graphene, or Strawberry; proficiency in Docker, Kubernetes, SQL, and data modeling; good communication skills; familiarity with data orchestration tools; experience with streaming data platforms like Kafka or Spark; and knowledge of data governance, security, and observability best practices, with exposure to cloud platforms like AWS, GCP, or Azure.

If you are proactive, self-driven, and possess the required technical skills, this Data Engineer position is an exciting opportunity to contribute to the development of cutting-edge data solutions at D Square Consulting Services Pvt Ltd.
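The "data pipelines and ETL workflows using Python tools" responsibility is often structured as composable generator stages, so records stream through extract, transform, and load without materializing the whole dataset. A minimal sketch with illustrative stage names and record shape:

```python
# Minimal generator-based ETL pipeline: each stage streams records to the
# next. The record shape ({"email": ...}) is an assumption for illustration.
def extract(rows):
    yield from rows  # stand-in for reading from a source system

def transform(records):
    for rec in records:
        # Normalise emails as an example transformation step.
        yield {**rec, "email": rec["email"].strip().lower()}

def load(records, sink):
    n = 0
    for rec in records:
        sink.append(rec)  # stand-in for a bulk insert into the target
        n += 1
    return n

sink = []
loaded = load(transform(extract([{"email": "  A@X.COM "}, {"email": "b@y.com"}])), sink)
# loaded == 2; sink[0]["email"] == "a@x.com"
```

The same shape maps directly onto orchestrators (each stage becomes a task) and keeps memory flat for large batches, since only one record is in flight per stage at a time.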

Posted 1 week ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Description: Senior Data Architect (Contract)

Company: Emperen Technologies
Location: Remote; Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Type: Contract (8-12 months)
Experience: 8-12 years

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully among bright people.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

NTT DATA is looking for a SQL Developer to join its team in Bangalore, Karnataka, India. As a SQL Developer, you will be responsible for developing complex queries, extracting data using ETL tools, and cleansing and validating both structured and unstructured data. Additionally, you will be involved in creating insurance reports, visualizations, and dashboards, and conducting analysis and analytics with a focus on life insurance. The ideal candidate should have strong proficiency in SQL along with knowledge of tools like EXL, R, and Python. NTT DATA is a global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. The company serves 75% of the Fortune Global 100 and has experts in more than 50 countries. If you are a passionate individual with strong SQL skills and a background in life insurance, this is a great opportunity to be part of an inclusive and forward-thinking organization. Apply now to grow with NTT DATA and contribute to its mission of driving innovation and digital transformation.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

panchkula, haryana

On-site

We are seeking a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our dynamic data engineering team. As a Lead/Sr. ETL Engineer, you will play a crucial role in designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. Your expertise in ETL tools, cloud platforms, scripting, and data modeling principles will be pivotal in building efficient, scalable, and reliable data solutions for enterprise-level implementations.

Key Skills:
- Proficiency in ETL tools such as SSIS, DataStage, Informatica, or Talend.
- In-depth understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, and Fact & Dimension tables.
- Strong experience with relational databases like SQL Server, Oracle, Teradata, DB2, or MySQL.
- Solid scripting/programming skills in Python.
- Hands-on experience with cloud platforms like AWS or Azure.
- Knowledge of middleware architecture and enterprise data integration strategies.
- Familiarity with reporting/BI tools such as Tableau and Power BI.
- Ability to write and review high- and low-level design documents.
- Excellent communication skills and the ability to work effectively with cross-cultural, distributed teams.

Roles and Responsibilities:
- Design and develop ETL workflows and data integration strategies.
- Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions.
- Coach and mentor junior engineers to support skill development and performance.
- Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
- Participate in planning, estimations, and recruitment activities.
- Work on multiple projects simultaneously, ensuring quality and consistency in delivery.
- Experience in Sales and Marketing data domains.
- Strong problem-solving abilities with a data-driven mindset.
- Ability to work independently and collaboratively in a fast-paced environment.
- Prior experience in global implementations and managing multi-location teams is a plus.

If you are a passionate Lead/Sr. ETL Engineer looking to make a significant impact in a dynamic environment, we encourage you to apply for this exciting opportunity. Thank you for considering a career with us; we look forward to receiving your application! For further inquiries, please contact us at careers@grazitti.com.

Location: Panchkula, India
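The Star/Snowflake schema knowledge listed above comes down to one mechanical step: assigning surrogate keys to dimension rows and recording facts against those keys. A hedged sketch in plain Python with illustrative column names (a real implementation lives in the warehouse or ETL tool, not application code):

```python
# Minimal star-schema load: insert-or-lookup on a dimension table with
# surrogate keys, then record facts against those keys. Column names are
# assumptions for illustration.
def get_surrogate_key(dim, natural_key, attrs):
    """Return the surrogate key for natural_key, inserting the row if new."""
    if natural_key not in dim:
        dim[natural_key] = {"sk": len(dim) + 1, **attrs}
    return dim[natural_key]["sk"]

dim_product, fact_sales = {}, []
for sale in [
    {"sku": "P-10", "name": "Widget", "qty": 4},
    {"sku": "P-10", "name": "Widget", "qty": 2},
]:
    sk = get_surrogate_key(dim_product, sale["sku"], {"name": sale["name"]})
    fact_sales.append({"product_sk": sk, "qty": sale["qty"]})
# one dimension row (sk=1) referenced by two fact rows
```

Keeping the natural key (`sku`) out of the fact table and joining on the surrogate key is what lets dimension attributes change over time without rewriting facts.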

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

At Improzo, we are dedicated to improving life by empowering our customers through quality-led commercial analytical solutions. Our team of experts in commercial data, technology, and operations collaborates to shape the future and work with leading life sciences clients. We prioritize customer success and outcomes, embrace agility and innovation, foster respect and collaboration, and are laser-focused on quality-led execution.

As a Data and Reporting Developer (Improzo Level - Associate) at Improzo, you will play a crucial role in designing, developing, and maintaining large-scale data processing systems using big data technologies. You will collaborate with data architects and stakeholders to implement data storage solutions, develop ETL pipelines, integrate various data sources, design and build reports, optimize performance, and ensure seamless data flow.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and big data applications using distributed processing frameworks.
- Collaborate on data architecture, storage solutions, ETL pipelines, data lakes, and data warehousing.
- Integrate data sources into the big data ecosystem while maintaining data quality.
- Design and build reports using tools like Power BI, Tableau, and MicroStrategy.
- Optimize workflows and queries for high performance and scalability.
- Collaborate with cross-functional teams to deliver data solutions that meet business requirements.
- Perform testing, quality assurance, and documentation of data pipelines.
- Participate in agile development processes and stay up to date with big data technologies.

Qualifications:
- Bachelor's or master's degree in a quantitative field.
- 1.5+ years of experience in data management or reporting projects with big data technologies.
- Hands-on experience or thorough training in AWS, Azure, GCP, Databricks, and Spark.
- Experience in a pharma commercial setting or pharma data management is advantageous.
- Proficiency in Python, SQL, MDM, Tableau, Power BI, and other tools.
- Excellent communication, presentation, and interpersonal skills.
- Attention to detail, quality, and client centricity.
- Ability to work independently and as part of a cross-functional team.

Benefits:
- Competitive salary and benefits package.
- Opportunity to work on cutting-edge tech projects in the life sciences industry.
- Collaborative and supportive work environment.
- Opportunities for professional development and growth.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies