3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Data Engineering & Pipeline Development: Design, implement, and maintain ETL processes using ADF and ADB. Create and manage views in ADB and SQL for efficient data access. Optimize SQL queries for large datasets and high performance. Conduct end-to-end testing and impact analysis on data pipelines.
Optimization & Performance Tuning: Identify and resolve bottlenecks in data processing. Optimize SQL queries and Delta Tables for fast data processing.
Data Sharing & Integration: Implement Delta Share, SQL Endpoints, and other data sharing methods. Use Delta Tables for efficient data sharing and processing.
API Integration & Development: Integrate external systems through Databricks Notebooks and build scalable solutions. Experience in building APIs (good to have).
Collaboration & Documentation: Collaborate with teams to understand requirements and design solutions. Provide documentation for data processes and architectures.
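For a concrete flavour of the Delta Lake work this posting describes, here is a minimal PySpark sketch of view creation and table optimization. It assumes a Databricks runtime; the table and column names are hypothetical placeholders.

```python
# Minimal sketch of view creation and Delta table optimization on Databricks.
# Assumes a Databricks runtime; the `sales` table and its columns are
# hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a curated view for efficient downstream data access.
spark.sql("""
    CREATE OR REPLACE VIEW sales_daily_v AS
    SELECT order_date, region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY order_date, region
""")

# Compact small files and co-locate rows on a frequent filter column to
# speed up queries over large datasets (Databricks-specific OPTIMIZE).
spark.sql("OPTIMIZE sales ZORDER BY (region)")
```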
Posted 1 week ago
3.0 - 7.0 years
9 - 14 Lacs
Bengaluru
Work from Office
3-4+ years of overall industry experience in designing, developing, and deploying ETL solutions using industry-standard ETL tools. 1+ years of hands-on experience in developing and productionizing solutions with Talend Data Integration. Extensive experience in designing end-to-end transformations and workflows using Talend Data Integration, as per requirement specifications. Good communication skills in spoken and written English.
Posted 1 week ago
2.0 - 7.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
CirrusLabs Private Limited is looking for a DWH ETL Developer to join our dynamic team and embark on a rewarding career journey. Consulting with data management teams to get a big-picture idea of the company's data storage needs. Presenting the company with warehousing options based on their storage needs. Designing and coding the data warehousing system to desired company specifications. Conducting preliminary testing of the warehousing environment before data is extracted. Extracting company data and transferring it into the new warehousing environment. Testing the new storage system once all the data has been transferred. Troubleshooting any issues that may arise. Providing maintenance support.
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dear Associate, Greetings from Tata Consultancy Services! Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for ETL Test Engineer at Tata Consultancy Services.
Hiring for: ETL Test Engineer
Interview date: 14-June-25 (in-person drive)
Location: Bangalore
Experience: 4-10 years
Must have:
1. SQL - Expert-level knowledge of core SQL concepts and querying.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems.
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts - OLAP vs. OLTP - and deploying applications on cloud servers.
9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
13. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
14. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
If you are interested in this exciting opportunity, please share your updated resume at saranya.devi3@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email ID:
Highest Qualification:
University/Institute Name:
Current Organization:
Willing to relocate to Bangalore:
Total Experience:
Relevant Experience (ETL Test Engineer):
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Available for in-person interview on 14-June-25:
Timings:
Attended an interview with TCS in the past (details):
Please share your iBegin portal EP ID if already registered:
Note: Only eligible candidates with relevant experience will be contacted further.
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Engineer
Experience range: 4-10 years
Location: Current location must be Bangalore ONLY
NOTE: Candidates interested in a walk-in drive in Bangalore must apply.
Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and querying.
3. ETL automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs. OLTP - and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
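For illustration, the source-to-target validation at the heart of the ETL testing postings above often reduces to checks like this minimal sketch. The pyodbc driver, DSNs, and table names are assumptions for illustration, not part of the posting.

```python
# Minimal sketch of an ETL reconciliation test: compare row counts and a
# column checksum between source and target. DSNs and table names are
# hypothetical placeholders.
import pyodbc

def fetch_one(conn_str, query):
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(query).fetchone()[0]

SRC = "DSN=source_dw"   # hypothetical source connection
TGT = "DSN=target_dw"   # hypothetical target connection

src_count = fetch_one(SRC, "SELECT COUNT(*) FROM stg.orders")
tgt_count = fetch_one(TGT, "SELECT COUNT(*) FROM dw.fact_orders")
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

src_sum = fetch_one(SRC, "SELECT SUM(order_amount) FROM stg.orders")
tgt_sum = fetch_one(TGT, "SELECT SUM(order_amount) FROM dw.fact_orders")
assert src_sum == tgt_sum, "Amount checksum mismatch between source and target"
```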
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: We are seeking an experienced Database Developer with strong expertise in Relational Database Management Systems (RDBMS), particularly Oracle, and in writing complex stored procedures, triggers, and functions. You will work closely with cross-functional teams to design, develop, optimize, and maintain scalable and efficient database solutions.
Key Responsibilities: Design, develop, and implement database structures and solutions for high-performance data processing and reporting. Work with Oracle RDBMS to write and optimize complex SQL queries, stored procedures, triggers, and functions. Apply basic knowledge of Talend to ensure efficient data integration, transformation, and loading. Collaborate with data architects and business stakeholders to translate requirements into technical solutions. Design, implement, and maintain complex database structures, ensuring consistency, reliability, and high availability. Troubleshoot database issues, including performance, security, and availability, and take necessary corrective actions. Perform database tuning to optimize the performance of queries, indexes, and system resources. Maintain data integrity and support data security protocols in line with industry best practices. Develop and manage database migration strategies, ensuring smooth data transitions between systems. Document and standardize coding practices, procedures, and database workflows. Monitor database system performance and create reports for operational monitoring and optimization. Collaborate with software development teams to ensure that database solutions align with application architecture and system requirements.
Skills and Qualifications: 6 years of hands-on experience working with an RDBMS such as Oracle. Proficient in writing and optimizing SQL queries, stored procedures, triggers, and functions in Oracle. Strong experience in database design, including normalization, indexing, and partitioning for performance optimization. Experience with Oracle PL/SQL and database tuning to improve query performance. Familiarity with database replication, data migrations, and backup and recovery strategies. Understanding of data security protocols and compliance standards (e.g., GDPR, HIPAA). Ability to troubleshoot complex database issues related to performance, integrity, and security. Strong analytical and problem-solving skills, with the ability to handle complex data challenges. Excellent communication skills and the ability to work well with both technical and non-technical teams. Familiarity with database administration concepts and monitoring tools.
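As a small illustration of the Oracle stored-procedure work this role centres on, the sketch below calls a procedure from Python using the python-oracledb driver. The credentials, DSN, and procedure name are hypothetical placeholders.

```python
# Minimal sketch of invoking an Oracle stored procedure from Python with
# the python-oracledb driver. Credentials, DSN, and the procedure name
# `refresh_sales_summary` are hypothetical placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="***", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    # DB-API callproc runs the procedure with positional arguments.
    cur.callproc("refresh_sales_summary", ["2024-Q4"])
conn.commit()
conn.close()
```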
Must Have: End-to-end Web Analytics implementation project activation. Defining the technical implementation and data layer architecture during tag implementation. Integrating other solutions like consent management (OneTrust), ObservePoint, and ETL tools (Alteryx) with the Google Analytics platform. Gathering technical requirements from the client and creating documentation like the SDR, tech spec, and MRD. Ability to plan and implement methods to measure experiences, including Tag Management Solutions like Tealium iQ (primarily), Adobe Launch, Adobe Analytics, Dynamic Tag Manager, Ensighten, and Google Tag Manager. Understand and use a multitude of tag managers and write JavaScript code to realize client-driven business requirements. Responsible for site optimization, with an ability to design solutions and implement the analytics strategy and technology needed to gain and stitch together insights into both online and physical location activity. Experienced in marketing performance analysis, i.e., data aggregation (leveraging marketing & click-stream APIs, data cleaning & transformation), analysis & segmentation, targeting & integration. Experienced in A/B testing and MVT/optimization frameworks using tools like Adobe Target. Develop the strategy for enterprise-level solutions as well as architecting extensible and maintainable solutions utilizing the Adobe and Google analytics platforms. Excellent understanding of digital analytics, especially clickstream data. Ability to create data visualization dashboards, especially in Workspace, Data Studio, MS Excel, and Adobe Report Builder. Agile method understanding.
About you:
Analytics Platforms - Google Analytics, Adobe Analytics/Omniture SiteCatalyst, Matomo/Piwik
Tag Managers - Adobe Launch/DTM, Tealium iQ, Google Tag Manager, Piwik Pro, Signal/BrightTag
Optimization Platforms - Adobe Target, Google Optimize, Optimizely
1+ years in a client-facing role for solutioning and/or evangelizing technology approaches.
Programming Languages - JavaScript, jQuery
Markup Languages - HTML, CSS
Good to have
EQUAL OPPORTUNITY: Indegene is proud to be an Equal Employment Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristic. All employment decisions, from hiring to separation, will be based on business requirements and the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic.
Locations - Bangalore, KA, IN
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role and Responsibilities: We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.
Preferred Education: Master's Degree
Required Technical and Professional Expertise: 4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics. Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web. Proven experience in building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts). Proficiency in Python programming for automation, data processing, and system integration. Experience with REST APIs, JSON/XML data formats, and data extraction from external sources.
Preferred Technical and Professional Experience: Strong SQL knowledge and ability to work with relational databases. Familiarity with Agile methodologies and version control systems (e.g., Git). Excellent analytical, problem-solving, and communication skills.
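For a flavour of the Python/REST integration work this role describes, here is a minimal extraction sketch. The TM1-style endpoint URL, port, and response shape are assumptions for illustration, not a documented contract.

```python
# Minimal sketch of pulling data from a REST endpoint and reshaping the JSON
# for downstream loading -- the kind of Python/REST integration the role
# describes. The TM1-style URL and payload shape are illustrative assumptions.
import requests

BASE = "https://tm1-server:8010/api/v1"   # hypothetical host and port
resp = requests.get(
    f"{BASE}/Dimensions('Account')/Hierarchies('Account')/Elements",
    auth=("admin", "***"), verify=False, timeout=30,
)
resp.raise_for_status()

# OData-style responses wrap results in a "value" array (assumed here).
elements = [e["Name"] for e in resp.json().get("value", [])]
print(f"Extracted {len(elements)} account elements")
```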
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for a Senior Developer to become a key player in our Bangalore team. If you're a seasoned professional ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.
This position will be part of Sapiens' Digital (Data Suite) division; for more information about it, click here: https://sapiens.com/solutions/digitalsuite-customer-experience-and-engagement-software-for-insurers/
Job Description
What you'll do: Collaborate with business users to understand and refine ETL requirements and business rules for effective solution implementation. Design, develop, implement, and optimize ETL processes to meet business and technical needs. Troubleshoot and resolve ETL-related issues, ensuring system performance and reliability. Create and execute comprehensive unit test plans based on system and validation requirements to ensure the quality of the solutions. Provide ongoing support and consultation for the development and enhancement of technical solutions across various business functions.
Primary Skills - What to have for this position: Strong understanding of advanced ETL concepts, as well as the administration activities required to support R&D and project needs. Extensive experience with ETL tools and advanced transformations, particularly Talend and Java. Ability to effectively troubleshoot and resolve complex ETL coding and administrative issues.
Secondary Skills: Experience in designing and developing fully interactive dashboards, including storylines, drill-down functionality, and linked visualizations. Ability to design and optimize tables, views, and DataMarts to support dynamic and efficient dashboards. Proficient in proposing and implementing data load strategies that enhance performance and improve data visualizations. Expertise in performance tuning for SQL, ETL processes, and reports.
Process Knowledge: Experience in data validation and working with cross-functional teams (including Business Analysts and Business Users) to clarify and define business requirements. Ability to develop ETL mappings, specifications (LLDs/HLDs), and data load strategies with minimal supervision. Understanding of SDLC methodologies, including Agile, and familiarity with tools such as JIRA for project management and issue tracking.
Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds.
Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution.
In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com.
Posted 2 weeks ago
5.0 years
5 - 10 Lacs
Bengaluru
On-site
Country/Region: IN Requisition ID: 26145 Work Model: Position Type: Salary Range: Location: INDIA - BENGALURU - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engg
Description: Area(s) of responsibility
o Job Title - Denodo Developer
o No. of Open Positions - 1
o Experience - 5-9 years
o Location: Bangalore, Noida, Chennai, Mumbai, Hyderabad, Pune
o Shift Time - CET (12:30 to 9:30 IST)
Job Description: We are seeking a highly skilled and experienced Denodo Developer with a strong background in ETL processes and deep knowledge of the Life Sciences domain. The ideal candidate will be responsible for developing data virtualization solutions, integrating complex datasets from multiple sources, and enabling real-time data access for analytics and operational reporting. This role requires close collaboration with data architects, data engineers, and business stakeholders in a regulated environment.
Key Proficiency & Responsibilities: Design, develop, and optimize data virtualization solutions using the Denodo Platform. Integrate structured and unstructured data sources into Denodo views and services. Develop custom views, VQL scripts, and data services (REST/SOAP). Build and optimize ETL/ELT pipelines to support data ingestion and transformation. Work closely with Life Sciences business teams to translate domain-specific requirements into data solutions. Implement data governance, security, and compliance practices adhering to GxP and FDA regulations. Provide support for data access, lineage, metadata management, and user training. Collaborate with cross-functional teams in an Agile development environment. Optimize workflows for performance and scalability. Develop and maintain data documentation, including workflow descriptions and data dictionaries. Strong knowledge of data preparation, ETL concepts, and data warehousing. Excellent analytical, problem-solving, and communication skills. Proficient in VQL, JDBC, ODBC, and web services integration. Strong expertise in ETL tools (e.g., Informatica, Talend, DataStage, or Azure Data Factory). Deep understanding of the Life Sciences domain - clinical trials, regulatory data, pharmacovigilance, or research & development.
Preferred Qualifications: B.Tech. or MCA from a recognized university. Minimum 5+ years of relevant experience as a Denodo Developer. Strong SQL and database skills (Oracle, SQL Server, PostgreSQL, etc.). Knowledge of data modelling, data warehousing, and virtual data layers. Experience with cloud platforms (AWS, Azure, or GCP) is a plus. Experience working in Agile/Scrum environments.
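As an illustration of querying a Denodo virtual view, the sketch below uses ODBC from Python; the DSN and view name are hypothetical, and Denodo equally exposes JDBC and REST endpoints. Published views accept plain SQL of this shape.

```python
# Minimal sketch of querying a Denodo virtual view over ODBC. The DSN and
# the integrated view name are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect("DSN=denodo_vdp")   # hypothetical DSN pointing at Denodo VDP
cur = conn.cursor()
cur.execute("""
    SELECT trial_id, site_country, COUNT(*) AS subject_count
    FROM iv_clinical_subjects          -- hypothetical integrated view
    GROUP BY trial_id, site_country
""")
for row in cur.fetchall():
    print(row.trial_id, row.site_country, row.subject_count)
conn.close()
```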
Posted 2 weeks ago
3.0 - 7.0 years
0 - 1 Lacs
Pune, Ahmedabad, Bengaluru
Work from Office
Job Title: Reltio MDM Developer
Location: Remote
Experience Required: 2+ Years
Key Responsibilities: Design, configure, and implement Reltio MDM solutions based on business and technical requirements. Develop and enhance Reltio data models including entities, attributes, relationships, and match/merge rules. Configure survivorship rules, reference data, workflows, and validation rules within the platform. Build seamless integrations between Reltio and external systems using REST APIs, ETL tools (e.g., Informatica, Talend), or middleware solutions (e.g., MuleSoft). Monitor, troubleshoot, and optimize data load and synchronization processes. Support data governance initiatives, including data quality profiling, standardization, and issue resolution. Collaborate with business stakeholders, data stewards, and analysts to refine requirements and address data integrity concerns. Ensure performance tuning and adherence to Reltio best practices for configuration and deployment.
Required Skills: Minimum 2+ years of hands-on experience working with the Reltio Cloud MDM platform. Strong grasp of MDM principles, data modeling concepts, and entity relationship management. Experience configuring Reltio L3, match/merge logic, and survivorship strategies. Proficiency with REST APIs, JSON, and XML for integration and data exchange. Working experience with integration tools like Talend, Informatica, or MuleSoft. Solid debugging and troubleshooting skills related to data quality, transformations, and API communication. Familiarity with data governance frameworks and compliance standards.
Nice to Have: Experience in implementing Reltio UI configurations or custom UI components. Exposure to data analytics and reporting tools. Knowledge of cloud platforms (e.g., AWS, Azure) for hosting or extending MDM functionality. Familiarity with Agile methodologies and tools like JIRA or Confluence.
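For a sense of the REST-based integration work described above, here is a minimal sketch of posting an entity to an MDM tenant. The URL pattern, tenant ID, and payload shape are illustrative assumptions rather than Reltio's documented schema.

```python
# Minimal sketch of creating an entity in an MDM tenant over REST -- the
# kind of Reltio API integration described above. Endpoint, tenant id, and
# payload shape are hypothetical illustrations, not a documented contract.
import requests

TENANT_URL = "https://host.reltio.com/reltio/api/myTenant"   # hypothetical
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

entity = [{
    "type": "configuration/entityTypes/Individual",
    "attributes": {
        "FirstName": [{"value": "Jane"}],
        "LastName": [{"value": "Doe"}],
    },
}]

resp = requests.post(f"{TENANT_URL}/entities", json=entity,
                     headers=HEADERS, timeout=30)
resp.raise_for_status()
print(resp.json())
```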
Posted 2 weeks ago
8.0 - 12.0 years
15 - 27 Lacs
Mumbai, Pune, Bengaluru
Work from Office
Role & responsibilities:
Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have: AWS Databricks. Good to have: PySpark, Snowflake, Talend.
Requirements:
- Candidate must be experienced working in projects involving
- Other ideal qualifications include experiences in
- Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
- Should be very proficient in doing large-scale data operations using Databricks and overall very comfortable using Python
- Familiarity with AWS compute, storage, and IAM concepts
- Experience in working with S3 Data Lake as the storage tier
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
- Cloud warehouse experience (Snowflake, etc.) is a huge plus
- Carefully evaluates alternative risks and solutions before taking action
- Optimizes the use of all available resources
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
Skills:
- Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience in shell scripting
- Exceptionally strong analytical and problem-solving skills
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Excellent collaboration and cross-functional leadership skills
- Excellent communication skills, both written and verbal
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment
- Ability to leverage data assets to respond to complex questions that require timely answers
- Has working knowledge of migrating relational and dimensional databases on the AWS cloud platform
Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.
Note: Need only immediate joiners / candidates serving notice period. Interested candidates can apply.
Regards, HR Manager
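For illustration, a minimal Databricks/PySpark pipeline of the kind this posting describes might look like the sketch below; the S3 paths and column names are hypothetical placeholders.

```python
# Minimal sketch of a Databricks/PySpark pipeline: read from an S3 data
# lake, transform, and write a Delta table. Bucket, paths, and column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.parquet("s3://example-lake/raw/orders/")   # hypothetical path

# Aggregate completed orders into daily revenue per region.
daily = (orders
         .filter(F.col("status") == "COMPLETE")
         .groupBy("order_date", "region")
         .agg(F.sum("amount").alias("revenue")))

(daily.write
      .format("delta")
      .mode("overwrite")
      .save("s3://example-lake/curated/daily_revenue/"))
```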
Posted 2 weeks ago
8.0 - 13.0 years
22 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Location: Bengaluru, Hyderabad, Chennai & Pune
ETL Development Lead: Prior lead experience is a must (minimum 1 year).
- Experience leading and mentoring a team of Talend ETL developers.
- Providing technical direction and guidance on ETL/data integration development to the team.
- Designing complex data integration solutions using Talend & AWS.
- Collaborating with stakeholders to define project scope, timelines, and deliverables.
- Contributing to project planning, risk assessment, and mitigation strategies.
- Ensuring adherence to project timelines and quality standards.
- Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
- Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
- Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
- Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
- Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
- Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
- Perform unit testing and participate in system integration testing of ETL processes.
- Monitor and maintain Talend environments, including job scheduling and performance tuning.
- Document technical specifications, data flow diagrams, and ETL processes.
- Stay up to date with the latest Talend features, best practices, and industry trends.
- Participate in code reviews and contribute to the establishment of development standards.
- Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components.
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
- Strong SQL skills for data querying and manipulation.
- Experience with data profiling, data quality checks, and error handling within ETL processes.
- Familiarity with job scheduling tools and monitoring frameworks.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively within a team environment.
- Basic understanding of AWS services, i.e., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
- Understanding of AWS data integration services, i.e., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.
Posted 2 weeks ago
5.0 - 8.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2335_JOB
Date Opened: 01/08/2024
Industry: IT Services
Job Type:
Work Experience: 5-8 years
Job Title: Snowflake Developer
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)
Snowflake knowledge (must have)
Autonomous person
SQL knowledge (must have)
Data modeling (must have)
Data warehouse concepts and DW design best practices (must have)
SAP knowledge (good to have)
SAP functional knowledge (good to have)
Informatica IDMC (good to have)
Good communication skills, team player, self-motivated, and strong work ethic
Flexibility in working hours: 12pm Central time (overlap with US team)
Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tool/expertise gaps (fast learner)
Posted 2 weeks ago
6.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2384_JOB
Date Opened: 23/10/2024
Industry: IT Services
Job Type:
Work Experience: 6-10 years
Job Title: Snowflake DBA
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Contract duration: 6 months
Locations: Pune/Bangalore/Hyderabad/Indore
Responsibilities:
- Must have experience working as a Snowflake admin/developer in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse and end-to-end data warehouse implementations on-premise, preferably on Oracle/SQL Server.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning.
- Zero-copy clone and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data model techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.
Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and database.
- Job Conductor, scheduler, and monitoring.
- GIT repository, creating users & roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on the requirement.
- Strong written communication skills. Is effective and persuasive in both written and oral communication.
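The Snowflake features named above (zero-copy clone, time travel) are plain SQL; here is a minimal sketch exercising them through the Snowflake Python connector, with account, credentials, and object names as hypothetical placeholders.

```python
# Minimal sketch of zero-copy clone and time travel via the Snowflake
# Python connector. Account, credentials, and object names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="etl_admin", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy of a table for testing.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time travel: query the table as it existed one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone()[0])

cur.close()
conn.close()
```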
Posted 2 weeks ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Description: ETL Development Lead (10+ years)
- Experience leading and mentoring a team of Talend ETL developers.
- Providing technical direction and guidance on ETL/data integration development to the team.
- Designing complex data integration solutions using Talend & AWS.
- Collaborating with stakeholders to define project scope, timelines, and deliverables.
- Contributing to project planning, risk assessment, and mitigation strategies.
- Ensuring adherence to project timelines and quality standards.
- Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
- Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
- Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
- Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
- Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
- Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
- Perform unit testing and participate in system integration testing of ETL processes.
- Monitor and maintain Talend environments, including job scheduling and performance tuning.
- Document technical specifications, data flow diagrams, and ETL processes.
- Stay up to date with the latest Talend features, best practices, and industry trends.
- Participate in code reviews and contribute to the establishment of development standards.
- Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components.
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
- Strong SQL skills for data querying and manipulation.
- Experience with data profiling, data quality checks, and error handling within ETL processes.
- Familiarity with job scheduling tools and monitoring frameworks.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively within a team environment.
- Basic understanding of AWS services, i.e., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
- Understanding of AWS data integration services, i.e., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.
Preferred Qualifications:
- Experience leading and mentoring a team of 8+ Talend ETL developers.
- Experience working with US healthcare customers.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate.
- Experience with AWS data & infrastructure services.
- A basic understanding of Terraform and GitLab is required.
- Experience with scripting languages such as Python or shell scripting.
- Experience with agile development methodologies.
- Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.
Posted 2 weeks ago
6.0 - 10.0 years
12 - 15 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Skill/Operating Group: Technology Consulting
Level: Manager
Location: Gurgaon/Mumbai/Bangalore
Travel Percentage: Expected travel could be anywhere between 0-100%
Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
- Identifying, assessing, and solving complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors
- Overseeing the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models
- Applying your solid understanding of data, data on cloud, and disruptive technologies
- Implementing programs/interventions that prepare the organization for implementation of new business processes
- Assisting our clients to build the required capabilities for growth and innovation to sustain high performance
- Managing multi-disciplinary teams to shape, sell, communicate, and implement programs
- Experience in participating in client presentations & orals for proposal defense, etc.
- Experience in effectively communicating the target state, architecture & topology on cloud to clients
- Deep understanding of industry best practices in data governance and management
- Providing thought leadership to the downstream teams for developing offerings and assets
Qualifications: Bachelor's degree; MBA from a Tier-1 college (preferable); 6-10 years of large-scale consulting experience and/or working with high-tech companies in data governance and data management; certified in DAMA (Data Management).
Experience: We are looking for experienced professionals with information strategy, data governance, data quality, data management, and MDM experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources.
Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
- Data SME - Experience in deal shaping & strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data on cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills.
- MDM/DQ/DG Architect - Data governance & management SME for areas including data quality, MDM, metadata, data lineage, and data catalog. Experience with one or more technologies in this space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
Exceptional interpersonal and presentation skills - the ability to convey technology and business value propositions to stakeholders. Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.
Other desired skills:
- Strong desire to work in technology-driven business transformation
- Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities
- Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences
- Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains
- Flexibility to accommodate client travel requirements
- Published thought leadership: whitepapers, POVs
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description: Enphase Energy is a global energy technology company and leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, Enphase transformed the solar industry with our revolutionary microinverter technology, which turns sunlight into a safe, reliable, resilient, and scalable source of energy to power our lives. Today, the Enphase Energy System helps people make, use, save, and sell their own power. Enphase is also one of the fastest growing and most innovative clean energy companies in the world, with approximately 68 million products installed across more than 145 countries. We are building teams that are designing, developing, and manufacturing next-generation energy technologies, and our work environment is fast-paced, fun, and full of exciting new projects. If you are passionate about advancing a more sustainable future, this is the perfect time to join Enphase!
About the role: The Enphase Analyst - Procurement will get involved in the claims process, component capacity and inventory analysis, supplier risk assessments, and other procurement-related analytics. This role is to understand existing processes in detail and implement RPA models wherever applicable, to perform market research on the latest processes and procedures available for the procurement function, and to automate/digitize those processes. A highly challenging role in which you will interact with many stakeholders and solve operational issues. You will be part of the Global Sourcing & Procurement team, reporting to a Lead Analyst.
What you will do:
Perform detailed analysis on component inventory against demand, on-hand, and open-order quantities: Use advanced data analytics tools like Power BI or Tableau to visualize inventory data. Implement predictive analytics to forecast demand more accurately.
Automate the input data consolidation from different contract manufacturers: Use ETL (Extract, Transform, Load) tools like Alteryx or Talend to automate data consolidation. Implement APIs to directly pull data from manufacturers' systems.
Prepare and submit a monthly STD cost file to finance as per the corporate calendar timelines: Create a standardized template and automate data entry using Excel macros or Python scripts. Set up reminders and workflows in a project management tool to ensure timely submission.
Work as a program manager by driving the component qualification process, working with cross-functional teams to get qualifications completed on time to achieve planned cost savings: Use project management software like Jira to track progress and deadlines. Regularly hold cross-functional team meetings to ensure alignment and address any roadblocks.
Finalize the quarterly CBOM (Costed Bill of Materials) and quote files from all contract manufacturers by following the CBOM calendar timelines: Implement a centralized database to store and manage CBOM data. Use version control systems to track changes and ensure accuracy.
Manage the claims management process with contract manufacturers and suppliers: Develop a standardized claims validation process and effectively track and manage claims. Regularly review and update the claims process to improve efficiency.
Do market research on new processes and best practices in procurement and see how they can be leveraged in the existing process.
Perform and maintain detailed analysis on supplier risk assessment with the help of third-party vendors: Regularly review and update risk assessment criteria based on changing market conditions.
Compile and perform supplier pricing trend analysis to support Commodity Managers for their QBRs: Create dashboards in BI tools to visualize pricing trends and support decision-making.
Work closely with Commodity Managers and identify the potential or NPI suppliers to be evaluated for risk assessments: Maintain a database of potential suppliers and their risk assessment results.
Maintain and manage the item master pricing list by refreshing the data at regular intervals without any errors: Use data validation techniques and automated scripts to ensure data accuracy. Implement a regular review process to update and verify pricing data.
Who you are and what you bring:
Any bachelor's degree, preferably in Engineering, with a minimum of 5+ years of experience in supply chain analytics.
Very good analytical and problem-solving skills.
Hands-on experience with Excel-based automations using MS Power Query, Excel VBA, and Gen AI.
Open-minded, with a strong sense of ownership.
Strong verbal communication and presentation skills.
Strong professional relationship management with internal and external interfaces.
Strong interpersonal skills with a proven ability to communicate effectively, both verbally and in writing, with internal customers and suppliers.
Ability to perform effectively and independently in a virtual environment.
Ability to effectively manage job responsibilities with minimal supervision.
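As an illustration of the consolidation automation this role describes, here is a minimal pandas sketch that merges inventory files from several contract manufacturers; the drop folder, file format, and column names are hypothetical.

```python
# Minimal sketch of consolidating inventory files from several contract
# manufacturers into one frame. Paths and column names are hypothetical
# placeholders.
from pathlib import Path
import pandas as pd

frames = []
for path in Path("cm_inputs").glob("*.xlsx"):     # hypothetical drop folder
    df = pd.read_excel(path)
    df["source_cm"] = path.stem                   # tag rows with the manufacturer
    frames.append(df)

inventory = pd.concat(frames, ignore_index=True)

# Compare on-hand plus open-order quantity against demand per component.
inventory["coverage"] = (inventory["on_hand_qty"] + inventory["open_order_qty"]
                         - inventory["demand_qty"])
inventory.to_excel("consolidated_inventory.xlsx", index=False)
```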
Posted 2 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas!
Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.
This position is based in Bengaluru and will require some on-site work.
Purpose and Scope: As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, data warehousing, and ETL systems. You'll work closely with FoundationX data engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend and Databricks) will be essential for testing data pipelines.
Essential Job Responsibilities:
Development Ownership: Support testing for data warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments.
Test Strategy and Planning: Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT).
Data Validation and Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards.
Regression Testing: Validate changes to data pipelines and analytics tools. Monitor performance metrics.
Test Case Design and Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation.
Data Security and Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations.
Collaboration and Communication: Work with cross-functional teams. Communicate test progress and results.
Continuous Improvement and Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms.
Qualifications Required:
Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
1-3+ years of proven experience as a tester, developer, or data analyst within a pharmaceutical or similar regulatory environment.
1-3+ years of experience in BI development, ETL development, Qlik, and Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts), or equivalent technologies.
Experience with QLIK Sense and QLIKView, the Tableau application, and creating data models.
Familiarity with business intelligence and data warehousing concepts (star schema, snowflake schema, data marts).
Knowledge of SQL, ETL frameworks, and data integration techniques.
Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing, and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full, large, complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM).
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems.
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred: Experience working in the pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous. Certifications in BI tools or testing methodologies. Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.
Working Environment: At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
Posted 2 weeks ago
5.0 - 8.0 years
15 - 18 Lacs
Navi Mumbai, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings!!! This is in regard to a job opportunity for an ETL Developer with Datamatics Global Services Ltd.
Position: ETL Developer
Website: https://www.datamatics.com/
Job Location: Mumbai/Bangalore
Job Description:
5 years' experience
Minimum 3 years of experience in Talend & DataStage development
Expertise in designing and implementing Talend & DataStage ETL jobs
Strong analytical and problem-solving skills
Design, develop, and maintain Talend integration solutions
Collaborate with business stakeholders and IT teams to gather requirements and recommend solutions
Create and maintain technical documentation
Perform unit testing and troubleshoot issues
Posted 2 weeks ago
5.0 - 9.0 years
7 - 17 Lacs
Pune, Chennai, Bengaluru
Hybrid
Job Title: Talend Developer
Location: Pune, Chennai & Bangalore
Job Summary: We are seeking a skilled Talend Developer to join our team. The ideal candidate will have a strong background in data integration and ETL processes, with expertise in Talend tools. You will be responsible for designing, developing, and maintaining data integration solutions to support our business needs.
Key Responsibilities: Design and develop ETL processes using Talend to integrate data from various sources. Implement data transformation and cleansing using Talend components such as tMap, tJoin, and others. Manage input and output components for files and databases, ensuring seamless data flow. Develop error handling mechanisms to ensure data integrity and reliability. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize Talend jobs for performance and scalability. Document ETL processes and maintain technical documentation.
Must-Have Skills: Proficiency in Talend Data Integration tools. Experience with input/output components for files and databases. Strong knowledge of transformation components like tMap and tJoin. Expertise in error handling within Talend jobs. Familiarity with Talend's best practices and performance optimization techniques.
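Talend jobs are assembled graphically in Studio rather than hand-coded, but the join-transform-reject pattern that tMap and tJoin implement corresponds roughly to this pandas sketch (a conceptual analogue, not Talend code; file and column names are hypothetical).

```python
# Conceptual analogue of a Talend tMap/tJoin step in pandas: join a main
# flow to a lookup, derive an output column, and route rejects. Not Talend
# code itself; file and column names are hypothetical placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv")          # main flow
customers = pd.read_csv("customers.csv")    # lookup flow

joined = orders.merge(customers, on="customer_id", how="left", indicator=True)

# tMap-style transformation: derive a new column on the matched rows.
matched = joined[joined["_merge"] == "both"].copy()
matched["order_value_band"] = pd.cut(matched["amount"],
                                     bins=[0, 100, 1000, float("inf")],
                                     labels=["low", "mid", "high"])

# Unmatched rows go to a reject flow, as with tMap's inner-join reject link.
rejects = joined[joined["_merge"] == "left_only"]
matched.drop(columns="_merge").to_csv("orders_enriched.csv", index=False)
rejects.drop(columns="_merge").to_csv("orders_rejects.csv", index=False)
```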
Posted 2 weeks ago
4.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: 4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics. Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web. Proven experience in building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts). Proficiency in Python programming for automation, data processing, and system integration. Experience with REST APIs, JSON/XML data formats, and data extraction from external sources.
Preferred technical and professional experience: Strong SQL knowledge and ability to work with relational databases. Familiarity with Agile methodologies and version control systems (e.g., Git). Excellent analytical, problem-solving, and communication skills.
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
hackajob is collaborating with OneAdvanced to connect them with exceptional tech professionals for this role. Senior Data Integration Engineer, IN-KA-Bengaluru.
Role Introduction: We are seeking a Data Integration Specialist who will be responsible for ensuring seamless data flow between various systems using several integration toolsets, managing data integration processes, and maintaining data quality and accessibility. You will work closely with our Data Analyst and report to the Data Eco-System Leader. This is a new role in a developing team. You will be in on the ground floor, so you will help to shape how we mature in this space.
What You Will Do - Key Responsibilities:
Design and Develop Data Integration Solutions: Create and implement data integration processes using ETL (Extract, Transform, Load) tools to consolidate data from various sources into cohesive data models.
Build Integration Scripts and Flows: As defined by the business stakeholders, build and/or change integrations already developed within the business.
Data Quality Management: Conduct data quality assessments and implement measures to enhance data accuracy and integrity. Operationalise the data exceptions and reporting around the integration of data.
Collaboration: Work closely within and across the functional teams to gather requirements and understand diverse data sources, ensuring that integration strategies align with business objectives.
Monitoring and Troubleshooting: Oversee data integration workflows, resolve issues, and optimize performance to ensure reliable data flow.
Documentation: Maintain comprehensive documentation of data integration processes, data flows, and system configurations.
Stay Updated: Keep abreast of industry trends and best practices in data integration and management.
What You Will Have - Technical Skills & Qualifications:
Delivery focus: You will have led a team or been the deputy manager for a delivery-focused team, preferably cross-discipline.
Technical Expertise: Extensive knowledge of data integration tools and languages, such as Dell Boomi, REST APIs, Microsoft Fabric, Integration Hub, SQL, ETL, and XML.
Problem-Solving Skills: Strong analytical skills to interpret complex data and troubleshoot integration issues effectively.
Communication Skills: Effective communication skills to liaise with multiple technical and business teams and explain complex data issues clearly.
Experience: Proven experience as a Data Integration Specialist or a similar role, with hands-on experience using ETL tools like Talend, Informatica, or Apache NiFi.
Education: A bachelor's degree in a related field such as Computer Science, Information Technology, or Engineering is typically required, or proven experience in data mining, ETL, and data analysis.
Would be really good to have:
Tools: Experience with Boomi, REST APIs, ServiceNow Integration Hub, JIRA, and ITSM platforms is beneficial.
Scripting: Understanding of, and ability to design and script, workflows and automations.
Enterprise Systems: An understanding of data structures in Salesforce.
What We Do For You:
Wellbeing focused - Our people are our greatest assets, and ensuring everyone feels their best self to come to work is integral.
Annual Leave - 20 days of annual leave, plus public holidays.
Employee Assistance Programme - Free advice, support, and confidential counselling available 24/7.
Personal Growth - We're committed to enabling your growth personally and professionally through development programmes.
Life Insurance - 2x annual salary.
Personal Accident Insurance - Providing cover in the event of serious injury/illness.
Performance Bonus - Our group-wide bonus scheme enables you to reap the rewards of your success.
Who We Are: OneAdvanced is one of the UK's largest providers of business software and services, serving 20,000+ global customers with an annual turnover of £330M+. We manage 1.5 million 111 calls per month, support over 2 million Further Education learners across the UK, handle over 10 million wills, and so much more. Our mission is to power the world of work and, as you can see, our software underpins some of the UK's most critical sectors. We invest in our brilliant people. They are at the heart of our success as we strive to be a diverse, inclusive and engaging place to work that not only powers the world of work, but empowers the growth, ambitions and talent of our people.
Posted 3 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing a transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives. DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel in embracing change, manage technical challenges, and have exceptional communication skills.
We are seeking committed and talented MDM Engineers to join our new FoundationX team, which lies at the heart of DigitalX. As a member of FoundationX, you will play a critical role in ensuring our MDM systems are operational and scalable and continue to contain the right data to drive business value. You will play a pivotal role in building, maintaining, and enhancing our MDM systems.
Purpose And Scope
As a Junior Data Engineer, you will play a crucial role in assisting in the design, build, and maintenance of our data infrastructure, focusing on BI and DWH capabilities. Working with the Senior Data Engineer, your foundational expertise in BI, Databricks, PySpark, SQL, Talend, and other related technologies will be instrumental in driving data-driven decision-making across the organization. You will play a pivotal role in building, maintaining, and enhancing our systems across the organization.
This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilize your development skills, and contribute to the continuous improvement/delivery of critical IT solutions. This position is based in Bengaluru and will require some on-site work.
Essential Job Responsibilities
Collaborate with FoundationX Engineers to design and maintain scalable data systems.
Assist in building robust infrastructure using technologies like PowerBI, Qlik or alternatives, Databricks, PySpark, and SQL.
Contribute to ensuring system reliability by incorporating accurate business-driving data.
Gain experience in BI engineering through hands-on projects.
Data Modelling and Integration: Collaborate with cross-functional teams to analyze requirements and create technical designs, data models, and migration strategies. Design, build, and maintain physical databases, dimensional data models, and ETL processes specific to pharmaceutical data.
Cloud Expertise: Evaluate and influence the selection of cloud-based technologies such as Azure, AWS, or Google Cloud. Implement data warehousing solutions in a cloud environment, ensuring scalability and security.
BI Expertise: Leverage and create PowerBI, Qlik, or equivalent technology for data visualization, dashboards, and self-service analytics.
Data Pipeline Development: Design, build, and optimize data pipelines using Databricks and PySpark. Ensure data quality, reliability, and scalability.
Application Transition: Support the migration of internal applications to Databricks (or equivalent) based solutions. Collaborate with application teams to ensure a seamless transition.
Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.
Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.
Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold move priority initiatives and beyond.
Design, develop, and implement robust and scalable data analytics using modern technologies.
Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, DataX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
Qualifications Required
Bachelor's degree in Computer Science, Information Technology, or a related field (master's preferred), or equivalent experience.
1-3+ years of experience in data engineering, with a strong understanding of BI technologies, PySpark, and SQL, building data pipelines, and optimization.
1-3+ years of experience with data engineering and integration tools (e.g., Databricks, Change Data Capture).
1-3+ years of experience utilizing cloud platforms (AWS, Azure, GCP). A deeper understanding of, or certification in, AWS and Azure is considered a plus.
Experience with relational and non-relational databases.
Any relevant cloud-based integration certification at foundational level or above (any Qlik or BI certification, AWS Certified DevOps Engineer, AWS Certified Developer, any Microsoft Certified Azure qualification, proficiency in RESTful APIs, AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP, or any relevant certification).
Experience in MuleSoft (Anypoint Platform, its components, and designing and managing API-led connectivity solutions).
Experience in AWS (environment, services, and tools), developing code in at least one high-level programming language.
Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools.
Experience with Azure services related to computing, networking, storage, and security.
Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management.
Preferred
Subject Matter Expertise: A strong understanding of data architecture/engineering/operations/reporting within the Life Sciences/Pharma industry across Commercial, Manufacturing, and Medical domains. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing, and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
Category FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
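The "Data Pipeline Development" responsibility above centres on Databricks and PySpark. The sketch below shows the shape of a typical cleaning step; it is a hedged illustration only, with invented paths and column names (customer_id and email are not Astellas schemas), not the team's actual pipeline.

```python
# Sketch of a small PySpark cleaning step of the kind described above.
# Paths and column names are illustrative, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mdm-clean").getOrCreate()

raw = spark.read.option("header", True).csv("/landing/customers.csv")

cleaned = (
    raw.dropDuplicates(["customer_id"])                 # one record per key
       .withColumn("email", F.lower(F.trim("email")))   # normalise before matching
       .filter(F.col("customer_id").isNotNull())        # basic quality rule
)

cleaned.write.mode("overwrite").parquet("/curated/customers")
spark.stop()
```

Deduplicating on a stable key and normalising match fields before writing is the usual first step toward the "right data" goal an MDM team is measured on.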
Posted 3 weeks ago
2.0 - 4.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We need a proficient Data Engineer with experience monitoring and fixing jobs for data pipelines written in Azure Data Factory and Python.
Design and implement data models for Snowflake to support analytical solutions.
Develop ETL processes to integrate data from various sources into Snowflake.
Optimize data storage and query performance in Snowflake.
Collaborate with cross-functional teams to gather requirements and deliver scalable data solutions.
Monitor and maintain Snowflake environments, ensuring optimal performance and data security.
Create documentation for data architecture, processes, and best practices.
Provide support and training for teams utilizing Snowflake services.
Roles and Responsibilities
Strong experience with Snowflake architecture and data warehousing concepts.
Proficiency in SQL for data querying and manipulation.
Familiarity with ETL tools such as Talend, Informatica, or Apache NiFi.
Experience with data modeling techniques and tools.
Knowledge of cloud platforms, specifically AWS, Azure, or Google Cloud.
Understanding of data governance and compliance requirements.
Excellent analytical and problem-solving skills.
Strong communication and collaboration skills to work effectively within a team.
Experience with Python or Java for data pipeline development is a plus.
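As a rough illustration of the Snowflake modelling and query-performance work this role describes, here is a minimal sketch using the Snowflake Python connector. The account, credentials, table, and clustering key are all placeholders, not details from the posting.

```python
# Sketch: creating a clustered fact table in Snowflake from Python.
# Connection parameters, names, and the clustering key are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials throughout
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Cluster on the column most queries filter by, so partition
# pruning reduces the amount of data each query scans.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_date DATE,
        store_id  NUMBER,
        amount    NUMBER(12,2)
    )
    CLUSTER BY (sale_date)
""")

conn.close()
```

Choosing a clustering key that matches the dominant filter predicate is one of the main levers behind the "optimize data storage and query performance" responsibility above.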
Posted 3 weeks ago
2.0 - 5.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Sapiens is on the lookout for a Senior Developer to become a key player in our Bangalore team. If you're a seasoned Senior Developer ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Job Description:
What you'll do:
Collaborate with business users to understand and refine ETL requirements and business rules for effective solution implementation.
Design, develop, implement, and optimize ETL processes to meet business and technical needs.
Troubleshoot and resolve ETL-related issues, ensuring system performance and reliability.
Create and execute comprehensive unit test plans based on system and validation requirements to ensure the quality of the solutions.
Provide ongoing support and consultation for the development and enhancement of technical solutions across various business functions.
What to have for this position:
Primary skills:
Strong understanding of advanced ETL concepts, as well as the administration activities required to support R&D and project needs.
Extensive experience with ETL tools and advanced transformations, particularly Talend and Java.
Ability to effectively troubleshoot and resolve complex ETL coding and administrative issues.
Secondary skills:
Experience in designing and developing fully interactive dashboards, including storylines, drill-down functionality, and linked visualizations.
Ability to design and optimize tables, views, and DataMarts to support dynamic and efficient dashboards.
Proficient in proposing and implementing data load strategies that enhance performance and improve data visualizations.
Expertise in performance tuning for SQL, ETL processes, and reports.
Process knowledge:
Experience in data validation and working with cross-functional teams (including Business Analysts and Business Users) to clarify and define business requirements.
Ability to develop ETL mappings, specifications (LLDs/HLDs), and data load strategies with minimal supervision.
Understanding of SDLC methodologies, including Agile, and familiarity with tools such as JIRA for project management and issue tracking.
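One concrete form the "performance tuning for ETL processes" skill takes is batching loads rather than committing row by row. The sketch below shows the idea in Python with SQLite purely for illustration; the posting's actual stack is Talend and Java, and the table name and batch size here are invented.

```python
# Sketch of a common ETL tuning step: batch inserts with one commit per
# chunk instead of one commit per row. Table and sizes are illustrative.
import sqlite3
from itertools import islice

def batched(rows, size=1000):
    """Yield fixed-size chunks so each commit covers many rows."""
    it = iter(rows)
    while chunk := list(islice(it, size)):
        yield chunk

def load_batched(rows, db_path="stage.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER, payload TEXT)")
    for chunk in batched(rows):
        con.executemany("INSERT INTO staging VALUES (?, ?)", chunk)
        con.commit()  # one commit per batch, not per row
    con.close()

load_batched(((i, f"row-{i}") for i in range(10_000)))
```

The same principle applies in Talend, where commit-every-N settings on database output components serve the role the explicit per-batch commit plays here.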
Posted 3 weeks ago
Are you a job seeker looking to dive into the world of data integration and management? Bengaluru, also known as the Silicon Valley of India, offers a plethora of opportunities for Talend professionals. With a booming IT sector and high demand for skilled data engineers, Bengaluru is a hotspot for Talend jobs.
Bengaluru also offers a lower cost of living than other major Indian cities, making it an attractive destination for job seekers. Affordable housing options and a vibrant social scene make it an ideal city in which to kickstart your Talend career.
In the wake of the COVID-19 pandemic, many companies in Bengaluru offer remote work options for Talend professionals. This flexibility allows you to work from the comfort of your home while still enjoying the benefits of a thriving job market.
Bengaluru boasts a well-connected public transportation system, including buses, metro, and cabs, making it easy for job seekers to commute to their workplaces.
As technology continues to evolve, Talend professionals in Bengaluru can expect rising demand for their skills. Emerging trends like cloud-based data integration and AI-driven analytics are shaping the future job market for Talend experts.
If you are looking to embark on a rewarding career in data integration and management, Bengaluru is the place to be. Don't miss out on the exciting Talend jobs in Bengaluru – apply now and take your career to new heights!