4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Roles & Responsibilities
- At least 4 years of professional database development (SQL Server or Azure SQL Database) and ETL using Talend and SSIS in OLAP/OLTP environments
- Bachelor's or master's degree in computer science or equivalent preferred
- Experience in continuous delivery environments
- Experience with Agile (Scrum/Kanban) software development methodologies; automated testing and deployment implementation a plus
- Experience deploying and managing in-house/cloud-hosted data solutions
- Experience with large-scale systems involving reporting, transactional systems, and integration with other enterprise systems
- Experience with source/version control systems
- Successful history of working with high-performing technology teams

Technical Skills
- Proficiency with multiple ETL tools, including Talend and SSIS (Azure Data Factory is a bonus)
- Proficiency in SQL development in Microsoft SQL Server (Azure SQL Database is a bonus)
- Experience with SQL query performance optimization
- Experience with industry development standards and their implementation
- Proficiency in system analysis and design
- Analysis and verification of technical requirements for completeness, consistency, feasibility, and testability
- Identification and resolution of technical issues through unit testing, debugging, and investigation
- Version control, including branching and merging, using services like GitHub or Azure DevOps

Experience
4.5-6 Years

Skills
Primary Skill: SQL Development
Sub Skill(s): SQL Development
Additional Skill(s): SQL Development, Oracle PL/SQL Development, PostgreSQL Development, ETL, SQL, Talend

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
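The SQL query performance optimization requirement above often comes down to writing index-friendly predicates. Below is a minimal, hedged sketch in Python (via pyodbc) of one such rewrite; the server, database, table, and column names are assumptions for illustration, not part of the posting.

```python
# A minimal sketch of one common SQL Server tuning pattern:
# rewriting a non-SARGable predicate so the optimizer can use an index.
# The connection string, table, and column names are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=sales;Trusted_Connection=yes;"  # assumed connection details
)
cursor = conn.cursor()

# Non-SARGable: wrapping the indexed column in a function forces a scan.
slow = "SELECT order_id FROM dbo.orders WHERE YEAR(order_date) = ?"

# SARGable rewrite: a half-open date range lets the optimizer seek the index.
fast = (
    "SELECT order_id FROM dbo.orders "
    "WHERE order_date >= ? AND order_date < ?"
)
cursor.execute(fast, ("2024-01-01", "2025-01-01"))
print(cursor.fetchmany(5))
```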
Posted 3 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- The position is based in India and requires the candidate to report directly to the team lead
- Develop projects from detailed business requirements, working through solutions and managing execution and rollout of these solutions in the context of an overall consistent global platform
- Create T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017
- Understand basic data warehousing concepts; design and develop SSIS packages to pull data from various source systems and load to target tables
- May be required to develop dashboards and reports using SSRS
- Work on BAU JIRAs and perform some L3 support-related activities whenever required
- Provide detailed analysis and documentation of processes and flows where necessary
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging
- Operate with a limited level of direct supervision, exercising independence of judgement and autonomy
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Qualifications:
- 4-8 years of overall IT experience, with 2+ years in the Financial Services industry
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, Autosys
- 2+ years of experience in any ETL tool, preferably SSIS
- Some knowledge of Python can be a differentiator
- Highly motivated; should need minimal hand-holding, with the ability to multitask and work under pressure
- Strong analytical and problem-solving skills; ability to analyze data for trends and quality checking
- Good to have: Talend, GitHub knowledge
- Strong knowledge of database fundamentals and advanced concepts; ability to write efficient SQL and tune existing SQL
- Experience with a reporting tool (e.g., SSRS, QlikView) is a plus
- Experience with a job scheduling tool (e.g., Autosys)
- Experience in the finance industry is desired
- Experience with all phases of the Software Development Life Cycle
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
Microsoft SQL Server, Microsoft SQL Server Programming, MS SQL Queries.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
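The posting's core deliverable, T-SQL stored procedures over SSIS-fed tables, looks roughly like this hedged sketch; the object and column names are invented for illustration only.

```python
# Hedged sketch: creating and calling a simple T-SQL stored procedure,
# as the responsibilities above describe. All object names are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=risk;"
    "Trusted_Connection=yes;", autocommit=True  # assumed connection details
)
cur = conn.cursor()

# CREATE OR ALTER needs SQL Server 2016+; on 2014 use CREATE PROCEDURE.
cur.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_daily_position_summary
    @as_of_date DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT desk, SUM(notional) AS total_notional
    FROM dbo.positions
    WHERE as_of_date = @as_of_date
    GROUP BY desk;
END
""")

# Execute the procedure with a parameter, as a downstream SSRS report might.
cur.execute("EXEC dbo.usp_daily_position_summary @as_of_date = ?", "2024-06-30")
for row in cur.fetchall():
    print(row.desk, row.total_notional)
```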
Posted 3 weeks ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About The Role
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties And Responsibilities
- Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
- Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements.
- Utilize and optimize a wide array of AWS data services.
- Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
- Ensure data quality, integrity, and security across all data pipelines and storage solutions.
- Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
- Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
- Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
- Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
- Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications
- 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
- Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
- Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; good to have experience with AWS EMR for big data processing.
- Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles.
- Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Database skills: strong understanding of relational databases and NoSQL databases.
- Version control: experience with version control systems (e.g., Git).
- Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. (ref:hirist.tech)
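For the real-time ingestion requirement above, a stripped-down Python sketch of a Kafka consumer that micro-batches events into S3 (a common landing pattern for a Data Lake) might look like this; the topic, broker, bucket, and prefix are assumptions, and production code would add error handling, schemas, and partitioning.

```python
import json
import time
import boto3
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "verification-events",              # assumed topic name
    bootstrap_servers="broker:9092",    # assumed broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch, last_flush = [], time.time()
for message in consumer:
    batch.append(message.value)
    # Flush a micro-batch to the lake every 500 records or 60 seconds.
    if len(batch) >= 500 or time.time() - last_flush > 60:
        key = f"landing/verification/{int(time.time())}.json"  # assumed prefix
        s3.put_object(
            Bucket="my-data-lake",      # assumed bucket name
            Key=key,
            Body="\n".join(json.dumps(r) for r in batch).encode("utf-8"),
        )
        batch, last_flush = [], time.time()
```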
Posted 3 weeks ago
7.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Key Responsibilities
- Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
- Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
- Architect and optimize data warehouses for scale, performance, and security.
- Perform advanced data analysis and modeling to extract insights and support business decisions.
- Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
- Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
- Develop Power BI dashboards and reports for stakeholders across departments.
- Ensure data quality, integrity, and compliance with data governance and security standards.
- Work with cross-functional teams (product, marketing, ops) to turn data into strategy.

Qualifications:
- PhD in Data Science, Computer Science, Engineering, Mathematics, or a related field.
- 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
- Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Proficient in SQL, Python, and Power BI.
- Familiarity with modern cloud data platforms (AWS/GCP/Azure).
- Strong understanding of data modeling, data governance, and MLOps practices.
- Exceptional ability to translate business needs into scalable data solutions.
(ref:hirist.tech)
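Since the posting names Airflow among the expected ETL tools, here is a minimal, hedged Airflow DAG sketch for the batch side of the workflow split described above; the DAG name, task bodies, and schedule are illustrative assumptions.

```python
# Minimal Airflow 2.x DAG sketch; all names and the cadence are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def transform():
    print("apply business rules and data quality checks")

def load():
    print("load into the warehouse (e.g., Snowflake/Redshift/BigQuery)")

with DAG(
    dag_id="nightly_batch_etl",               # assumed DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",            # nightly at 02:00; assumed
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```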
Posted 3 weeks ago
3.0 - 6.0 years
8 - 10 Lacs
Bengaluru
Hybrid
Description:
Role: Data Engineer/ETL Developer - Talend/Power BI

Job Description:
1. Study, analyze, and understand business requirements in the context of business intelligence and provide end-to-end solutions.
2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
3. Load data from heterogeneous sources like Oracle, MS SQL, file systems, FTP services, REST APIs, etc.
4. Design and map data models to turn raw data into meaningful insights and build a data catalog.
5. Develop strong data documentation covering algorithms, parameters, and models.
6. Analyze past and present data for better decision making.
7. Make essential technical changes to improve present business intelligence systems.
8. Optimize ETL processes for improved performance; monitor ETL jobs and troubleshoot issues.
9. Lead and oversee team deliverables; ensure best practices are followed for development.
10. Participate in or lead requirements gathering and analysis.

Required Skillset and Experience:
1. Up to 3 years of overall working experience, preferably in SQL and ETL (Talend).
2. Must have 1+ years of experience in Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, TAC, etc.
3. Must have an understanding of database design and data modeling.
4. Hands-on experience in a coding language (Java, Python, etc.).

Secondary Skillset/Good to have:
1. Experience in a BI tool like MS Power BI.
2. Utilize Power BI to build interactive and visually appealing dashboards and reports.

Required Personal & Interpersonal Skills
• Strong analytical skills
• Good communication skills, both written and verbal
• Highly motivated and result-oriented
• Self-driven, independent work ethic that drives internal and external accountability
• Ability to interpret instructions to executives and technical resources
• Advanced problem-solving skills dealing with complex distributed applications
• Experience of working in a multicultural environment

Interested candidates, reach out to akram.m@acesoftlabs.com / 6387195529
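Item 3 above (loading from REST APIs among heterogeneous sources) can be prototyped outside Talend in a few lines of Python; this hedged sketch pulls JSON from a placeholder endpoint and lands it in a staging table, with the URL and schema as assumptions.

```python
import sqlite3
import requests

# Assumed source endpoint; in the role above this would be a governed API.
resp = requests.get("https://api.example.com/v1/customers", timeout=30)
resp.raise_for_status()
records = resp.json()

# Land the raw payload in a staging table before Talend-style transforms.
conn = sqlite3.connect("staging.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS stg_customers "
    "(customer_id TEXT, name TEXT, loaded_at TEXT DEFAULT CURRENT_TIMESTAMP)"
)
conn.executemany(
    "INSERT INTO stg_customers (customer_id, name) VALUES (?, ?)",
    [(r.get("id"), r.get("name")) for r in records],  # assumed field names
)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM stg_customers").fetchone()[0],
      "rows staged")
```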
Posted 3 weeks ago
7.0 - 12.0 years
22 - 37 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
- Locations: Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM-11 PM IST
- Experience: 5 to 12+ years (based on role & grade)

Open Grades/Roles:
- Senior Software Engineer: 5-8 Years
- Tech Lead: 7-10 Years
- Senior Tech Lead: 10-12+ Years

Job Description – Data Engineering Team

Core Responsibilities (Common to All Levels):
- Design, build, and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB)
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs
- Participate in data modeling (ER/DW/Star schema), data quality checks, and data integration
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M)
- Ensure code versioning and documentation standards are followed (Git/Bitbucket)

Additional Responsibilities by Grade

Senior Software Engineer (5-8 Yrs):
- Focus on hands-on development of ETL pipelines, data models, and data inventory
- Assist in architecture discussions and POCs
- Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure

Tech Lead (7-10 Yrs):
- Lead mid-sized data projects and small teams
- Decide on ETL strategy (Push Down/Push Up) and performance tuning
- Strong working knowledge of orchestration tools, resource management, and agile delivery

Senior Tech Lead (10-12+ Yrs):
- Drive data architecture, infrastructure decisions, and internal framework enhancements
- Oversee large-scale data ingestion, profiling, and reconciliation across systems
- Mentor junior leads and own stakeholder delivery end-to-end
- Advantageous: experience with AdTech/Marketing data, Hadoop ecosystem (Hive, Spark, Sqoop)

Must-Have Skills (All Levels):
- ETL Tools: Pentaho / Talend / SSIS / Informatica
- Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
- Orchestration: Airflow / Autosys / Control-M / JAMS
- Modeling: Dimensional Modeling, ER Diagrams
- Scripting: Python or Perl (preferred)
- Agile environment, Git-based version control
- Strong communication and documentation
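Dimensional modeling (the ER/DW/Star schema item above) reduces to a fact table keyed to conformed dimensions. Here is a minimal, runnable star-schema sketch in SQL, executed through Python's sqlite3 for portability; the sales domain and all column names are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables hold descriptive attributes.
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- e.g., 20240630
    full_date  TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku TEXT, category TEXT
);
-- The fact table holds measures plus foreign keys to each dimension.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240630, '2024-06-30', 2024, 6)")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-42', 'AdTech')")
conn.execute("INSERT INTO fact_sales VALUES (20240630, 1, 10, 999.0)")

# A typical star-join rollup: revenue by category and month.
for row in conn.execute("""
    SELECT p.category, d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category, d.month
"""):
    print(row)
```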
Posted 3 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- The position is based in India and requires the candidate to report directly to the team lead
- Develop projects from detailed business requirements, working through solutions and managing execution and rollout of these solutions in the context of an overall consistent global platform
- Create T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017
- Understand basic data warehousing concepts; design and develop SSIS packages to pull data from various source systems and load to target tables
- May be required to develop dashboards and reports using SSRS
- Work on BAU JIRAs and perform some L3 support-related activities whenever required
- Provide detailed analysis and documentation of processes and flows where necessary
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging
- Operate with a limited level of direct supervision, exercising independence of judgement and autonomy
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Qualifications:
- 4-8 years of overall IT experience, with 2+ years in the Financial Services industry
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, Autosys
- 2+ years of experience in any ETL tool, preferably SSIS
- Some knowledge of Python can be a differentiator
- Highly motivated; should need minimal hand-holding, with the ability to multitask and work under pressure
- Strong analytical and problem-solving skills; ability to analyze data for trends and quality checking
- Good to have: Talend, GitHub knowledge
- Strong knowledge of database fundamentals and advanced concepts; ability to write efficient SQL and tune existing SQL
- Experience with a reporting tool (e.g., SSRS, QlikView) is a plus
- Experience with a job scheduling tool (e.g., Autosys)
- Experience in the finance industry is desired
- Experience with all phases of the Software Development Life Cycle
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
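The SSIS packages this role designs are typically executed (and scheduled under a tool like Autosys) through the dtexec command-line utility; a hedged Python wrapper is sketched below, with the package path and variable name as assumptions.

```python
# Hedged sketch: invoking an SSIS package via dtexec from Python, as a
# scheduler like Autosys might. The package path and variable are assumptions.
import subprocess

result = subprocess.run(
    [
        "dtexec",
        "/F", r"C:\etl\packages\LoadPositions.dtsx",   # assumed package file
        "/SET", r"\Package.Variables[RunDate].Value;2024-06-30",
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
# dtexec returns 0 on success; non-zero exit codes signal package failure.
if result.returncode != 0:
    raise RuntimeError(f"SSIS package failed with code {result.returncode}")
```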
Posted 3 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Hyderabad
Work from Office
Details of the role: 8 to 10 years' experience as Informatica Admin (IICS)

Key responsibilities:
- Understand the program's service catalog and document the list of tasks which have to be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows; ensure that documentation is kept up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.
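The data-quality-check responsibility above is tool-agnostic at its core. Below is a hedged, runnable Python sketch of the usual checks (nulls in a business key, duplicates, value reasonableness) against a staged table; the table and column names are assumptions, and the seeded rows intentionally violate two checks to show failures.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id TEXT, amount REAL);
INSERT INTO stg_orders VALUES ('A1', 10.0), ('A1', 10.0), (NULL, 5.0);
""")

def check(name, sql):
    """Run a data-quality probe; a non-zero count is a failed check."""
    count = conn.execute(sql).fetchone()[0]
    status = "PASS" if count == 0 else "FAIL"
    print(f"{status} {name}: {count}")

# Null check on the business key.
check("null order_id",
      "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL")
# Duplicate check on the business key.
check("duplicate order_id",
      "SELECT COUNT(*) FROM (SELECT order_id FROM stg_orders "
      "WHERE order_id IS NOT NULL GROUP BY order_id HAVING COUNT(*) > 1)")
# Reasonableness check: no negative amounts.
check("negative amount",
      "SELECT COUNT(*) FROM stg_orders WHERE amount < 0")
```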
Posted 4 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements to feasible design
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all the codes are raised as per the norm defined for project/program/account with clear description and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all the requirements and clarifications from the client for better quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document necessary details and reports in a formal way for proper understanding of software from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Talend DI.
Experience: 5-8 Years.
Posted 4 weeks ago
5.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Remote
Major Purpose
Act as an ETL engineer on agile service delivery teams of 5-10 engineers. The main responsibilities include providing overall development, testing, and operational support and maintenance of ETL jobs and the MySQL database in a cloud platform. The ETL engineer will work closely with the lead architect and other stakeholders to ensure timely delivery of features.

Duties/Responsibilities:
- Responsible for development of data integration and data migration.
- Develop and maintain ETL (Talend) jobs; proficient in DDL/DML (MySQL database).
- Responsible for load testing and performance tuning.
- Production support activities as needed.
- Collaborate with other team members, product owners, Service Delivery and QE Managers to provide optimal solutions, remove bottlenecks, and implement agile best practices.
- Strong analytical skills to help development and infrastructure teams determine scaling limits and auto-scaling requirements for services, apps, and infrastructure.
- Strong knowledge of identifying bottlenecks at different levels such as UI, backend, database, server side, and messaging services like Solace and Apigee.
- Ensure optimal process in the team's ability to prioritize demand and assign work efficiently, and that timeline commitments are met.
- Champion technical best practices.
- Actively participate in the CI and CD process.
Our job descriptions evolve with our business needs and priorities. Your role may have additional responsibilities to ensure we adapt to changing business demands.

Education/Certification Requirements:
Bachelor of Science in Math, Engineering, or Computer Science

Background & Experience Required:
- 5+ years of strong ETL experience, with a minimum of 3 years of Talend.
- Experience in SOAP/REST web service implementation in Talend.
- Good understanding of databases: Oracle RDBMS, MySQL, or SQL Server.
- Experience in developing transformations with file formats like JSON, XML, CSV, and flat files.
- Understanding of event-driven microservices architecture (MSA).
- Understanding of messaging platforms such as Solace, ActiveMQ, etc.
- Good knowledge of deployment methodologies and best practices.
- Experience working with Agile development methodologies.
- Strong presentation and communication skills (written and verbal).
- Strong analytical, problem-solving, and critical thinking skills.
- Ability to work well in a fast-paced environment under deadlines in a changing environment.
- Ability to work as a team member, as well as independently.
- Must be organized and detail-oriented.
- Proficient in using Git, Jira, Confluence.
- Experience interacting with business analysts, developers, and a diverse set of teams.
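The DDL/DML and load-testing duties above often start with a batched-insert harness; this hedged PyMySQL sketch times a bulk load into an assumed table (host, schema, credentials, and row shape are all illustrative).

```python
import time
import pymysql  # pip install pymysql

conn = pymysql.connect(
    host="mysql.internal", user="etl", password="...",  # assumed credentials
    database="orders_db",                                # assumed schema
)
with conn.cursor() as cur:
    # DDL: a throwaway target table for the load test.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS load_test "
        "(id INT PRIMARY KEY, payload VARCHAR(64))"
    )
    rows = [(i, f"row-{i}") for i in range(100_000)]
    start = time.time()
    # DML: executemany sends batched INSERTs, the usual first tuning lever.
    cur.executemany(
        "INSERT INTO load_test (id, payload) VALUES (%s, %s) "
        "ON DUPLICATE KEY UPDATE payload = VALUES(payload)",
        rows,
    )
    conn.commit()
    print(f"loaded {len(rows)} rows in {time.time() - start:.1f}s")
```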
Posted 4 weeks ago
2.0 - 5.0 years
2 - 6 Lacs
Bengaluru
Work from Office
About Company
Kinara Capital is a FinTech NBFC dedicated to driving financial inclusion in the MSME sector. Our mission is to transform lives, livelihoods, and local economies by providing fast and flexible loans without property collateral to small business entrepreneurs. Led by a women-majority management team, Kinara Capital values diversity and inclusion and fosters a collaborative working environment. Kinara Capital is the only company from India recognized globally by the World Bank/IFC with a gold award in 2019 as 'Bank of the Year-Asia' for our innovative work in SME financing. Kinara Capital is an RBI-registered Systemically Important NBFC. Headquartered in Bangalore, we have 127 branches across Karnataka, Gujarat, Maharashtra, Andhra Pradesh, Telangana, Tamil Nadu, and UT Puducherry with more than 1000 employees. https://kinaracapital.com/

Job Title: BI Engineer
Department: Data Science & BI
Report To: BI Specialist - Assistant Functional Manager

Purpose of Job:
To lead a team of BI analysts through the entire analytical and BI life cycle to build and deploy dashboards and create automation solutions (such as reports) that infuse core business functions with deep analytical insights.

Job Responsibilities:
- Develop, maintain, and manage advanced reporting, analytics, dashboards, and other BI solutions.
- Perform and document data analysis, data validation, and data mapping/design.
- Coordinate with customers and business analysts to determine business data reporting and analysis requirements.
- Assist in business report generation for internal and external customers for business decision making.
- Support database administrators and developers to build data warehousing systems for business intelligence, reporting, and analytics.
- Assist in data administration, modelling, and integration activities in data warehouse systems.
- Implement business intelligence solutions to achieve data reporting and analysis goals.
- Support the development of business intelligence standards to meet business goals.
- Coordinate with the business units to identify new data requirements, analysis strategies, and reporting mechanisms.
- Assist in the design and generation of reports in a timely and accurate manner.

Qualifications:
- Education: B.Tech/B.E.
- Work Experience: 3+ years in an analytical professional role, with 3+ years of BI experience with tools like Talend and Tableau
- Age: Below 35 years

Other Requirements:
- Domain knowledge in Financial Services or Marketing is a big plus.
- Adept at the use of BI reporting tools such as Tableau or Metabase.
- Understanding of data modelling and data schemas (MySQL, Aurora, Snowflake, etc.).
- Understanding of database operations and optimization for SQL.
- Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques.
- Creating and maintaining business requirements and other technical documentation.
- Knowledge of ETL, AWS, DBT, and Git will be an added advantage.

Skills & Competencies
Technical Skills:
- Aptitude in math and stats
- Proven experience in the use of SQL, Talend ETL, and Tableau
- Comfortable with programming in Python and basic statistics
Soft Skills:
- Deep curiosity and humility
- Excellent storyteller and communicator
Competencies:
- High social responsibility and mission-driven culture
- Team player
- High integrity and focus on building relationships
- Inspiring leader who instills confidence
- People-oriented and goal-oriented
- Initiative
- Energetic
- Bias for action and a commitment to our social mission

Place of work: Head office, Bangalore.
Job Type: Full Time
No. of Posts:
Posted 4 weeks ago
0 years
4 - 6 Lacs
Chennai
On-site
- Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
- Develop automated test scripts using Python or R for data validation and reconciliation.
- Perform source-to-target data verification, transformation logic testing, and regression testing.
- Collaborate with data engineers and analysts to understand business requirements and data flows.
- Identify data anomalies and work with development teams to resolve issues.
- Maintain test documentation, including test cases, test results, and defect logs.
- Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications:
- Strong experience in ETL testing across various data sources and targets.
- Proficiency in Python or R for scripting and automation.
- Solid understanding of SQL and relational databases.
- Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
- Experience with test management tools (e.g., JIRA, TestRail).
- Knowledge of data profiling, data quality frameworks, and validation techniques.
- Excellent analytical and communication skills.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
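A concrete flavor of the source-to-target verification described above: a runnable Python sketch that reconciles row counts, aggregate totals, and keys between a source and target table. Both sides live in one in-memory SQLite database here for portability; in practice they would be different systems.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_sales (id INTEGER, amount REAL);
CREATE TABLE tgt_sales (id INTEGER, amount REAL);
INSERT INTO src_sales VALUES (1, 100.0), (2, 250.5), (3, 75.25);
INSERT INTO tgt_sales VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def one(sql):
    return conn.execute(sql).fetchone()[0]

# Row-count reconciliation: every source row should reach the target.
assert one("SELECT COUNT(*) FROM src_sales") == \
       one("SELECT COUNT(*) FROM tgt_sales"), "row counts differ"

# Measure reconciliation: aggregate totals must match within tolerance.
src_total = one("SELECT SUM(amount) FROM src_sales")
tgt_total = one("SELECT SUM(amount) FROM tgt_sales")
assert abs(src_total - tgt_total) < 1e-6, f"drift: {src_total} vs {tgt_total}"

# Key-level diff: rows present in source but missing from target.
missing = conn.execute(
    "SELECT id FROM src_sales EXCEPT SELECT id FROM tgt_sales"
).fetchall()
print("missing keys:", missing)  # expect [] on a clean load
```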
Posted 4 weeks ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Who we are looking for:
Contribute to the enhancement and maintenance of the cloud native Vendor Analytics next generation workspace. Provide engineering troubleshooting assistance to customer support teams and other development teams within Charles River. We are looking for a back-end, cloud native engineer to help build out a workspace with a cloud native, Java Spring Boot back end deployed to Kubernetes.

What You Will Be Responsible For
- Work under minimal supervision to analyze, design, develop, test, and debug medium to large software enhancements and solutions within Charles River's business and technical problem domains
- Collaborate with Business Analysts and Product Managers to turn moderately complex business requirements into working and sustainable software
- Provide thought leadership in the design of product architecture within the team's scope of responsibility
- Develop, test, debug, and implement software programs, applications and projects using Java, SQL, JavaScript or other related software engineering languages
- Provide informed guidance and direction in code reviews
- Write unit and automation tests to ensure a high-quality end product
- Assist in improving development test methodologies and contribute to related test methodology frameworks
- Conduct manual tests to ensure a high-quality end product
- Contribute to written design and API documentation, and participate in the customer documentation process
- Actively participate in the agile software development process by adhering to the CRD Scrum methodology, including attending all daily standups, sprint planning, backlog grooming, and retrospectives
- Participate in cross-team group activities to complete assignments
- Provide mentoring to junior staff

Qualifications
Education: B.Tech. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics, or another technical course of study required. M.Tech degree strongly preferred.

Experience
- 12+ years of progressively responsible professional software engineering experience, preferably in a financial services product delivery setting
- Experience developing enterprise software deployed to one of the major cloud providers (Azure, AWS, Google) is essential
- Experience with Java Spring Boot development in cloud native applications is mandatory
- Nice to have experience with ETL tools Talend/Kettle
- Experience with GitHub is helpful
- Experience with REST and Postman is helpful
- Experience with Kubernetes and Kafka is preferred
- 4 to 7 years of experience using SQL, including DDL and DML; experience with Snowflake is a plus
- 4 to 7 years of experience in financial services developing solutions for Portfolio Management, Trading, Compliance, Post-Trade, IBOR or Wealth Management is strongly desired
- Authoritative experience with object-oriented programming, compiler or interpreter technologies, embedded systems, operating systems, relational databases (RDBMS), scripting and new/advanced programming languages
- Able to contribute to complex design specs in consultation with senior staff
- Able to work on medium to large projects with no supervision and on more complex tasks with minimal oversight
- Excellent written and verbal communication skills
- Able to work well with peers in a collaborative team environment
- A minimum of 5 years working with an Agile development methodology strongly desired

Job ID: R-774654
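For the REST/Postman item above, the same checks Postman performs interactively can be scripted; below is a hedged Python sketch of a smoke test against an assumed workspace endpoint (the base URL, routes, and response fields are invented for illustration, not the product's actual API).

```python
# Hedged sketch: scripting the kind of REST checks the posting associates
# with Postman. The base URL, routes, and response fields are assumptions.
import requests

BASE = "https://vendor-analytics.example.com/api/v1"   # assumed service URL

resp = requests.get(f"{BASE}/health", timeout=10)
assert resp.status_code == 200, f"unexpected status {resp.status_code}"

payload = {"vendor": "acme", "asOfDate": "2024-06-30"}  # assumed contract
resp = requests.post(f"{BASE}/analytics/runs", json=payload, timeout=30)
resp.raise_for_status()
body = resp.json()
# Verify the response honors the contract before downstream tests rely on it.
assert "runId" in body, "response missing runId"
print("created run:", body["runId"])
```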
Posted 4 weeks ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Who We Are Looking For
Contribute to the enhancement and maintenance of the cloud native Vendor Analytics next generation workspace. Provide engineering troubleshooting assistance to customer support teams and other development teams within Charles River. We are looking for a back-end, cloud native engineer to help build out a workspace with a cloud native, Java Spring Boot back end deployed to Kubernetes.

What You Will Be Responsible For
- Work under minimal supervision to analyze, design, develop, test, and debug medium to large software enhancements and solutions within Charles River's business and technical problem domains
- Collaborate with Business Analysts and Product Managers to turn moderately complex business requirements into working and sustainable software
- Provide thought leadership in the design of product architecture within the team's scope of responsibility
- Develop, test, debug, and implement software programs, applications and projects using Java, SQL, JavaScript or other related software engineering languages
- Provide informed guidance and direction in code reviews
- Write unit and automation tests to ensure a high-quality end product
- Assist in improving development test methodologies and contribute to related test methodology frameworks
- Conduct manual tests to ensure a high-quality end product
- Contribute to written design and API documentation, and participate in the customer documentation process
- Actively participate in the agile software development process by adhering to the CRD Scrum methodology, including attending all daily standups, sprint planning, backlog grooming, and retrospectives
- Participate in cross-team group activities to complete assignments
- Provide mentoring to junior staff

Qualifications
Education: B.Tech. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics, or another technical course of study required. M.Tech degree strongly preferred.

Experience
- 12+ years of progressively responsible professional software engineering experience, preferably in a financial services product delivery setting
- Experience developing enterprise software deployed to one of the major cloud providers (Azure, AWS, Google) is essential
- Experience with Java Spring Boot development in cloud native applications is mandatory
- Nice to have experience with ETL tools Talend/Kettle
- Experience with GitHub is helpful
- Experience with REST and Postman is helpful
- Experience with Kubernetes and Kafka is preferred
- 4 to 7 years of experience using SQL, including DDL and DML; experience with Snowflake is a plus
- 4 to 7 years of experience in financial services developing solutions for Portfolio Management, Trading, Compliance, Post-Trade, IBOR or Wealth Management is strongly desired
- Authoritative experience with object-oriented programming, compiler or interpreter technologies, embedded systems, operating systems, relational databases (RDBMS), scripting and new/advanced programming languages
- Able to contribute to complex design specs in consultation with senior staff
- Able to work on medium to large projects with no supervision and on more complex tasks with minimal oversight
- Excellent written and verbal communication skills
- Able to work well with peers in a collaborative team environment
- A minimum of 5 years working with an Agile development methodology strongly desired

Job ID: R-774653
Posted 4 weeks ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
- Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
- Develop automated test scripts using Python or R for data validation and reconciliation.
- Perform source-to-target data verification, transformation logic testing, and regression testing.
- Collaborate with data engineers and analysts to understand business requirements and data flows.
- Identify data anomalies and work with development teams to resolve issues.
- Maintain test documentation, including test cases, test results, and defect logs.
- Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications
- Strong experience in ETL testing across various data sources and targets.
- Proficiency in Python or R for scripting and automation.
- Solid understanding of SQL and relational databases.
- Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
- Experience with test management tools (e.g., JIRA, TestRail).
- Knowledge of data profiling, data quality frameworks, and validation techniques.
- Excellent analytical and communication skills.
Posted 4 weeks ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Snowflake.
- Experience or knowledge of Snowpipe, Time Travel, and Fail-safe.
- Write and optimize SQL queries for data extraction and transformation.
- Develop ETL processes to integrate various data sources into Snowflake.
- Monitor and troubleshoot data warehouse performance issues.
- Implement security measures and data governance practices.
- Sound knowledge of Snowflake architecture.
- Knowledge of Fivetran is an added advantage.
- Collaborate with cross-functional teams to support analytical and reporting needs.

Experience: 2 to 8 Years

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience with Snowflake and data warehousing concepts.
- Proficiency in SQL and ETL tools (e.g., Talend, Informatica, etc.).

Company Details:
One of the top-ranked IT companies in Ahmedabad, Gujarat. We are an ISO 9001:2015 & ISO 27001:2013 certified leading global technology solution provider. Globally present, with core focus on USA, Middle East, and Canada for services. Constantly enhancing our span of services around custom software development, Enterprise Mobility Solutions, and the Internet of Things. A family of multicultural, multi-talented, passionate, and well-experienced resources who consistently work to set new standards for customer satisfaction by implementing industry best practices.

Why Stridely?
· You will have opportunities to work on international enterprise-level projects of big-ticket size
· Interaction and co-ordination with US customers
· Employee-first approach along with customer-first approach
· Continuous learning, training, and knowledge-enhancement opportunities
· Self-development, career, and growth counseling within the organization
· Democratic and metro culture
· Strong and solid leadership
· Seeing your potential, you will get overseas visits, transfers, and exposure

URL: www.stridelysolutions.com
Employee strength: 500+
Working Days: 5 days a week
Location: Ahmedabad/Pune/Vadodara
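Snowpipe, named above, is defined in SQL as a pipe wrapping a COPY INTO statement over a stage. Here is a hedged sketch via the Snowflake Python connector; the account, stage, and table names are assumptions, and AUTO_INGEST additionally requires cloud event notifications to be configured.

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # assumed account identifier
    user="ETL_USER", password="...", warehouse="LOAD_WH",
    database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# A pipe continuously runs its COPY INTO as files land on the stage;
# AUTO_INGEST relies on cloud event notifications (e.g., S3 -> SQS).
cur.execute("""
CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders
  FROM @raw.orders_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# The same COPY INTO can be run ad hoc for a batch backfill.
cur.execute("""
COPY INTO raw.orders FROM @raw.orders_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
print(cur.fetchall())  # per-file load results
```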
Posted 4 weeks ago
5.0 years
0 Lacs
India
Remote
Job Position: Talend ETL Developer
Experience: 5+ Years
Location: Remote
Notice Period: Immediate Joiners

We are looking for an experienced Talend ETL Developer with a strong background in data integration, particularly with Kinaxis Maestro Planning Server. The ideal candidate will be responsible for designing, developing, deploying, and monitoring Talend ETL jobs, ensuring data accuracy, scalability, and process efficiency across enterprise systems such as SAP, Kinaxis, and various data warehouses.

Key Responsibilities
- Design, develop, and maintain Talend ETL jobs to integrate data into Kinaxis Maestro Planning Server.
- Deploy Talend jobs from the Nexus repository into Talend Administration Center (TAC).
- Schedule and monitor jobs using TAC with cron-based or file-based triggers.
- Create optimized execution plans for sequential or parallel processing of jobs.
- Integrate Talend workflows with XML, JSON, REST APIs, SFTP, flat files, and other data sources.
- Work with Kinaxis extractors to pull data from SAP/APO and data lakes.
- Collaborate with cross-functional teams for data validation, performance tuning, and alignment with business objectives.
- Perform unit testing and share results with business users for approval.
- Ensure adherence to coding standards, maintain technical documentation, and support CI/CD pipelines.

Required Skills & Qualifications
- 5+ years of hands-on experience with Talend ETL development.
- Strong experience with ETL design, data transformation, and Talend components (File, Database, API).
- Expertise in SQL and relational database management.
- Experience deploying jobs via Nexus and managing them in TAC Job Conductor.
- Proficiency in API integration using GET/POST requests.
- Familiarity with Kinaxis extractors and ERP systems like SAP/APO.
- Working knowledge of SFTP, XML, JSON, and RESTful APIs.
- Basic knowledge of Java or scripting languages (R, JavaScript) for automation.
- Experience with Git for version control and CI/CD pipelines.

Tech Stack
Must Have: Talend, SQL, Git, CI/CD
Good to Have: Java (Basic), NoSQL, SFTP, SAP APO, Kinaxis
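Triggering jobs in TAC, as described above, can also be done programmatically through TAC's MetaServlet HTTP API, which accepts a base64-encoded JSON command. The sketch below is assumption-heavy: the host, task ID, and credentials are invented, and the action name and parameters should be checked against your TAC version's MetaServlet reference.

```python
# Assumption-heavy sketch of triggering a TAC task via the MetaServlet API.
# Verify the endpoint path and action names against your TAC documentation.
import base64
import json
import requests

TAC = "http://tac.example.com:8080/org.talend.administrator"  # assumed host

command = {
    "actionName": "runTask",          # assumed MetaServlet action
    "authUser": "admin@example.com",  # assumed credentials
    "authPass": "...",
    "taskId": 42,                     # assumed Job Conductor task ID
    "mode": "synchronous",
}
encoded = base64.b64encode(json.dumps(command).encode("utf-8")).decode("ascii")

resp = requests.get(f"{TAC}/metaServlet?{encoded}", timeout=60)
print(resp.json())  # returnCode 0 conventionally indicates success
```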
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role description

Skill: Big Data - Spark, SQL, Scala

Mandatory Certification (any one):
- Cloudera CCA Spark and Hadoop Developer (CCA175)
- Databricks Certified Developer - Apache Spark 2.X
- Hortonworks HDP Certified Apache Spark Developer

Job Description
Experience in Scala programming; experience in Big Data technologies including Spark, Scala, and Kafka. We are looking for engineers:
- Who have a good understanding of organizational strategy, architecture patterns (Microservices, Event Driven), and technology choices, and can coach the team in executing in alignment with these guidelines
- Who can apply organizational technology patterns effectively in projects and make recommendations on alternate options
- Who have hands-on experience working with large volumes of data, including different patterns of data ingestion, processing (batch, real time), movement, storage, and access, both internal and external to the BU, and the ability to make independent decisions within the scope of a project
- Who have a good understanding of data structures and algorithms
- Who can test, debug, and fix issues within established SLAs
- Who can design software that is easily testable and observable
- Who understand how team goals fit a business need
- Who can identify business problems at the project level and provide solutions
- Who understand data access patterns, streaming technology, data validation, data performance, and cost optimization
- Strong SQL skills; ETL experience (Talend preferred, or any other ETL tool)
- Experience with Linux OS at user level
- Python or R programming skills good to have but not mandatory

Mandatory Skill(s): Apache Spark, Scala, SparkSQL, Big Data Hadoop Ecosystem
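The Spark SQL skills the role asks for look the same in any binding; below is a minimal, runnable PySpark sketch (the posting is Scala-first, but the DataFrame/SQL API shown here maps one-to-one to Scala). The data and column names are assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# A tiny ingested batch; in the role above this would come from Kafka or files.
df = spark.createDataFrame(
    [("A1", "2024-06-30", 100.0), ("A2", "2024-06-30", 250.5)],
    ["account", "as_of_date", "amount"],
)
df.createOrReplaceTempView("positions")

# Spark SQL over the ingested data: daily totals per account.
spark.sql("""
    SELECT account, as_of_date, SUM(amount) AS total
    FROM positions
    GROUP BY account, as_of_date
""").show()

spark.stop()
```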
Posted 4 weeks ago
7.0 - 8.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
.NET Lead requirement for ITC Infotech - Kolkata.
Job Location: Kolkata (5 days work from office)
Mode of interview: 1st round through MS Teams, 2nd round face-to-face discussion.
Required: candidates who can join within 30 days.
Experience: 7 to 8 Years

Required Skills
- Programming skills: .NET Core (C#), Angular, React, JavaScript, and SQL Server.
- Data formats: JSON and XML data formats, including parsing, serialization, and deserialization.
- Integration experience: experience with integrating manufacturing systems (MES, SCADA, etc.) with enterprise applications (SAP, ERP, etc.) is a plus but not mandatory.
- ETL tools: experience with ETL tools such as Microsoft SSIS, Talend, or Apache NiFi.
- SAP integration: experience integrating with SAP systems using SAP PI, SAP PO, or other integration technologies.
- Database skills: strong understanding of database design, data modelling, and SQL Server.
- Deployment and debugging: experience with deploying and debugging bespoke applications.

Key Responsibilities
- Design and develop integrations between manufacturing systems (MES, SCADA, etc.) and enterprise applications (SAP, ERP, etc.) using .NET & .NET Core, JavaScript, JSON, XML, and SQL Server.
- Bespoke development: design, develop, test, deploy, and debug custom solutions using .NET Core to meet specific business requirements.
- Deployment: deploy bespoke applications to production environments, ensuring a smooth transition and minimal downtime.
- Debugging: troubleshoot and debug to identify and resolve issues in applications.
- SAP integration: integrate with SAP systems using SAP PI, SAP PO, or other integration technologies.
- ETL tools: integrate with ETL tools such as Microsoft SSIS, Apache NiFi, or Talend.
- Data modelling and database design: design and implement data models and database schemas to support integration requirements.
- Troubleshooting and support: provide technical support and troubleshooting for integration issues.
- Collaborate with stakeholders, including manufacturing teams, IT teams, and external partners, to ensure successful project delivery.

Nice to Have Skills
- MES and SCADA systems: knowledge of Manufacturing Execution Systems (MES) and Supervisory Control and Data Acquisition (SCADA) systems.
- OPC applications: experience with OPC (Open Platform Communications) applications and protocols.
- Cloud computing platforms: experience with cloud computing platforms such as AWS and Azure.
- Industrial IoT: knowledge of Industrial IoT (IIoT) concepts and technologies.
Posted 4 weeks ago
10.0 - 15.0 years
30 - 40 Lacs
Noida
Work from Office
About the Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About the Role:
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties and Responsibilities:
- Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
- Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements.
- Utilize and optimize a wide array of AWS data services.
- Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
- Ensure data quality, integrity, and security across all data pipelines and storage solutions.
- Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
- Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
- Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
- Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
- Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications:
- 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
- Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
- Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; good to have experience with AWS EMR for big data processing.
- Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles.
- Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Database skills: strong understanding of relational databases and NoSQL databases.
- Version control: experience with version control systems (e.g., Git).
- Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
Communication: Strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Preferred Qualifications: Certifications in AWS Data Analytics or other relevant areas.
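Below is a minimal sketch of the kind of real-time ingestion work this role describes, assuming an existing AWS Kinesis stream and configured AWS credentials; the stream name, region, and event fields are illustrative assumptions, not details from the posting.

```python
import json

import boto3  # AWS SDK for Python

STREAM_NAME = "clickstream-events"  # hypothetical stream name

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(event: dict) -> None:
    """Push one JSON event onto the Kinesis stream.

    PartitionKey controls shard routing; keying by user_id keeps a given
    user's events ordered within a single shard.
    """
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "anonymous")),
    )

if __name__ == "__main__":
    publish_event({"user_id": 42, "action": "page_view", "path": "/pricing"})
```

A production pipeline would add batching via put_records, retry handling, and monitoring, but the producer-side contract shown here is the core of Kinesis ingestion.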
Posted 4 weeks ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Greetings from Analytix Solutions!
We are seeking an experienced and motivated Senior Data Engineer to join our AI & Automation team. The ideal candidate will have 6-8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Company Name: Analytix Business Solutions (US-based MNC)
Company at a Glance: We are a premier knowledge process outsourcing unit based in Ahmedabad, fully owned by Analytix Solutions LLC, headquartered in the USA. We customize a wide array of business solutions, including IT services, audio-visual services, data management services, and finance and accounting services, for small and mid-size companies across diverse industries. We partner with and offer our services to restaurants, dental practices, Dunkin' Donuts franchises, hotels, veterinary services, and others, including start-ups from any other industry. For more details about our organization, please visit https://www.analytix.com/
LinkedIn: Analytix Business Solutions (India) Pvt. Ltd.: My Company | LinkedIn

Roles & Responsibilities
Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms (a minimal orchestration sketch follows this posting).
Architect and optimize data storage solutions to ensure reliability, security, and scalability.
Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
Collaborate with cross-functional teams (data scientists, analysts, and engineers) to understand and deliver on data requirements.
Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
Create and maintain comprehensive documentation for data systems, workflows, and models.
Implement data modeling best practices and optimize data retrieval processes for better performance.
Stay up to date with emerging technologies and bring innovative solutions to the team.

Competencies & Skills
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
6-8 years of experience in data engineering, designing and managing large-scale data systems.
Advanced knowledge of database management systems and ETL/ELT processes.
Expertise in data modeling, data quality, and data governance.
Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools.
Familiarity with AI/ML technologies and their application in data engineering.
Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
Ability to work independently, lead projects, and mentor junior team members.
Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

Technology Stacks
Strong expertise in database technologies, including:
SQL databases: PostgreSQL, MySQL, SQL Server
NoSQL databases: MongoDB, Cassandra
Data warehouse/unified platforms: Snowflake, Redshift, BigQuery, Microsoft Fabric
Hands-on experience implementing and working with generative AI tools and models in production workflows.
Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
Strong understanding of data architecture, data modeling, and data governance principles.
Experience with cloud platforms (preferably Azure) and associated data services.

Our EVP (Employee Value Proposition)
5-day work week
24 earned and casual leaves in total, plus 8 public holidays
Compensatory off
Personal development allowances
Opportunity to work with US clients
Career progression and learning & development
Loyalty bonus benefits
Medical reimbursement
Standard salary as per market norms
Magnificent and dynamic culture
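Since this role pairs Python with orchestration tools such as Apache Airflow, here is a minimal sketch of a daily pipeline in Airflow 2.x style; the DAG id, file paths, and quality rule are assumptions for illustration, not details from the posting.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative paths; a real pipeline would read these from
# Airflow Connections/Variables rather than hard-coding them.
SOURCE_CSV = "/data/raw/orders.csv"
TARGET_PARQUET = "/data/curated/orders.parquet"

def extract_transform_load() -> None:
    """Tiny ELT step: read raw CSV, enforce a basic quality rule, write Parquet."""
    df = pd.read_csv(SOURCE_CSV)
    df = df.dropna(subset=["order_id"])           # drop rows missing the key
    df["order_date"] = pd.to_datetime(df["order_date"])
    df.to_parquet(TARGET_PARQUET, index=False)

with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="etl_orders", python_callable=extract_transform_load)
```

In practice, heavier transforms would move to PySpark or a warehouse-side ELT step, with this DAG reduced to orchestration and monitoring.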
Posted 4 weeks ago
8.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
Remote
Job Title: Freelance Data Pipeline Engineer (ETL, OLAP, Healthcare: HL7, FHIR)
Employment Type: Full-Time Freelancer (Independent Contractor)
Work Mode: Remote / Permanent Work From Home
Work Schedule: Monday to Friday, 8:00 PM to 5:00 AM IST (Night Shift)
Compensation: 95K-100K per month (subject to applicable TDS deductions)
Contract Duration: Initial three-month engagement, extendable based on performance and project requirements

Required Skills & Qualifications
8+ years of experience in data engineering, ETL development, and pipeline automation.
Strong understanding of data warehousing and OLAP concepts.
Proven expertise in handling healthcare data using HL7, FHIR, and CCD/C-CDA (a minimal FHIR sketch follows this posting).
Proficiency in SQL and scripting languages such as Python or Bash.
Experience with data integration tools (e.g., Apache NiFi, Talend, Informatica, SSIS).
Familiarity with cloud data services (AWS, Azure, or GCP) is a plus.
Strong knowledge of data quality assurance, data profiling, and data governance.
Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred).

For more details, kindly share your resume: (ref:hirist.tech)
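To make the FHIR requirement concrete, here is a minimal sketch of fetching and structurally validating a Patient resource over the standard FHIR REST API; the base URL points at the public HAPI R4 test server and the patient id "example" is a placeholder, both assumptions rather than details from the posting.

```python
import requests

# Public HAPI FHIR R4 test server; swap in your own base URL in practice.
FHIR_BASE = "https://hapi.fhir.org/baseR4"

def fetch_patient(patient_id: str) -> dict:
    """Fetch a FHIR Patient resource and do minimal structural validation."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},  # FHIR JSON media type
        timeout=30,
    )
    resp.raise_for_status()
    resource = resp.json()
    if resource.get("resourceType") != "Patient":
        raise ValueError(f"Expected Patient, got {resource.get('resourceType')}")
    return resource

if __name__ == "__main__":
    patient = fetch_patient("example")  # placeholder id
    names = patient.get("name", [])
    print(names[0].get("family") if names else "no name on record")
```

A real ingestion pipeline would page through Bundle search results and map resources into warehouse tables, but the request/validate contract above is the entry point.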
Posted 4 weeks ago
1.0 - 3.0 years
3 - 8 Lacs
Hyderabad, Pune
Work from Office
Responsibilities
Build, maintain, and support applications in a global software platform and various other corporate systems, tools, and scripts.
Collaborate with other internal groups to translate business and functional requirements into technical implementations, automating existing processes and developing new applications.
Communicate with internal customers in non-technical terms, understand business requirements, and propose solutions.
Manage projects from specification gathering through development, QA, user acceptance testing, and deployment to production.
Document changes and follow proper SDLC procedures.
Enhance the team and coworkers through knowledge sharing and by implementing best practices in day-to-day activities.
Take initiative to continually learn and enhance technical knowledge and skills.

Knowledge and Experience
BS degree, preferably in CS or EE or a related discipline.
2-3 years' experience as an integration developer using applications such as Talend or MuleSoft.
Familiarity with building multi-threaded applications and some understanding of distributed messaging systems like Kafka and RabbitMQ (a minimal publisher sketch follows this posting).
Experience in developing REST-based services.
Familiarity with different data formats such as JSON and XML.
High proficiency in RDBMS concepts and SQL.
Understanding of design patterns and object-oriented design concepts.
Experience with deployment automation tools such as Jenkins, Artifactory, and Maven.
Strong written and verbal communication skills.
Ability to multitask and work independently on multiple projects.

Preferred
Linux, Bash, SSH familiarity.
Experience with applications like Salesforce, ServiceNow, ORMB, and other financial applications.
Financial industry expertise.
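As a small illustration of the messaging side of this role, below is a minimal sketch of publishing a JSON message to a durable RabbitMQ queue using the pika client; the host, queue name, and payload are hypothetical.

```python
import json

import pika  # RabbitMQ client library for Python

QUEUE = "integration.events"  # illustrative queue name

def publish(payload: dict) -> None:
    """Publish a JSON message to a durable RabbitMQ queue."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)  # queue survives broker restarts
    channel.basic_publish(
        exchange="",               # default exchange routes by queue name
        routing_key=QUEUE,
        body=json.dumps(payload),
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )
    connection.close()

publish({"source": "salesforce", "event": "account.updated", "id": "12345"})
```

Kafka would replace the per-message connection with a long-lived producer, but the pattern of serializing application events onto a durable broker is the same.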
Posted 4 weeks ago
1.0 - 2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

WHO WE ARE
Come join our Technology Team and start reimagining the future of the automotive aftermarket. We are a highly motivated, tech-focused organization, excited to be amid dynamic innovation and transformational change. Driven by Advance's top-down commitment to empowering our team members, we are focused on delighting our Customers with Care and Speed through delivery of world-class technology solutions and products. We value and cultivate our culture by seeking to always be collaborative, intellectually curious, fun, open, and diverse. You will be a key member of a growing and passionate group focused on collaborating across business and technology resources to drive forward key programs and projects building enterprise capabilities across Advance Auto Parts.

The Opportunity
Join the AAP team and start reimagining the future of automotive retail. Disrupt the way consumers buy auto parts and take on the industry's biggest challengers to execute on AAP's top-down commitment to digital expansion. As a member of the Advance Auto Parts team, you will have an opportunity to disrupt a $150B auto parts industry to bring better and faster solutions to customers. You will be part of a team helping the company live its mission of "Advancing a World in Motion". The role is part of a merit-based organization with a culture of professional growth and development, and an emphasis on the latest tools, platforms, and technologies.

Responsibilities
Help in the migration and modernization of data platforms, moving applications and pipelines to Google Cloud-based solutions.
Ensure data security and governance, enforcing compliance with industry standards and regulations.
Help manage and scale data pipelines from internal and external data sources to support new product launches and ensure high data quality.
Help develop automation and monitoring frameworks to capture key metrics and operational KPIs for pipeline performance.
Collaborate with internal teams, including data science and product teams, to drive solutioning and proof-of-concept (PoC) discussions.
Help develop and optimize procedures to transition data into production.
Create and maintain technical documentation for sharing knowledge.
Help develop reusable packages and libraries to enhance development efficiency.
Help develop real-time and batch data processing solutions, integrating structured and unstructured data sources (a minimal batch sketch follows this posting).

Required Qualifications
1-2 years of experience in data engineering and application development.
A graduate degree in Computer Science or a related field of study.
Experience with programming languages such as Python, Java, Spark, and Scala, with strong data structures and algorithms fundamentals; expertise in Python and Spark is a must.
Exposure to AWS and cloud technologies.
Experience in data platform engineering, with a focus on cloud transformation and modernization.
Hands-on experience building large, scaled data pipelines in cloud environments and handling data at petabyte scale.
Experience with CI/CD pipeline management in GCP DevOps.
Understanding of data governance, security, and compliance best practices.
Experience working in an Agile development environment.
Prior experience in migrating applications from legacy platforms to the cloud.
Knowledge of Terraform or Infrastructure-as-Code (IaC) for cloud resource management.
Familiarity with Kafka, Event Hubs, or other real-time data streaming solutions.
Experience with legacy RDBMS (Oracle, DB2, Teradata) and DataStage/Talend.
Background supporting data science models in production.

California Residents Click Below For Privacy Notice
https://jobs.advanceautoparts.com/us/en/disclosures
We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, basis of disability, or any other federal, state, or local protected class.
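Given the posting's emphasis on Python and Spark for batch processing, here is a minimal PySpark sketch of a partitioned batch job; the bucket paths, column names, and quality filter are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative GCS-style paths; bucket names are placeholders.
SOURCE = "gs://example-raw/sales/"
TARGET = "gs://example-curated/sales/"

spark = SparkSession.builder.appName("sales_batch_pipeline").getOrCreate()

df = spark.read.parquet(SOURCE)

curated = (
    df.filter(F.col("amount") > 0)                    # drop obviously bad rows
      .withColumn("sale_date", F.to_date("sale_ts"))  # derive the partition key
)

# Partitioned writes keep downstream scans cheap as data volumes grow.
curated.write.mode("overwrite").partitionBy("sale_date").parquet(TARGET)

spark.stop()
```

The same skeleton scales from gigabytes to petabytes because Spark parallelizes the read, transform, and partitioned write across the cluster.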
Posted 4 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities
Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
Develop automated test scripts using Python or R for data validation and reconciliation (a minimal reconciliation sketch follows this posting).
Perform source-to-target data verification, transformation logic testing, and regression testing.
Collaborate with data engineers and analysts to understand business requirements and data flows.
Identify data anomalies and work with development teams to resolve issues.
Maintain test documentation, including test cases, test results, and defect logs.
Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications
Strong experience in ETL testing across various data sources and targets.
Proficiency in Python or R for scripting and automation.
Solid understanding of SQL and relational databases.
Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
Experience with test management tools (e.g., JIRA, TestRail).
Knowledge of data profiling, data quality frameworks, and validation techniques.
Excellent analytical and communication skills.
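As an example of the automated validation work described here, below is a minimal sketch of a source-to-target reconciliation in pandas; the file names and key column are hypothetical.

```python
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Basic source-to-target checks: row counts, missing keys, duplicates."""
    merged = source[[key]].merge(target[[key]], on=key, how="outer", indicator=True)
    return {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_in_target": int((merged["_merge"] == "left_only").sum()),
        "unexpected_in_target": int((merged["_merge"] == "right_only").sum()),
        "duplicate_target_keys": int(target[key].duplicated().sum()),
    }

src = pd.read_csv("source_extract.csv")   # illustrative file names
tgt = pd.read_csv("warehouse_load.csv")
report = reconcile(src, tgt, key="order_id")
assert report["missing_in_target"] == 0, f"Reconciliation failed: {report}"
```

Checks like these are easy to wire into a test runner such as pytest, turning each source-to-target mapping into a repeatable regression test.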
Posted 4 weeks ago