
1520 Talend Jobs - Page 23

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 12.0 years

0 Lacs

Delhi, India

On-site

Greetings from TCS! TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there's nothing that can stop us from growing together.

Job Title: Talend ETL Developer
Experience: 5-12 years of experience in ETL development and data integration.

Job Summary: We are looking for an experienced Talend ETL Developer. The candidate will be responsible for implementing ETL processes, ensuring efficient data connectivity, and handling Talend job monitoring. He/she should have hands-on experience with Talend Data Integration and SQL, and an understanding of API functionality.

Key Responsibilities:
1. Design, develop, and deploy ETL workflows using Talend.
2. Establish and manage secure database connections with external interfaces.
3. Create and schedule batch jobs for data processing.
4. Monitor and manage jobs using Talend Management Console.
5. Ensure data flow integrity between connected applications.
6. Analyze and resolve security vulnerabilities.
7. Understand data flow and data mapping.

Required Skills:
1. Strong understanding of ETL processes and data integration.
2. Proficiency in SQL and SQL Server.
3. Core Java (good to have).
4. Experience with Talend Management Console hosted on Azure Cloud.
5. Knowledge of API functionality and connection protocols (preferred).
6. Experience with ESB (Enterprise Service Bus) integration.
7. Strong problem-solving skills.
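Talend jobs like the ones described above are assembled visually in Talend Studio rather than hand-written, but the underlying batch ETL pattern (extract from a source database, transform, load into a target) can be sketched in plain Python. The sketch below is illustrative only, not TCS's implementation; the connection strings, tables, and columns are hypothetical.

```python
# Minimal sketch of the extract-transform-load pattern a Talend batch job
# automates; connection strings, tables, and columns are hypothetical.
import pyodbc

SRC = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;DATABASE=staging;Trusted_Connection=yes"
DST = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dwh-host;DATABASE=warehouse;Trusted_Connection=yes"

def run_batch(batch_size: int = 1000) -> None:
    with pyodbc.connect(SRC) as src, pyodbc.connect(DST) as dst:
        cur = src.cursor()
        cur.execute("SELECT id, amount, currency FROM orders WHERE processed = 0")
        out = dst.cursor()
        while True:
            rows = cur.fetchmany(batch_size)
            if not rows:
                break
            # Transform step: normalize currency codes before loading.
            cleaned = [(r.id, r.amount, r.currency.strip().upper()) for r in rows]
            out.executemany(
                "INSERT INTO fact_orders (order_id, amount, currency) VALUES (?, ?, ?)",
                cleaned,
            )
        dst.commit()

if __name__ == "__main__":
    run_batch()
```

In a real deployment this loop would be a Talend job scheduled and monitored through Talend Management Console, as the responsibilities above describe.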

Posted 3 weeks ago

Apply

6.0 - 9.0 years

12 - 18 Lacs

Bengaluru

Work from Office

Role & responsibilities: Strong hands-on Talend development skills, plus Snowflake and SQL.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SnapLogic
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must have: strong experience in SnapLogic.
- Good to have: experience in other ETL tools such as Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Qualification: 15 years of full time education

Posted 3 weeks ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams in developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You will be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional Skills:
Primary skills: Technology - Data Management - Data Integration - Talend
Preferred skills: Technology - Data Management - Data Integration - Talend

Posted 3 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.

You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

How You Will Contribute
You will:
- Operationalize and automate activities for efficiency and timely production of data visuals
- Assist in providing accessibility, retrievability, security, and protection of data in an ethical manner
- Search for ways to get new data sources and assess their accuracy
- Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
- Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition, and interpretation
- Validate information from multiple sources
- Assess issues that might prevent the organization from making maximum use of its information assets

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing, and maintaining new systems
- Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge, and combine data
- Ability to simplify complex problems and communicate to a broad audience

In This Role
As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:
- Programming: Python, PySpark, Go/Java
- Database: SQL, PL/SQL
- ETL & Integration: dbt, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
- Data Warehousing: SCD, schema types, data marts
- Visualization: Databricks Notebook, Power BI (optional), Tableau (optional), Looker
- GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
- AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
- Azure Cloud Services: Azure Data Lake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics
- Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyze data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Data Science
Analytics & Data Science
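As a rough illustration of the pipeline duties listed above (extract, transform, validate, load), here is a minimal PySpark sketch. It is a hedged example, not Mondelēz code: the S3 paths, columns, and the 5% rejection threshold are hypothetical.

```python
# Minimal PySpark sketch of an extract-transform-validate-load pipeline;
# paths, columns, and the validation threshold are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

raw = spark.read.option("header", True).csv("s3://landing/orders/2024-01-01/")

cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
)

# Simple data-quality gate: fail the run if too many rows were dropped.
if cleaned.count() < raw.count() * 0.95:
    raise ValueError("More than 5% of rows failed validation; aborting load")

cleaned.write.mode("append").partitionBy("order_date").parquet("s3://curated/orders/")
```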

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Your work days are brighter here.

At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture, a culture driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That's why we look after our people, communities, and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don't need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or to pay for consulting or coaching services, in order to apply for a job at Workday.

About The Team
The Enterprise Data & AI Technologies and Architecture (EDATA) organization is a dynamic and evolving team that is spearheading Workday's growth through trusted data excellence, innovation, and architectural thought leadership. Equipped with an array of skills in data science, engineering, and analytics, this team orchestrates the flow of data across our growing company while ensuring data accessibility, accuracy, and security. With a relentless focus on innovation and efficiency, Workmates in EDATA enable the transformation of complex data sets into actionable insights that fuel strategic decisions and position Workday at the forefront of the technology industry. EDATA is a global team distributed across the U.S., India, and Canada.

About The Role
Join a pioneering organization at the forefront of technological advancement, dedicated to leveraging data-driven insights to transform industries and drive innovation. We are seeking a highly skilled and motivated Data Quality Engineer to join our dynamic team. The ideal candidate loves to learn, is detail oriented, and has exceptional critical thinking and analytical skills. As a Data Quality Engineer, you will play a critical role in ensuring the accuracy, consistency, and completeness of our data across the enterprise data platform. You will be responsible for designing, developing, and implementing data quality processes, standards, and best practices across various data sources and systems to identify and resolve data issues. This role offers an exciting opportunity to learn and to collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to drive data quality improvements and enhance decision-making capabilities.

Responsibilities
The incumbent will be responsible for (but not limited to) the following:
- Design and automate data quality checks; resolve issues and improve data pipelines with engineering and product teams.
- Collaborate with stakeholders to define data quality requirements and best practices.
- Develop test automation strategies and integrate checks into CI/CD pipelines.
- Monitor data quality metrics, identify root causes, and drive continuous improvements.
- Provide guidance on data quality standards across projects.
- Work with Data Ops to address production issues and document quality processes.

About You
Basic Qualifications
- 5+ years of experience as a Data Quality Engineer in data quality management or data governance.
- Good understanding of data management concepts, including data profiling, data cleansing, and data integration.
- Proficiency in SQL for data querying and manipulation.
- Experience developing and executing automated data quality tests using tools like SQL, Python (PySpark), and data quality frameworks.
- Hands-on experience with cloud platforms (AWS/GCP), data warehouses (Snowflake, Databricks, Redshift), and integration tools (SnapLogic, dbt, Talend, etc.).
- Exposure to data quality monitoring tools (e.g., Acceldata, Tricentis) and CI/CD or DevOps practices is a plus.

Other Qualifications
- Proven ability to prioritize and manage multiple tasks in a fast-paced environment.
- Certification in relevant technologies or data management disciplines is a plus.
- Analytical mindset with the ability to think strategically and make data-driven decisions.

If you are a results-driven individual with a passion for data and analytics and a proven track record in data quality assurance, we invite you to apply for this exciting opportunity. Join our team and contribute to the success of our data-driven initiatives.

Our Approach to Flexible Work
With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional about making the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter.

Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
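To make the "design and automate data quality checks" responsibility concrete, here is a minimal PySpark sketch of rule-based checks that a CI/CD stage could run. The table name, columns, and thresholds are hypothetical, and a production team would typically use one of the data quality frameworks the posting mentions rather than an ad-hoc script.

```python
# Hedged sketch of automated data-quality checks (completeness and
# uniqueness rules); table, columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.table("analytics.customers")

failures = []

# Completeness rule: no more than 1% of emails may be missing.
null_ratio = df.filter(F.col("email").isNull()).count() / max(df.count(), 1)
if null_ratio > 0.01:
    failures.append(f"email null ratio {null_ratio:.2%} exceeds 1%")

# Uniqueness rule: customer_id must be unique.
dupes = df.groupBy("customer_id").count().filter("count > 1").count()
if dupes:
    failures.append(f"{dupes} duplicate customer_id values")

if failures:
    # Exiting non-zero lets a CI/CD stage fail the pipeline on bad data.
    raise SystemExit("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed")
```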

Posted 3 weeks ago

Apply

12.0 years

4 - 6 Lacs

Hyderābād

On-site

Who we are looking for:
Contribute to the enhancement and maintenance of the cloud native Vendor Analytics next generation workspace. Provide engineering troubleshooting assistance to customer support teams and other development teams within Charles River. We are looking for a back-end, cloud native engineer to help build out a workspace with a cloud native, Java SpringBoot back end deployed to Kubernetes.

What you will be responsible for:
- Work under minimal supervision to analyze, design, develop, test, and debug medium to large software enhancements and solutions within Charles River's business and technical problem domains
- Collaborate with Business Analysts and Product Managers to turn moderately complex business requirements into working and sustainable software
- Provide thought leadership in the design of product architecture within the team's scope of responsibility
- Develop, test, debug, and implement software programs, applications, and projects using Java, SQL, JavaScript, or other related software engineering languages
- Provide informed guidance and direction in code reviews
- Write unit and automation tests to ensure a high-quality end product
- Assist in improving development test methodologies and contribute to related test methodology frameworks
- Conduct manual tests to ensure a high-quality end product
- Contribute to written design and API documentation, and participate in the customer documentation process
- Actively participate in the agile software development process by adhering to the CRD Scrum methodology, including attending all daily standups, sprint planning, backlog grooming, and retrospectives
- Participate in cross-team group activities to complete assignments
- Provide mentoring to junior staff

Qualifications:
Education: B.Tech. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics, or another technical course of study required. M.Tech degree strongly preferred.

Experience:
- 12+ years of progressively responsible professional software engineering experience, preferably in a financial services product delivery setting
- Experience developing enterprise software deployed to one of the major cloud providers (Azure, AWS, Google) is essential
- Experience with Java SpringBoot development in cloud native applications is mandatory
- Nice to have: experience with ETL tools (Talend/Kettle)
- Experience with GitHub is helpful
- Experience with REST and Postman is helpful
- Experience with Kubernetes and Kafka is preferred
- 4 to 7 years of experience using SQL, including DDL and DML; experience with Snowflake is a plus
- 4 to 7 years of experience in financial services developing solutions for Portfolio Management, Trading, Compliance, Post-Trade, IBOR, or Wealth Management is strongly desired
- Authoritative experience with object-oriented programming, compiler or interpreter technologies, embedded systems, operating systems, relational databases (RDBMS), scripting, and new/advanced programming languages
- Able to contribute to complex design specs in consultation with senior staff
- Able to work on medium to large projects with no supervision and on more complex tasks with minimal oversight
- Excellent written and verbal communication skills
- Able to work well with peers in a collaborative team environment
- A minimum of 5 years working with an Agile development methodology strongly desired

Posted 3 weeks ago

Apply

4.0 years

4 Lacs

Hyderābād

On-site

About Us: Ventra is a leading business solutions provider for facility-based physicians practicing anesthesia, emergency medicine, hospital medicine, pathology, and radiology. Focused on Revenue Cycle Management, Ventra partners with private practices, hospitals, health systems, and ambulatory surgery centers to deliver transparent and data-driven solutions that solve the most complex revenue and reimbursement issues, enabling clinicians to focus on providing outstanding care to their patients and communities.

Come Join Our Team! As part of our robust Rewards & Recognition program, this role is eligible for our Ventra performance-based incentive plan, because we believe great work deserves great rewards. Help Us Grow Our Dream Team: join us, refer a friend, and earn a referral bonus!

Job Summary: We are seeking a Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions. The ideal candidate will have proven, hands-on data engineering expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The data engineer role has a day-to-day focus on implementation, performance optimization, and scalability. This is a tactical role requiring independent data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake. This role will take direction from the Lead Snowflake Data Engineer and Director of Data Engineering while bringing its own domain expertise and experience.

Essential Functions and Tasks:
- Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Optimize Snowflake database performance.
- Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
- Ensure data quality, integrity, and governance.
- Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4+ years of in-depth data engineering experience, with at least one year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
- Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
- Strong experience with cloud platforms (preference to Azure) and their data services.
- Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
- Hands-on experience with scripting languages like Python for data processing.
- Snowflake SnowPro certification, with preference to the engineering course path.
- Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
- Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
- Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
- Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
- Ability to self-manage medium-complexity deliverables and document user stories and tasks through Azure DevOps.
- Personal accountability for committed sprint user stories and tasks.
- Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Ability to read, understand, and apply state/federal laws, regulations, and policies.
- Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
- Ability to remain flexible and work within a collaborative and fast-paced environment.
- Understand and comply with company policies and procedures.
- Strong oral, written, and interpersonal communication skills.
- Strong time management and organizational skills.

Compensation: Base compensation will be based on various factors unique to each candidate, including geographic location, skill set, experience, qualifications, and other job-related reasons. This position is also eligible for a discretionary incentive bonus in accordance with company policies.

Ventra Health: Equal Employment Opportunity (applicable only in the US)
Ventra Health is an equal opportunity employer committed to fostering a culturally diverse organization. We strive for inclusiveness and a workplace where mutual respect is paramount. We encourage applications from a diverse pool of candidates, and all qualified applicants will receive consideration for employment without regard to race, color, ethnicity, religion, sex, age, national origin, disability, sexual orientation, gender identity and expression, or veteran status. We will provide reasonable accommodations to qualified individuals with disabilities, as needed, to assist them in performing essential job functions.

Recruitment Agencies: Ventra Health does not accept unsolicited agency resumes and is not responsible for any fees related to unsolicited resumes.

Solicitation of Payment: Ventra Health does not solicit payment from applicants or candidates for consideration or placement.

Attention Candidates: Please be aware that there have been reports of individuals falsely claiming to represent Ventra Health or one of our affiliated entities, Ventra Health Private Limited and Ventra Health Global Services. These scammers may attempt to conduct fake interviews, solicit personal information, and, in some cases, have sent fraudulent offer letters. To protect yourself, verify any communication you receive by contacting us directly through our official channels. If you have any doubts, please contact us at Careers@VentraHealth.com to confirm the legitimacy of the offer and the person who contacted you. All legitimate roles are posted on https://ventrahealth.com/careers/.

Statement of Accessibility: Ventra Health is committed to making our digital experiences accessible to all users, regardless of ability or assistive technology preferences. We continually work to enhance the user experience through ongoing improvements and adherence to accessibility standards. Please review at https://ventrahealth.com/statement-of-accessibility/.
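As a hedged sketch of what "ETL/ELT workflows using Snowflake" can look like in practice, the following uses the snowflake-connector-python package to bulk-load staged files and then transform inside Snowflake (the ELT pattern). The account settings, stage, and table names are hypothetical placeholders, not Ventra's environment.

```python
# Minimal ELT sketch against Snowflake; credentials, stage, and table
# names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Extract/Load: ingest staged CSV files into a raw table.
    cur.execute(
        "COPY INTO raw.claims FROM @claims_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform: clean and type the data inside Snowflake itself.
    cur.execute("""
        INSERT INTO marts.fct_claims
        SELECT claim_id, patient_id, TRY_TO_DECIMAL(amount, 12, 2), submitted_at
        FROM raw.claims
        WHERE claim_id IS NOT NULL
    """)
finally:
    conn.close()
```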

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai

On-site

Talend: designing and developing technical architecture and data pipelines, with performance scaling, using Talend data integration tools to ensure data quality in a big data environment.

- Very strong in PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake; developing scripts (Unix, Python, etc.) to extract, load, and transform data.
- Talend knowledge and hands-on experience is good to have; candidates who have worked in production support are preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures, along with Python.
- Perform data analysis, troubleshoot data issues, and provide technical support to end users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend / Snowflake certification is desirable.
- Excellent SQL coding, communication, and documentation skills.
- Familiarity with the Agile delivery process.
- Must be analytical, creative, and self-motivated, and work effectively within a global team environment.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
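For the production-support angle above, Snowflake's Time Travel and Streams are the usual troubleshooting tools. The sketch below (via snowflake-connector-python) is a generic illustration with hypothetical connection details and object names, not Virtusa's setup.

```python
# Hedged sketch: investigating a suspect load with Snowflake Time Travel
# and checking a stream's backlog; all names are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="SUPPORT_WH",
)
cur = conn.cursor()

# Time Travel: compare current row counts with the table an hour ago.
cur.execute("SELECT COUNT(*) FROM analytics.orders")
now = cur.fetchone()[0]
cur.execute("SELECT COUNT(*) FROM analytics.orders AT(OFFSET => -3600)")
hour_ago = cur.fetchone()[0]
print(f"rows now={now}, one hour ago={hour_ago}")

# Streams: check whether a change stream still has unconsumed rows.
cur.execute("SELECT SYSTEM$STREAM_HAS_DATA('ANALYTICS.ORDERS_STREAM')")
print("stream has pending data:", cur.fetchone()[0])

conn.close()
```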

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Chennai

On-site

The Data Engineering Technology Lead is a senior-level position responsible for establishing and implementing new or revised data platform ecosystems and programs in coordination with the Technology team. The overall objective of this role is to lead the data engineering team in implementing the business requirements.

Responsibilities:
- Design, build, and maintain batch or real-time data pipelines in the data platform.
- Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources.
- Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources.
- Automate data workflows such as data ingestion, aggregation, and ETL processing.
- Prepare raw data in data warehouses into consumable datasets for both technical and non-technical stakeholders.
- Partner with data scientists and functional leaders in sales, marketing, and product to deploy machine learning models in production.
- Build, maintain, and deploy data products for analytics and data science teams on the data platform.
- Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
- Monitor data systems performance and implement optimization strategies.
- Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership.
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements.
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards.
- Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation.
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals.
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions.
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 12+ years of total experience and 7+ years of relevant experience in a data engineering role.
- Advanced SQL skills and experience with relational databases and database design.
- Strong proficiency in object-oriented languages; Python/PySpark is a must.
- Experience working with data ingestion tools such as Talend and Ab Initio.
- Experience working with data lakehouse architectures such as Iceberg/Starburst.
- Strong proficiency in scripting languages like Bash.
- Strong proficiency in data pipeline and workflow management tools.
- Strong project management and organizational skills.
- Excellent problem-solving and communication skills.
- Proven ability to work independently and with a team.
- Experience in managing and implementing successful projects.
- Ability to adjust priorities quickly as circumstances dictate.
- Demonstrated leadership and project management skills.
- Consistently demonstrates clear and concise written and verbal communication.

Education:
- Bachelor's degree/University degree or equivalent experience.
- Master's degree preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
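Since the role calls for proficiency with data pipeline and workflow management tools, a minimal Apache Airflow DAG illustrates the batch pattern. This is a generic sketch with placeholder task bodies, not Citi's pipeline; note that the schedule argument is the Airflow 2.4+ spelling, while older versions use schedule_interval.

```python
# Minimal Airflow DAG sketch of an extract -> transform -> load batch job.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems")

def transform():
    print("clean and conform the extracted data")

def load():
    print("load the curated data into the warehouse")

with DAG(
    dag_id="daily_ingest",           # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; schedule_interval on older versions
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```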

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Location: Pan India

Job Summary: We are seeking a highly skilled and motivated Data Quality Expert with 5-8 years of hands-on experience in implementing and managing data quality initiatives using tools such as IBM Data Quality, IBM Cloud Pak for Data, or other industry-standard data quality platforms. The ideal candidate will work closely with business and technical teams to ensure data integrity, accuracy, completeness, and alignment with enterprise data governance policies.

Key Responsibilities:
- Collaborate with data stewards, data owners, and business stakeholders to understand data quality requirements and priorities.
- Configure, implement, and maintain data quality rules, profiling, cleansing, and monitoring using tools like IBM Data Quality and IBM Cloud Pak for Data.
- Conduct regular data profiling and root cause analysis to identify data quality issues and define remediation plans.
- Define and enforce data quality metrics, dashboards, and scorecards to track data health.
- Participate in data governance activities, including metadata management, data lineage tracking, and policy enforcement.
- Support the implementation of data quality frameworks and standards across projects and domains.
- Work closely with the Data Governance team to align data quality initiatives with governance policies and regulatory compliance needs.
- Document data quality findings, technical specifications, workflows, and standard operating procedures.
- Recommend improvements to business processes, systems, and data flows to enhance data quality.
- Actively contribute to data quality assessments and audits as needed.

Required Skills & Experience:
- 5 to 8 years of experience in data quality management and implementation.
- Hands-on expertise with one or more of the following tools: IBM Information Analyzer / IBM Data Quality, IBM Cloud Pak for Data, Informatica Data Quality (IDQ), Talend Data Quality; other similar tools are a plus.
- Strong understanding of data profiling, data cleansing, matching, standardization, and data validation techniques.
- Working knowledge of data governance concepts and frameworks (e.g., DAMA DMBOK).
- Good understanding of relational databases, data warehousing, ETL processes, and SQL.
- Familiarity with data catalogs, metadata management, and data lineage tools is desirable.
- Experience working in Agile environments and using tools like Jira, Confluence, or Azure DevOps.
- Strong analytical thinking, problem-solving skills, and attention to detail.
- Effective verbal and written communication skills for collaboration with business and technical teams.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- IBM Certified Data Quality or Cloud Pak for Data certification is a plus.
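Data profiling of the kind described above is usually done inside tools like IBM Information Analyzer, but the underlying idea can be shown in a few lines of pandas. The dataset, column names, and the country-code rule below are hypothetical examples of completeness, uniqueness, and validity checks, not an IBM tool's output.

```python
# Hedged profiling sketch in pandas: completeness, cardinality, and one
# validity rule; the file and columns are hypothetical.
import pandas as pd

df = pd.read_csv("customers.csv")

profile = pd.DataFrame({
    "null_pct": df.isna().mean().round(4) * 100,  # completeness per column
    "distinct": df.nunique(),                     # cardinality per column
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Example validity rule: country codes must be two uppercase letters.
invalid_country = df[~df["country"].fillna("").str.fullmatch(r"[A-Z]{2}")]
print(f"{len(invalid_country)} rows violate the country-code rule")
```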

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Job Summary: We are seeking a skilled and analytical Data Architect & Business Intelligence Specialist to design, model, and implement robust data architectures, pipelines, and reporting frameworks. This role will be responsible for building and maintaining data models, overseeing data migrations, and developing scalable data warehouse solutions to support business intelligence and analytics initiatives.

Key Responsibilities:

1. Data Architecture & Modeling
- Design and maintain the enterprise data architecture aligned with business and technical requirements.
- Develop logical and physical data models using industry best practices.
- Establish and maintain metadata standards and data dictionaries.
- Ensure data consistency, quality, and governance across all systems.

2. Data Pipelines & ETL/ELT
- Design and build efficient and scalable data pipelines for structured and unstructured data.
- Develop ETL/ELT processes using tools like Apache Airflow, Talend, Informatica, or Azure Data Factory.
- Optimize data ingestion, transformation, and loading procedures to support analytics.

3. Data Migration
- Plan and execute data migration projects from legacy systems to modern data platforms.
- Ensure data integrity and minimal downtime during migration activities.
- Collaborate with stakeholders to map old data structures to the new architecture.

4. Data Warehousing
- Design, implement, and manage modern data warehouses (e.g., Snowflake, Redshift, BigQuery, Synapse).
- Ensure high performance, scalability, and security of data warehousing environments.
- Implement data partitioning, indexing, and performance tuning techniques.

5. Business Intelligence & Reporting
- Collaborate with business stakeholders to gather reporting and analytics requirements.
- Build interactive dashboards and reports using tools like Power BI, Tableau, Looker, or Qlik.
- Enable self-service reporting and ensure data accuracy in BI platforms.
- Monitor data usage and performance, and drive continuous improvement in reporting frameworks.

Requirements:

Education & Experience:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience in data architecture, modeling, pipelines, and BI/reporting.

Technical Skills:
- Strong expertise in SQL and data modeling (3NF, dimensional, star/snowflake schemas).
- Experience with data warehouse technologies and cloud platforms (AWS, Azure, GCP).
- Proficiency in BI/reporting tools and data visualization best practices.
- Knowledge of Python, Scala, or other scripting languages is a plus.
- Familiarity with data governance, security, and compliance standards.

Soft Skills:
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills with both technical and non-technical stakeholders.
- Ability to translate complex technical concepts into business language.
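To ground the dimensional-modeling requirement above (star/snowflake schemas), here is a tiny star-schema sketch: one fact table keyed to two dimensions. The names are hypothetical and the DDL is illustrative ANSI-style SQL, executed against SQLite only so the snippet is self-contained; a real warehouse (Snowflake, Redshift, BigQuery, Synapse) would use its own dialect.

```python
# Star-schema sketch: one fact table with foreign keys into two dimensions.
# Hypothetical names; SQLite is used only to make the example runnable.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    full_date TEXT,
    month INTEGER,
    year INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id TEXT,
    region TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date (date_key),
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    quantity INTEGER,
    net_amount NUMERIC
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print("star schema created:", tables)
```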

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Who we are? Searce means 'a fine sieve' and indicates 'to refine, to analyze, to improve'. It signifies our way of working: to improve to the finest degree of excellence, 'solving for better' every time. Searcians are passionate improvers and solvers who love to question the status quo. The primary purpose of all of us at Searce is driving intelligent, impactful and futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, every day.

What are we looking for? Are you a keen learner? Excellent mentor? Passionate coach? We're looking for someone who's all three! We're on the lookout for someone who can design and implement our data processing pipelines for all kinds of data sources.

What you'll do as a Manager - Data Engineering with us:
1. You have worked in environments of different shapes and sizes: on-premise, private cloud, public cloud, hybrid, all Windows / Linux / a healthy mix. Thanks to this experience, you can connect the dots quickly and understand client pain points.
2. You are curious. You keep up with the breakneck speed of innovation on public cloud. When something new gets released or an existing service changes, you try it out and you learn.
3. You have a strong database background, relational and non-relational alike:
a. MySQL, PostgreSQL, SQL Server, Oracle.
b. MongoDB, Cassandra, and other NoSQL databases.
c. Strong SQL query writing experience.
d. HA, DR, performance tuning, migrations.
e. Experience with the cloud offerings: RDS, Aurora, Cloud SQL, Azure SQL.
4. You have hands-on experience with designing, deploying, and migrating enterprise data warehouses and data lakes:
a. Familiarity with migrations from the likes of Netezza, Greenplum, and Oracle to BigQuery/Redshift/Azure Data Warehouse.
b. Dimensional data modelling, reporting and analytics.
c. Designing ETL pipelines.
5. You have experience with advanced analytics: the ability to work with the Applied AI team and assist in delivering predictive analytics, ML models, etc.
6. You have experience with the big data ecosystem:
a. Self-managed Hadoop clusters, distributions like Hortonworks, and the cloud equivalents such as EMR, Dataproc, and HDInsight.
b. Apache Hudi, Hive, Presto, Spark, Flink, Kafka, etc.
7. You have hands-on experience with tools: Apache Airflow, Talend, Tableau, Pandas, Dataflow, Kinesis, Stream Analytics, etc.

What are the must-haves to join us?
1. Is education overrated? Yes, we believe so. However, there is no way to locate you otherwise. So we might have to look for a Bachelor's or Master's degree in engineering from a reputed institute, or you should have been coding since your 6th grade. And the latter is better. We will find you faster if you specify the latter in some manner. :)
2. 8-10+ years of overall IT experience with a strong data engineering and business intelligence background.
3. Minimum 3 years of experience on projects with GCP / AWS / Azure.
4. Minimum 3+ years of experience in data & analytics delivery and management consulting, working with data migration, ETL, business intelligence, data quality, data analytics, and AI tools.
5. 4+ years of hands-on experience with Python and SQL.
6. Experience across data solutions including data lakes, warehousing, ETL, streaming, and reporting and analytics tools.
7. Prior experience in recruitment, training and grooming of geeks.
8. Great to have certifications: a. GCP and/or AWS, professional level; b. your contributions to the community, e.g. tech blogs, Stack Overflow, etc.
9. Strong communication skills to communicate across a diverse audience with varying levels of business and technical expertise.

So, if you are passionate about tech, the future, and what you read above (we really are!), apply here to experience the 'Art of Possible'.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines for efficient data processing.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement ETL processes to extract, transform, and load data.
- Collaborate with cross-functional teams to optimize data solutions.
- Conduct data analysis to identify trends and insights.

Professional & Technical Skills:
- Must have: proficiency in Talend ETL.
- Strong understanding of data integration and ETL processes.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying languages.
- Hands-on experience with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

On-site

The SQL Developer is responsible for designing, developing, and maintaining database systems and complex SQL queries to support business operations, reporting, and analytics. This role involves working with stakeholders to ensure data integrity, performance, and accessibility across multiple systems.

Responsibilities:
- Design, write, and optimize SQL queries, stored procedures, functions, and triggers
- Develop and maintain relational databases and data models
- Perform database tuning, performance monitoring, and optimization
- Create and maintain reports, dashboards, and data visualizations as needed
- Collaborate with software developers, data analysts, and business users to meet data needs
- Ensure data accuracy, consistency, and security across all database systems
- Document database structures, processes, and procedures
- Assist in database upgrades, backups, and recovery processes
- Analyze existing queries for performance improvements
- Support data migration, integration, and ETL (extract, transform, load) processes

Requirements:
- Proficiency in writing complex SQL queries and working with relational databases (e.g., SQL Server, MySQL, PostgreSQL, Oracle)
- Experience with stored procedures, indexing, and query optimization
- Familiarity with database design and data normalization principles
- Knowledge of data warehousing and ETL tools (e.g., SSIS, Talend, Informatica) is a plus
- Understanding of BI tools (e.g., Power BI, Tableau) is beneficial
- Strong analytical and problem-solving skills
- Good communication and teamwork abilities

Posted 3 weeks ago

Apply

0.0 - 5.0 years

8 - 18 Lacs

Chennai, Tamil Nadu

On-site

Senior ETL Developer / Lead Data Engineer

Job Summary
- Experience: 5-8 years
- Hybrid mode
- Full time / Contract
- Chennai
- Immediate joiner
- US shift timings

Job Overview
We are looking for a Senior ETL Developer with strong expertise in Talend, PostgreSQL, AWS, and Linux who can take ownership of projects end-to-end, from design to delivery, lead technical implementation, and mentor team members in ETL, data integration, and cloud data workflows. The ideal candidate will have 5-8 years of hands-on experience in data integration, cloud-based ETL pipelines, data versioning, and automation, should drive technical best practices, and must be ready to work in a hybrid setup from Chennai or Madurai.

Responsibilities:
- Design and implement scalable ETL workflows using Talend and PostgreSQL.
- Handle complex data transformations and integrations across structured/unstructured sources.
- Develop automation scripts using Shell/Python in a Linux environment.
- Build and maintain stable ETL pipelines integrated with AWS services (S3, Glue, RDS, Redshift).
- Ensure data quality, governance, and version control using tools like Git and Quilt.
- Troubleshoot data pipeline issues and optimize for performance.
- Schedule and manage jobs using tools like Apache Airflow, Cron, or Jenkins.
- Mentor team members, review code, and promote technical best practices.
- Drive continuous improvement and training on modern data tools and techniques.

ETL & Integration
- Must have: Talend (Open Studio / DI / Big Data)
- Also good: SSIS, SSRS, SAS
- Bonus: Apache NiFi, Informatica

Databases
- Required: PostgreSQL (3+ years)
- Bonus: Oracle, SQL Server, MySQL

Cloud Platforms
- Required: AWS (S3, Glue, RDS, Redshift)
- Bonus: Azure Data Factory, GCP
- Certifications: AWS / Azure (good to have)

OS & Scripting
- Required: Linux, shell scripting
- Preferred: Python scripting

Data Versioning & Source Control
- Required: Quilt, Git/GitHub/Bitbucket
- Bonus: DVC, LakeFS, Git LFS

Scheduling & Automation
- Apache Airflow, Cron, Jenkins, Talend JobServer

Bonus Tools
- REST APIs, JSON/XML, Spark, Hive, Hadoop

Visualization & Reporting
- Power BI / Tableau (nice to have)

Soft Skills
- Strong verbal and written communication.
- Proven leadership and mentoring capabilities.
- Ability to manage projects independently.
- Comfortable adopting and teaching new tools and methodologies.
- Willingness to work in a hybrid setup from Chennai or Madurai.

Job Types: Full-time, Contractual / Temporary
Pay: ₹800,000.00 - ₹1,800,000.00 per year
Benefits: Flexible schedule
Schedule: Evening shift, Monday to Friday, rotational shift, UK shift, US shift, weekend availability

Application Question(s): Do you have experience in AWS, GCP, Snowflake, or Databricks? If yes, mention your field.
Experience: ETL developer: 5 years (required); Talend/Informatica: 5 years (required)
Location: Chennai, Tamil Nadu (required)
Work Location: In person
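One plausible shape for the "ETL pipelines integrated with AWS services" bullet is a file landing in S3 and being bulk-loaded into PostgreSQL. The sketch below uses boto3 and psycopg2 with hypothetical bucket, key, credential, and table names; it is an illustration, not this team's actual pipeline.

```python
# Hedged sketch of an S3 -> PostgreSQL load step; bucket, key, table,
# and connection settings are hypothetical placeholders.
import os

import boto3
import psycopg2

s3 = boto3.client("s3")
s3.download_file("etl-landing-bucket", "exports/orders.csv", "/tmp/orders.csv")

conn = psycopg2.connect(
    host="db-host",
    dbname="warehouse",
    user="etl",
    password=os.environ["PGPASSWORD"],
)
try:
    # COPY ... FROM STDIN is the fast bulk-load path into PostgreSQL.
    with conn, conn.cursor() as cur, open("/tmp/orders.csv") as f:
        cur.copy_expert("COPY staging.orders FROM STDIN WITH CSV HEADER", f)
finally:
    conn.close()
```

A script like this would typically be scheduled by Airflow, Cron, or Jenkins, matching the scheduling stack listed above.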

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Roles & Responsibilities
- At least 4 years of professional database development (SQL Server or Azure SQL Database) and ETL using Talend and SSIS in OLAP/OLTP environments
- Bachelor's or master's degree in computer science or equivalent preferred
- Experience in continuous delivery environments
- Experience with Agile (Scrum/Kanban) software development methodologies; automated testing and deployment implementation a plus
- Experience deploying and managing in-house/cloud-hosted data solutions
- Experience with large-scale systems involving reporting, transactional systems, and integration with other enterprise systems
- Experience with source/version control systems
- Successful history of working with high-performing technology teams

Technical Skills
- Proficiency with multiple ETL tools, including Talend and SSIS (Azure Data Factory is a bonus)
- Proficiency in SQL development in Microsoft SQL Server (Azure SQL Database is a bonus)
- Experience with SQL query performance optimization
- Experience with industry development standards and their implementation
- Proficiency in system analysis and design
- Analysis and verification of technical requirements for completeness, consistency, feasibility, and testability
- Identification and resolution of technical issues through unit testing, debugging, and investigation
- Version control, including branching and merging, using services like GitHub or Azure DevOps

Experience: 4.5-6 years

Skills
- Primary skill: SQL Development
- Sub skill(s): SQL Development
- Additional skill(s): SQL Development, Oracle PL/SQL Development, PostgreSQL Development, ETL, SQL, Talend

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- The position is based in India and reports directly to the team lead.
- Develop projects from detailed business requirements, work through solutions, and manage the execution and rollout of those solutions in the context of a consistent global platform.
- Create T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017.
- Understand basic data warehousing concepts; design and develop SSIS packages to pull data from various source systems and load it to target tables.
- Develop dashboards and reports using SSRS as required.
- Work on BAU JIRAs and perform L3-support activities when required.
- Provide detailed analysis and documentation of processes and flows where necessary.
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems.
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging.
- Operate with a limited level of direct supervision, exercising independence of judgment and autonomy.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-8 years of overall IT experience, with 2+ years in the financial services industry.
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, and Autosys.
- 2+ years of experience in any ETL tool, preferably SSIS.
- Some knowledge of Python is a differentiator.
- Highly motivated, needing minimal hand-holding, with the ability to multitask and work under pressure.
- Strong analytical and problem-solving skills; able to analyze data for trends and quality checking.
- Good to have: Talend and GitHub knowledge.
- Strong knowledge of database fundamentals and advanced concepts; ability to write efficient SQL and tune existing SQL.
- Experience with a reporting tool (e.g., SSRS, QlikView) is a plus.
- Experience with a job scheduling tool (e.g., Autosys).
- Experience in the finance industry is desired.
- Experience with all phases of the Software Development Life Cycle.
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
Microsoft SQL Server, Microsoft SQL Server Programming, MS SQL Queries.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
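As a hedged illustration of the SSIS/T-SQL duties this posting describes — extract rows from a source, land them in staging, then upsert a target table — here is a minimal Python sketch using pyodbc. The DSN, schema, and table names are invented for the example; a production SSIS package would replace the hand-rolled extract with a data-flow task.

```python
# Hypothetical sketch of a staged load into SQL Server; all object names invented.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Land extracted rows in a staging table first (SSIS would use a data-flow task).
rows = [(1, "EURUSD", 1.0842), (2, "GBPUSD", 1.2719)]  # stand-in for a source extract
cursor.executemany(
    "INSERT INTO stg.Rates (RateId, CurrencyPair, Rate) VALUES (?, ?, ?)", rows
)

# Upsert staging into the target with a single set-based T-SQL MERGE.
cursor.execute("""
    MERGE dbo.Rates AS tgt
    USING stg.Rates AS src ON tgt.RateId = src.RateId
    WHEN MATCHED THEN
        UPDATE SET tgt.CurrencyPair = src.CurrencyPair, tgt.Rate = src.Rate
    WHEN NOT MATCHED THEN
        INSERT (RateId, CurrencyPair, Rate)
        VALUES (src.RateId, src.CurrencyPair, src.Rate);
""")
conn.commit()
conn.close()
```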

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions and background screening and immigration compliance services that help public and private organizations mitigate risk and make informed, cost-effective decisions about their applicants and registrants.

About The Role
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties And Responsibilities
- Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
- Architect and build batch and real-time data streaming solutions using technologies such as Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements.
- Utilize and optimize a wide array of AWS data services.
- Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
- Ensure data quality, integrity, and security across all data pipelines and storage solutions.
- Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
- Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
- Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
- Mentor junior engineers and help foster a culture of technical excellence and continuous improvement.
- Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications
- 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
- Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
- Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have.
- Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (star schema, snowflake schema), and DWH design principles.
- Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Database skills: strong understanding of relational and NoSQL databases.
- Version control: experience with version control systems (e.g., Git).
- Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. (ref:hirist.tech)
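To make the streaming-ingestion requirement concrete, here is a minimal sketch, assuming kafka-python and boto3 are installed, that micro-batches a Kafka topic into S3 — the kind of landing step a Kinesis or Glue streaming job would otherwise manage. The topic, broker, and bucket names are all hypothetical.

```python
# Illustrative only: micro-batch a Kafka topic into S3 as JSON objects.
import json
import time

import boto3
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "applicant-events",                      # hypothetical topic
    bootstrap_servers="broker:9092",         # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                    # flush in micro-batches
        key = f"raw/applicant_events/{int(time.time())}.json"
        s3.put_object(
            Bucket="my-data-lake",           # hypothetical bucket
            Key=key,
            Body=json.dumps(batch).encode("utf-8"),
        )
        batch = []
```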

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Gandhinagar, Gujarat, India

On-site

Key Responsibilities
- Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
- Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
- Architect and optimize data warehouses for scale, performance, and security.
- Perform advanced data analysis and modeling to extract insights and support business decisions.
- Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
- Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
- Develop Power BI dashboards and reports for stakeholders across departments.
- Ensure data quality, integrity, and compliance with data governance and security standards.
- Work with cross-functional teams (product, marketing, ops) to turn data into strategy.

Qualifications:
- PhD in Data Science, Computer Science, Engineering, Mathematics, or a related field.
- 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
- Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Proficient in SQL, Python, and Power BI.
- Familiarity with modern cloud data platforms (AWS/GCP/Azure).
- Strong understanding of data modeling, data governance, and MLOps practices.
- Exceptional ability to translate business needs into scalable data solutions. (ref:hirist.tech)
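A minimal sketch of the kind of batch ETL workflow such a pod would own, assuming Airflow 2.x; the DAG name and task logic are invented placeholders, and a real pipeline would read from production sources and write to the warehouse rather than print.

```python
# Hypothetical three-step ETL DAG: extract -> transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    return [{"customer_id": 1, "spend": 120.5}]  # placeholder extract

def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "spend_band": "high" if r["spend"] > 100 else "low"} for r in rows]

def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse")  # stand-in for a DB write

with DAG(
    dag_id="daily_customer_spend",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t
```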

Posted 3 weeks ago

Apply

3.0 - 6.0 years

8 - 10 Lacs

Bengaluru

Hybrid

Description:
Role: Data Engineer/ETL Developer - Talend/Power BI

Job Description:
1. Study, analyze, and understand business requirements in the context of business intelligence, and provide end-to-end solutions.
2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
3. Load data from heterogeneous sources such as Oracle, MS SQL, file systems, FTP services, and REST APIs.
4. Design and map data models to turn raw data into meaningful insights, and build a data catalog.
5. Develop strong data documentation covering algorithms, parameters, and models.
6. Analyze historical and current data to support better decision-making.
7. Make the technical changes needed to improve existing business intelligence systems.
8. Optimize ETL processes for performance; monitor ETL jobs and troubleshoot issues.
9. Lead and oversee team deliverables; ensure development best practices are followed.
10. Participate in and lead requirements gathering and analysis.

Required Skillset and Experience:
1. Up to 3 years of overall working experience, preferably in SQL and ETL (Talend).
2. 1+ years of experience in Talend Enterprise/Open Studio and related tools such as Talend API, Talend Data Catalog, TMC, and TAC.
3. Understanding of database design and data modeling.
4. Hands-on experience in a programming language (Java, Python, etc.).

Secondary Skillset/Good to have:
1. Experience with a BI tool such as MS Power BI.
2. Ability to build interactive and visually appealing dashboards and reports in Power BI.

Required Personal & Interpersonal Skills:
- Strong analytical skills.
- Good communication skills, both written and verbal.
- Highly motivated and result-oriented.
- Self-driven, independent work ethic that drives internal and external accountability.
- Ability to interpret instructions for executives and technical resources.
- Advanced problem-solving skills for complex distributed applications.
- Experience working in a multicultural environment.

Interested candidates, reach out to akram.m@acesoftlabs.com / 6387195529.
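For illustration only, a hand-rolled Python analogue of a small Talend job (roughly tRESTClient -> tMap -> tDBOutput): pull from a REST endpoint, reshape the payload, and land it in a staging table. The endpoint URL and table are assumptions, and SQLite stands in for the Oracle/MS SQL targets the posting names.

```python
# Hypothetical REST-to-staging load; endpoint and table names invented.
import requests
import sqlite3  # stand-in for an Oracle/MS SQL connection

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
orders = resp.json()

conn = sqlite3.connect("staging.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS stg_orders (order_id INTEGER, amount REAL, status TEXT)"
)
# tMap-style reshaping: keep only the columns the target model needs.
conn.executemany(
    "INSERT INTO stg_orders (order_id, amount, status) VALUES (?, ?, ?)",
    [(o["id"], o["amount"], o["status"]) for o in orders],
)
conn.commit()
conn.close()
```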

Posted 3 weeks ago

Apply

7.0 - 12.0 years

22 - 37 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
- Locations: Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM - 11 PM IST
- Experience: 5 to 12+ years (based on role & grade)

Open Grades/Roles:
- Senior Software Engineer: 5-8 years
- Tech Lead: 7-10 years
- Senior Tech Lead: 10-12+ years

Job Description - Data Engineering Team

Core Responsibilities (Common to All Levels):
- Design, build, and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar.
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB).
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs.
- Participate in data modeling (ER/DW/star schema), data quality checks, and data integration.
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M).
- Ensure code versioning and documentation standards are followed (Git/Bitbucket).

Additional Responsibilities by Grade

Senior Software Engineer (5-8 yrs):
- Focus on hands-on development of ETL pipelines, data models, and data inventory.
- Assist in architecture discussions and POCs.
- Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure.

Tech Lead (7-10 yrs):
- Lead mid-sized data projects and small teams.
- Decide on ETL strategy (push down/push up) and performance tuning.
- Strong working knowledge of orchestration tools, resource management, and agile delivery.

Senior Tech Lead (10-12+ yrs):
- Drive data architecture, infrastructure decisions, and internal framework enhancements.
- Oversee large-scale data ingestion, profiling, and reconciliation across systems.
- Mentor junior leads and own stakeholder delivery end-to-end.
- Advantageous: experience with AdTech/marketing data and the Hadoop ecosystem (Hive, Spark, Sqoop).

Must-Have Skills (All Levels):
- ETL tools: Pentaho / Talend / SSIS / Informatica
- Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
- Orchestration: Airflow / Autosys / Control-M / JAMS
- Modeling: dimensional modeling, ER diagrams
- Scripting: Python or Perl (preferred)
- Agile environment, Git-based version control
- Strong communication and documentation
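As a sketch of the dimensional-modeling work the posting lists, here is a minimal star-schema fact-load step in pandas under invented table and column names: resolve business keys to surrogate keys from a dimension, with unresolved keys routed to a -1 "unknown member" row, a common warehouse convention.

```python
# Hypothetical surrogate-key lookup for a fact load; all names invented.
import pandas as pd

dim_customer = pd.DataFrame(
    {"customer_sk": [10, 11], "customer_bk": ["C001", "C002"]}
)
staged_sales = pd.DataFrame(
    {"customer_bk": ["C001", "C002", "C003"], "amount": [250.0, 99.0, 40.0]}
)

# Left-join staging to the dimension; unmatched business keys get NaN.
fact = staged_sales.merge(dim_customer, on="customer_bk", how="left")
# Route unresolved keys to the -1 "unknown member" dimension row.
fact["customer_sk"] = fact["customer_sk"].fillna(-1).astype(int)
fact = fact[["customer_sk", "amount"]]
print(fact)
```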

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- The position is based in India and reports directly to the team lead.
- Develop projects from detailed business requirements, work through solutions, and manage the execution and rollout of those solutions in the context of a consistent global platform.
- Create T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017.
- Understand basic data warehousing concepts; design and develop SSIS packages to pull data from various source systems and load it to target tables.
- Develop dashboards and reports using SSRS as required.
- Work on BAU JIRAs and perform L3-support activities when required.
- Provide detailed analysis and documentation of processes and flows where necessary.
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems.
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging.
- Operate with a limited level of direct supervision, exercising independence of judgment and autonomy.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-8 years of overall IT experience, with 2+ years in the financial services industry.
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, and Autosys.
- 2+ years of experience in any ETL tool, preferably SSIS.
- Some knowledge of Python is a differentiator.
- Highly motivated, needing minimal hand-holding, with the ability to multitask and work under pressure.
- Strong analytical and problem-solving skills; able to analyze data for trends and quality checking.
- Good to have: Talend and GitHub knowledge.
- Strong knowledge of database fundamentals and advanced concepts; ability to write efficient SQL and tune existing SQL.
- Experience with a reporting tool (e.g., SSRS, QlikView) is a plus.
- Experience with a job scheduling tool (e.g., Autosys).
- Experience in the finance industry is desired.
- Experience with all phases of the Software Development Life Cycle.
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
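On the "write efficient SQL, tune existing SQL" requirement, a hedged example of a typical tuning change: replacing row-by-row cleanup with one set-based T-SQL statement. The table and column names are hypothetical; the CTE-plus-ROW_NUMBER pattern is a standard SQL Server dedup idiom.

```python
# Hypothetical set-based dedup, replacing a cursor-driven delete loop.
DEDUPE_SQL = """
WITH ranked AS (
    SELECT trade_id,
           ROW_NUMBER() OVER (
               PARTITION BY trade_id ORDER BY updated_at DESC
           ) AS rn
    FROM dbo.TradeFeed
)
DELETE FROM ranked WHERE rn > 1;   -- keep only the latest row per trade
"""
# Executed through any SQL Server driver, e.g. pyodbc:
# cursor.execute(DEDUPE_SQL); conn.commit()
```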

Posted 3 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).

Key responsibilities:
- Understand the program's service catalog and document the list of tasks that must be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Use your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable, efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to the data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a senior ETL data engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.
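A minimal sketch of the post-load data-quality controls the role calls for — row-count reconciliation and a mandatory-column check — with table names and thresholds as assumptions; SQLite stands in for the warehouse connection.

```python
# Hypothetical post-load quality controls; table names are invented.
import sqlite3  # stand-in for the warehouse connection

def run_checks(conn: sqlite3.Connection) -> list[str]:
    failures = []
    # Control 1: target row count must match the staging count.
    src = conn.execute("SELECT COUNT(*) FROM stg_customers").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dim_customers").fetchone()[0]
    if src != tgt:
        failures.append(f"row count mismatch: staging={src}, target={tgt}")
    # Control 2: mandatory columns must not contain NULLs.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM dim_customers WHERE customer_bk IS NULL"
    ).fetchone()[0]
    if nulls:
        failures.append(f"{nulls} rows with NULL business key")
    return failures
```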

Posted 3 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas and follow the software development life cycle.
- Facilitate root-cause analysis of system issues and the problem statement.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert them into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and the proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones.
- Ensure that code is error-free, bug-free, and passes testing.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all defects are raised per the norms defined for the project/program/account, with clear descriptions and replication steps.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report them to the concerned stakeholders.
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take regular feedback to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document the necessary details and reports formally so the software is properly understood from client proposal to implementation.
- Ensure good-quality customer interaction with respect to e-mail content, fault-report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no complaints either internally or externally.

Deliver
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Talend DI.
Experience: 5-8 years.
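In the spirit of the "automate processes for software validation by designing and executing test cases" duty above, a small pytest sketch for a transformation rule; the rule itself is an invented example, not part of the posting.

```python
# Hypothetical transformation rule plus its automated test cases (run with pytest).
def normalise_country(code: str) -> str:
    """Map free-form country codes to ISO alpha-2, defaulting to 'XX'."""
    mapping = {"IN": "IN", "IND": "IN", "INDIA": "IN", "US": "US", "USA": "US"}
    return mapping.get(code.strip().upper(), "XX")

def test_known_codes_normalise():
    assert normalise_country(" ind ") == "IN"
    assert normalise_country("usa") == "US"

def test_unknown_code_falls_back():
    assert normalise_country("ZZZ") == "XX"
```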

Posted 3 weeks ago

Apply