3.0 - 8.0 years
2 Lacs
Hyderabad
Work from Office
Key responsibilities:
- Understand the program's service catalog and document the list of tasks to be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable, efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.

Qualified candidates should APPLY NOW for immediate consideration! Please hit APPLY to provide the required information, and we will be back in touch as soon as possible. Thank you!

ABOUT INNOVA SOLUTIONS: Founded in 1998 and headquartered in Atlanta, Georgia, Innova Solutions employs approximately 50,000 professionals worldwide and reports annual revenue approaching $3 billion. Through our global delivery centers across North America, Asia, and Europe, we deliver strategic technology and business transformation solutions to our clients, enabling them to operate as leaders within their fields.

Recent Recognitions:
- One of the largest IT consulting staffing firms in the USA, recognized as #4 by Staffing Industry Analysts (SIA 2022)
- ClearlyRated Client Diamond Award winner (2020)
- One of the largest certified MBE companies in the NMSDC network (2022)
- Advanced Tier Services partner with AWS and Gold partner with Microsoft
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Create solution outlines and macro designs describing end-to-end product implementation on data platforms, covering system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation. Contribute to reusable components/assets/accelerators to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery and product reviews and quality assurance, and act as design authority.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- Experience in designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and in architecting and implementing data platforms.
- Azure Cloud Platform experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred technical and professional experience:
- Experience in architecting complex data platforms on Azure Cloud Platform and on-premises.
- Experience and exposure to implementations of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
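For illustration, a minimal PySpark sketch of the kind of Azure ingestion job this role covers: reading raw files from ADLS Gen2 and writing a curated Delta table. The storage account, container, and column names are assumptions, and a Delta-enabled runtime (e.g., Databricks or Synapse Spark) is assumed:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-curation").getOrCreate()

# Read raw CSVs landed in an ADLS Gen2 container (path is a placeholder)
raw = (spark.read.option("header", True)
       .csv("abfss://raw@mydatalake.dfs.core.windows.net/sales/"))

# Light transformation: type casting and a daily aggregate
daily = (raw.withColumn("amount", F.col("amount").cast("double"))
            .groupBy("order_date")
            .agg(F.sum("amount").alias("total_amount")))

# Serve as a Delta table in the curated zone, e.g. for Synapse or Databricks SQL
(daily.write.mode("overwrite").format("delta")
      .save("abfss://curated@mydatalake.dfs.core.windows.net/sales_daily/"))
```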
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
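As one concrete illustration of the Snowflake optimization skills listed above, a minimal sketch using the Snowflake Python connector; the account details, warehouse, and table names are assumptions:

```python
import snowflake.connector

# Connection parameters are placeholders
conn = snowflake.connector.connect(
    account="xy12345", user="de_user", password="secret",
    warehouse="TUNING_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Define a clustering key on a large, frequently range-filtered fact table
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date, region)")

# Inspect how well the table is clustered on those columns
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date, region)')"
)
print(cur.fetchone()[0])  # JSON with clustering depth/overlap statistics
```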
Posted 1 month ago
8.0 years
0 Lacs
Hyderābād
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Teradata SME

We are seeking a highly experienced and knowledgeable Teradata Subject Matter Expert (SME) to provide deep technical expertise and strategic guidance on our existing Teradata data warehouse environment, with a focus on its integration, migration, and potential modernization within the Google Cloud Platform (GCP). You will be the go-to person for complex Teradata-related challenges, optimization initiatives, and architectural decisions, particularly as they relate to our cloud strategy on GCP. You will collaborate with data engineers, cloud architects, analysts, and business stakeholders to ensure our data landscape effectively leverages both Teradata and GCP capabilities.

Responsibilities:
- Serve as the primary point of contact and expert resource for all Teradata-related technical inquiries and issues, including those related to GCP integration.
- Provide deep technical expertise in Teradata architecture, utilities, performance tuning, and query optimization, with an understanding of how these aspects translate to or interact with GCP services.
- Lead efforts to integrate Teradata with GCP services for data ingestion, processing, and analysis.
- Provide guidance and expertise on potential migration strategies from Teradata to GCP data warehousing solutions like BigQuery.
- Optimize Teradata performance in the context of data pipelines that may involve GCP components.
- Troubleshoot and resolve complex Teradata system and application issues, considering potential interactions with GCP.
- Develop and maintain best practices, standards, and documentation for Teradata development and administration, with a focus on cloud integration scenarios.
- Collaborate with cloud architects and data engineers to design hybrid data solutions leveraging both Teradata and GCP.
- Provide guidance and mentorship to team members on Teradata best practices and techniques within a cloud-focused context.
- Participate in capacity planning and forecasting for the Teradata environment, considering its future within our GCP strategy.
- Evaluate and recommend Teradata upgrades, patches, and new features, assessing their compatibility and value within a GCP ecosystem.
- Ensure adherence to data governance policies and security standards across both Teradata and GCP environments.
- Stay current with the latest Teradata features, trends, and best practices, as well as relevant GCP data warehousing and integration services.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Extensive and deep experience (typically 8+ years) working with Teradata data warehouse systems.
- Expert-level knowledge of Teradata architecture, including MPP concepts, BYNET, and storage management.
- Proven ability to write and optimize complex SQL queries in Teradata.
- Strong experience with Teradata utilities (e.g., BTEQ, FastLoad, MultiLoad, TPump).
- Deep understanding of Teradata performance tuning techniques, including workload management and query optimization.
- Experience with Teradata data modeling principles and best practices.
- Excellent analytical, problem-solving, and troubleshooting skills specific to Teradata environments, with an aptitude for understanding cloud integration.
- Strong communication, collaboration, and interpersonal skills, with the ability to explain complex technical concepts clearly, including those bridging Teradata and GCP.
- Familiarity with Google Cloud Platform (GCP) and its core data services (e.g., BigQuery, Cloud Storage, Dataflow).

Preferred Qualifications / Skills:
- Teradata certifications.
- Google Cloud certifications (e.g., Cloud Architect, Data Engineer).
- Experience with Teradata Viewpoint and other monitoring tools.
- Knowledge of data integration tools (e.g., Informatica, Talend) and their interaction with both Teradata and GCP.
- Experience with workload management and prioritization in Teradata, and how it might be approached in GCP.
- Familiarity with data security concepts and implementation within both Teradata and GCP.
- Experience with migrating data to or from Teradata, especially to GCP.
- Exposure to cloud-based data warehousing solutions (specifically BigQuery) and their architectural differences from Teradata.
- Scripting skills (e.g., Shell, Python) for automation of tasks across both Teradata and GCP.

Why join Genpact?
- Be a transformation leader – work at the cutting edge of AI, automation, and digital innovation
- Make an impact – drive change for global enterprises and solve business challenges that matter
- Accelerate your career – get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way.
Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant | Primary Location: India-Hyderabad | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jun 16, 2025, 11:49:57 PM | Unposting Date: Ongoing | Master Skills List: Digital | Job Category: Full Time
Posted 1 month ago
10.0 - 12.0 years
9 - 9 Lacs
Hyderābād
On-site
Title: Data Integration Developer – Manager
Department: Alpha Data Platform
Reports To: Data Integration Lead, Engineering

Summary: State Street's Global Alpha Data Platform lets you load, enrich, and aggregate investment data. Alpha clients are able to manage multi-asset-class data from any service provider or data vendor for a more holistic and integrated view of their holdings. The platform reflects State Street's years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. Reporting to the Alpha Development delivery manager.
Posted 1 month ago
4.0 years
3 - 6 Lacs
Hyderābād
On-site
About Us:
Location - Hyderabad, India
Department - Product R&D
Level - Professional
Working Pattern - Work from office
Benefits - Benefits at Ideagen
DEI - DEI strategy
Salary - this will be discussed at the next stage of the process; if you do have any questions, please feel free to reach out!

We are seeking a Technical Business Analyst who will play a crucial role in ensuring smooth and efficient data migration and integration between diverse systems with varying architectures, databases, and APIs. This role is primarily responsible for translating complex business requirements into actionable specifications for data engineers to build and implement data pipelines.

Responsibilities:
- Conduct thorough business analysis of source and target systems involved in data migrations and integrations.
- Develop a deep understanding of the functional and technical aspects of both systems, including their operational workflows and data structures.
- Identify and document system modules and their corresponding relationships between the two systems.
- Prepare migration/integration scoping documents that outline system objects to be migrated/integrated.
- Define and document detailed field-to-field data mapping for various objects, specifying how data fields from the source system map to the target system.
- Identify, analyze, and document migration criteria, considerations, limitations, and required data transformations.
- Collaborate with system owners, business stakeholders, and the data operations team to ensure migration requirements are fully captured and aligned with business objectives.
- Work closely with data engineers to facilitate automation of migration/integration processes.
- Support data validation and reconciliation efforts post-migration to ensure data accuracy and integrity.
- Maintain clear and structured documentation to support future migrations and integrations.

The ideal candidate will bridge the gap between business and technical teams, ensuring successful and seamless data transfers.

Competencies, Characteristics & Traits:
Mandatory Experience:
- Minimum 4 years of experience in preparing specifications and in liaising on data engineering and data migration projects
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines
- Good knowledge of data migration and engineering processes and concepts
- Proficiency in SQL and data analysis tools
- Understanding of cloud and on-premises database technologies and application services
- Experience with agile project practices
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders
- Critical thinking and collaboration skills
- Ability to analyze complex data issues, identify root causes, and propose solutions

Skills and Experience:
Essential:
- Experience liaising on data engineering and data migration projects
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines
- Proven experience working with relational databases (e.g., SQL Server, Oracle, MySQL), data structures, and APIs
- Good knowledge of data migration and engineering processes and concepts
- Experience with data modeling documentation and related tools
- Proficiency in SQL and data analysis tools
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders

Desirable:
- Understanding of cloud and on-premises database technologies and application services
- Experience with migration tools such as SnapLogic, Talend, Informatica, Fivetran, or similar
- Industry-specific knowledge in Audit, Healthcare, and Aviation is a plus
- Experience with agile project practices
- Business Analysis certifications (CBAP, CCBA, PMI-PBA) are a plus

About Ideagen: Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next? If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
Posted 1 month ago
8.0 years
20 - 28 Lacs
Gurgaon
On-site
Job Title: Tableau Developer
Location: Gurgaon (Work from Office)
Job Type: Full-Time Role
Experience Level: 8-12 Years

Job Summary: We are seeking a talented Tableau Developer to join our Business Intelligence and Analytics team. The ideal candidate will be responsible for designing, developing, and maintaining visually compelling and insightful dashboards and reports using Tableau. You will work closely with business stakeholders to understand requirements, translate data into actionable insights, and support data-driven decision-making.

Key Responsibilities:
- Design and develop interactive Tableau dashboards, visualizations, and reports based on business needs.
- Collaborate with business analysts, data engineers, and stakeholders to gather requirements and define KPIs.
- Optimize dashboard performance and usability.
- Write complex SQL queries to extract and transform data from various sources (e.g., SQL Server, Oracle, Snowflake).
- Conduct data validation and ensure data quality and accuracy.
- Schedule and publish dashboards to Tableau Server / Tableau Online for end-user access.
- Provide training, documentation, and support to business users.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Statistics, or a related field.
- 8-12 years of hands-on experience with Tableau Desktop and Tableau Server.
- Proficiency in SQL for data manipulation and analysis.
- Strong understanding of data warehousing concepts and relational databases.
- Ability to analyze large datasets and turn them into meaningful visual insights.
- Experience with data blending, LOD (Level of Detail) expressions, filters, parameters, and calculated fields in Tableau.

Preferred Qualifications:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of ETL tools (e.g., Alteryx, Talend, Informatica) or scripting languages (Python, R).
- Understanding of data governance and security principles.
- Tableau certification (Desktop Specialist, Certified Associate, etc.) is a plus.
- Exposure to Agile methodologies.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,800,000.00 per year
Work Location: In person
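By way of illustration, a minimal sketch of the "publish to Tableau Server" responsibility above, using the tableauserverclient library; the server URL, credentials, site, project ID, and workbook file are all assumptions:

```python
import tableauserverclient as TSC

# Sign in to Tableau Server (URL, credentials, and site are placeholders)
auth = TSC.TableauAuth("bi_user", "secret", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Publish (or overwrite) a packaged workbook into a target project
    workbook = TSC.WorkbookItem(project_id="1f2e3d4c-placeholder")
    workbook = server.workbooks.publish(
        workbook,
        "sales_dashboard.twbx",
        mode=TSC.Server.PublishMode.Overwrite,
    )
    print(f"Published workbook: {workbook.name} (id={workbook.id})")
```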
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Bhubaneshwar
On-site
Position: Data Migration Engineer (NV46FCT RM 3324)

Required Qualifications:
- 4-6 years of experience in data migration, data integration, and ETL development.
- Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Experience in Google BigQuery for data ingestion, transformation, and performance optimization.
- Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic.
- Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue.
- Experience working in cloud environments such as AWS, GCP, or Azure.
- Solid understanding of data modeling, schema design, and transformation best practices.

Preferred Qualifications:
- Experience in BigQuery optimization, federated queries, and integration with external data sources.
- Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery.
- Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow.
- Familiarity with workflow orchestration tools such as Apache Airflow or dbt.
- Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios.

Soft Skills:
- Strong problem-solving and analytical mindset with high attention to data quality.
- Excellent communication and collaboration skills to work with engineering and client teams.
- Ability to handle complex migrations under tight deadlines with minimal supervision.

Job Category: Digital_Cloud_Web Technologies | Job Type: Full Time | Job Location: Bhubaneshwar, Noida | Experience: 4-6 years | Notice period: 0-30 days
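As a flavour of the BigQuery ingestion work described above, a minimal Python sketch using the google-cloud-bigquery client; the project, bucket, dataset, and table names are assumptions:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project ID

# Load a CSV landed in Cloud Storage into a BigQuery table,
# letting BigQuery infer the schema and replacing existing rows.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/customers.csv",
    "my-project.staging.customers",
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure

table = client.get_table("my-project.staging.customers")
print(f"Loaded {table.num_rows} rows")
```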
Posted 1 month ago
5.0 years
0 Lacs
Orissa
Remote
No. of Positions: 1
Position: Lead Data Engineer
Location: Hybrid or Remote
Total Years of Experience: 5+ years

Key Responsibilities:
- Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms like Azure, Salesforce, and AWS technologies.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
- Discover efficiencies in shared data processes and batch schedules to help eliminate redundancy and ensure smooth operations.
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
- Hands-on development and implementation of large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
- This job has no supervisory responsibilities.
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work.
- 5+ years' experience with strong SQL query/development skills.
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Experience working in the healthcare industry with PHI/PII.
- Creative, lateral, and critical thinker. Excellent communicator with well-developed interpersonal skills.
- Good at prioritizing tasks and time management.
- Ability to describe, create, and implement new solutions.
- Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
Posted 1 month ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction: Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate experts in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role and Responsibilities: As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you. To help achieve this win-win outcome, a "day in the life" of this opportunity may include, but not be limited to:
- Solving client challenges effectively: understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
- Agile planning and execution: creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations.
- Technical solution workshops: conducting and participating in technical solution workshops.
- Building effective relationships: developing successful relationships at all levels—from engineers to CxOs—with experience of navigating challenging debate to reach healthy resolutions.
- Self-motivated problem solver: demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
- Collaboration and communication: strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Preferred education: Bachelor's Degree.

Required technical and professional expertise:
- In-depth knowledge of the IBM Data & AI portfolio.
- 15+ years of experience in software services.
- 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms.
- Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines.
- 10+ years' experience with ETL and database technologies.
- Experience in architectural planning and implementation for the upgrade/migration of these specific products.
- Experience in designing and implementing Data Quality solutions.
- Experience with installation and administration of these products.
- Excellent understanding of cloud concepts and infrastructure.
- Excellent verbal and written communication skills are essential.

Preferred technical and professional experience:
- Experience with any of the DataStage, Informatica, SAS, or Talend products.
- Experience with any of IKC, IGC, or Axon.
- Experience with programming languages like Java/Python.
- Experience with the AWS, Azure, Google, or IBM cloud platforms.
- Experience with Red Hat OpenShift.
- Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title: SQL Database Administrator
Location: Nippon Q1 Business Centre, Kochi
Experience: 4-6 Years
Type: Full-time

About Us: Expericia Technologies is a fast-growing IT services company specializing in enterprise applications, custom software development, SharePoint, .NET, Azure, React, and more. We're passionate about technology, innovation, and building solutions that solve real-world problems. Join us to work on exciting projects, learn directly from industry veterans, and grow your career the right way.

About the Role: We are looking for an experienced SQL Developer/Administrator with 4-6 years of expertise in database management, performance optimization, and ETL processes. The ideal candidate will handle database administration tasks, optimize performance, and support analytics by developing ETL processes for data transformation and reporting.

Key Responsibilities:
· Design, develop, and optimize SQL queries, stored procedures, and reports.
· Perform data analysis and support decision-making with accurate, efficient reports.
· Collaborate with business teams to provide tailored database solutions.
· Optimize SQL queries and database performance, including troubleshooting and tuning.
· Administer and manage SQL Server databases, ensuring availability, security, and data integrity.
· Implement and manage ETL processes for data extraction, transformation, and loading.
· Develop and maintain dashboards and reporting solutions using SQL and ETL tools.
· Ensure data quality and troubleshoot any ETL-related issues.
· Support database migrations, upgrades, and high-availability configurations.

Skills and Qualifications:
· 4-6 years of experience in SQL development and administration, with a focus on ETL processes.
· Strong expertise in T-SQL, SQL Server, and ETL tools (e.g., SSIS, Talend).
· Proficiency in database performance tuning, query optimization, and backup/recovery strategies.
· Strong problem-solving and analytical skills.
· Bachelor's degree in Computer Science or a related field.

Preferred Qualifications:
· Experience with cloud migration and data warehousing solutions.
· Experience with cloud platforms (AWS RDS, Azure SQL).
· Familiarity with high-availability configurations and data integration.
Posted 1 month ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Combine interface design concepts with digital design, and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.

Convert jobs from Talend ETL to Python, and convert Lead SQLs to Snowflake. Developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.
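As a rough illustration of this kind of Talend-to-Python conversion, a minimal Pandas-to-Snowflake sketch; the file path, connection parameters, and table name are assumptions:

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read the flat file the Talend job used to consume (path is assumed)
df = pd.read_csv("leads_extract.csv")

# Transform: the kind of cleanup a tMap component might have done
df["email"] = df["email"].str.strip().str.lower()
df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)

# Load: push the frame into a Snowflake table (credentials are placeholders)
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="secret",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
success, _, nrows, _ = write_pandas(conn, df, "LEADS")
print(f"Loaded {nrows} rows, success={success}")
```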
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute

You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models/algorithms to uncover signals/patterns and trends to drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring

A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to analytics leaders
- Understanding of best-in-class analytics practices
- Knowledge of key performance indicators (KPIs) and scorecards
- Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus

In This Role

As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices to remain current in the field.

Technical Requirements:
- Programming: Python, PySpark, Go/Java
- Database: SQL, PL/SQL
- ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab-Initio, Fivetran
- Data Warehousing: SCD, Schema Types, Data Mart
- Visualization: Databricks Notebook, PowerBI, Tableau, Looker
- GCP Cloud Services: BigQuery, GCS, Cloud Function, PubSub, Dataflow, DataProc, Dataplex
- AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
- Supporting Technologies: Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow
- Experience with the RGM.ai product would be an added advantage.

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyse data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
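To give a flavour of the data quality and validation responsibility above, a minimal Pandas sketch; the file path and column names are assumptions:

```python
import pandas as pd

# A handful of illustrative data quality checks before loading to the warehouse
df = pd.read_parquet("staging/orders.parquet")  # assumed path

checks = {
    "order_id is never null": df["order_id"].notna().all(),
    "order_id is unique": df["order_id"].is_unique,
    "amount is non-negative": (df["amount"] >= 0).all(),
    "order_date parses cleanly": pd.to_datetime(df["order_date"], errors="coerce").notna().all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Fail the pipeline run rather than loading bad data downstream
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed")
```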
Within Country Relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary: At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular | Analytics & Modelling | Analytics & Data Science
Posted 1 month ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
To lead the India-based team responsible for driving foundational data capabilities, supporting enterprise data governance, and ensuring high-quality data assets across Orange Business. This role is critical in operationalizing our data strategy and supporting data enablement across our business domains.

- Data Management Operations: Ensure operational execution of core data quality management activities, including metadata management, data lineage, data observability, data quality monitoring, and data cataloging.
- Team Leadership: Lead and develop a team of data quality experts in charge of data quality remediation, monitoring, and performance. Provide mentorship, guidance, and performance management; support recruitment and keep turnover low.
- Collaboration and Support: Partner closely with data owners, data stewards, and business units to support the implementation of data governance policies, data quality rules, and standards.
- Tools & Technologies: Administer and optimize enterprise data management tools (e.g., Collibra, Informatica, AccelData, or equivalent). Ensure proper onboarding, user training, and operational support.
- Governance Alignment: Ensure alignment with the Orange Business Data Governance framework, providing execution support to Data Councils, Domain Owners, and Data Protection teams.
- Reporting & Metrics: Develop KPIs and dashboards to measure progress on data management maturity, quality improvement, and usage of data assets.
- Innovation & Best Practices: Promote continuous improvement, automation, and adoption of best practices in data management processes and tooling.
- People Development: Support the team's skill development and ensure knowledge sharing within the team.

Knowledge and abilities:
- Ability to understand the complexities of the Orange Business data management landscape.
- Strong understanding of data governance principles and regulatory frameworks (GDPR, etc.).
- Agile way of working with a can-do approach.
- Expertise in metadata management, data cataloging, data quality, and master data processes would be a plus.
- Hands-on experience with enterprise data governance or data management platforms (Collibra, Informatica, Talend, Atlan, etc.).
- Excellent communication and stakeholder management skills.

Education, qualifications, and certifications: Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field. Other professional certifications such as SCRUM, ITIL, PMP, or SAFe will be an advantage.

Experience: A minimum of 8 years' experience in data management, with at least 3 years in team leadership or managerial roles. Experience working in a global matrix environment is a strong plus. Knowledge of the telecom or B2B services sector is desirable.
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring for one of the Big 4 IT consulting firms.

Designation: Associate/Associate Consultant
Location: Chennai/Gurgaon/Pune

Required skills:
- AWS (Big Data services) - S3, Glue, Athena, EMR
- Programming - Python, Spark, SQL, MuleSoft, Talend, dbt
- Data warehouse - ETL, Redshift / Snowflake

Key Responsibilities:
- Work with business stakeholders to understand their business needs.
- Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
- Clean, filter, and validate data to ensure it meets quality and format standards.
- Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
- Monitor, control, configure, and maintain processes in the cloud data platform.
- Optimize data pipelines and data storage for performance and efficiency.
- Participate in code reviews and provide meaningful feedback to other team members.
- Provide technical support and troubleshoot issues.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- Experience working on the AWS cloud platform.
- Data engineer with expertise in developing big data and data warehouse platforms.
- Experience working with structured and semi-structured data.
- Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques.
- Experience working directly with technical and business teams.
- Able to create technical documentation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Skillset (good to have):
- Experience in data modeling.
- AWS certification for Data Engineer skills.
- Experience with ITSM processes/tools such as ServiceNow and Jira.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
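For a taste of the AWS side of this stack, a minimal boto3 sketch that runs an Athena query over data cataloged in Glue; the region, database, query, and results bucket are assumptions:

```python
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")  # assumed region

# Kick off a query against a Glue-cataloged table (names are placeholders)
resp = athena.start_query_execution(
    QueryString="SELECT region, SUM(amount) AS total FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query finishes; results land in the S3 output location
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

print(f"Query {query_id} finished with state {state}")
```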
Posted 1 month ago
15.0 years
0 Lacs
Greater Hyderabad Area
On-site
HCL Job Level: DGM - Data Management (Centre of Excellence)
Domain: Multi Tower
Role: Centre of Excellence (Data Management)
Role Location: Hyderabad (Noida or Chennai as secondary locations)
Positions: 1
Experience: 15+ years

Job Profile:
- Support the Global Shared Services strategy for multi-tower Finance (P2P, O2C, R2R, and FP&A) and Procurement tracks.
- Understand all processes in detail, including their interdependence, the current technology landscape, and organization structure.
- Ensure end-to-end data lifecycle management, including ingestion, transformation, storage, and consumption, while maintaining data reliability, accuracy, and availability across enterprise systems, with a strong focus on the Enterprise Data Platform (EDP) as the central data repository.
- Collaborate with cross-functional teams to understand data requirements, identify gaps, and implement scalable solutions.
- Define and enforce data quality standards, validation rules, and monitoring mechanisms, while leading the architecture and deployment of scalable, fault-tolerant, and high-performance data pipelines to ensure consistent and trustworthy data delivery.
- Partner with IT and business teams to define and implement data access controls, ensuring compliance with data privacy and security regulations (e.g., GDPR, HIPAA).
- Understand governance and interaction models with client SMEs and drive discussions on project deliverables.
- Collaborate with business stakeholders to define data SLAs (Service Level Agreements) and ensure adherence through proactive monitoring and alerting.
- Act as a bridge between business and IT, translating business needs into technical solutions and ensuring alignment with strategic goals.
- Establish and maintain metadata management practices, including data lineage, cataloging, and business glossary development.
- Propose feasible solutions, both interim and long-term, to resolve problem statements and address key priorities. Solutioning must be at a strategic level and at L2/L3 level.
- Drive alignment of processes, people, technology, and best practices, enabling optimization, breaking silos, eliminating redundant methods, and standardizing processes and controls across the entire engagement for data management.
- Identify process variations across regions and businesses and evaluate standardization opportunities by defining the golden processes for data collection and data management.

Required Profile/Experience:
- Deep understanding of all Finance towers and Procurement.
- Strong understanding of data management principles, data architecture, and data governance.
- Understanding of and hands-on experience with data integration tools, ETL/ELT processes, and cloud-based data platforms.
- A proven track record in managing tool integrations and ensuring accurate, high-performance data flow, with strong expertise in data quality frameworks, monitoring tools, and performance optimization techniques, and a solid foundation in data modeling, metadata management, and master data management (MDM) concepts.
- Leadership capability: relevant leadership experience in running large delivery operations and driving multiple enterprise-level initiatives and programs with high business impact.
- BPO experience: relevant experience in BPO services, especially in the Americas.
- Transformation: should have led and delivered at least 2-3 data transformation projects covering application integrations and master data management.
- Tools and industry benchmarks: knowledge of industry-wide trends in F&A tools, platforms, and benchmarks (Azure Data Lake, AWS, GCP).
- Customer-facing skills: proficient in leading meetings and presentations with customers using polished product-level material.

Education Requirement: B.E./B.Tech/MCA or equivalent in Computer Science, Information Systems, or a related field. Certifications in data management tools or platforms (e.g., Informatica, Talend, Azure Data Engineer, etc.) are preferred.
Posted 1 month ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Infosys BPM Ltd.

Exclusive Women's Walk-in Drive

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention the Candidate ID on top of your resume: https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details:
Interview Date: 20th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find the job descriptions below for your reference. Work from office; rotational shifts; a minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional testing and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the Infra domain; a technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
- Experience writing complex SQL queries and using SQL tools is a must; exposure to various data analytical functions.
- Familiarity with defect tracking tools (e.g., Jira).
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Experience with Python or other scripting languages for test automation is a plus.
- Experience with data quality tools is a plus.
- Experience in testing large datasets.
- Experience in agile development is a must.
- Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
- Strong experience in ETL testing and automation.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
- Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
- Hands-on experience in developing and maintaining test automation frameworks.
- Proficiency in at least one programming language (e.g., Python, Java).
- Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
- Strong understanding of data warehousing concepts and methodologies.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
- Experience with data quality tools is a plus.
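To illustrate the kind of ETL test automation described above, a minimal pytest sketch that reconciles a source table against its loaded target; the connection strings, table, and column names are assumptions, and the Snowflake target assumes the snowflake-sqlalchemy package:

```python
import pytest
import sqlalchemy as sa

# Connection strings are placeholders
SOURCE_URL = "oracle+oracledb://etl_user:secret@src-host:1521/?service_name=ORCL"
TARGET_URL = "snowflake://etl_user:secret@xy12345/ANALYTICS/STAGING?warehouse=QA_WH"

@pytest.fixture(scope="module")
def connections():
    src = sa.create_engine(SOURCE_URL).connect()
    tgt = sa.create_engine(TARGET_URL).connect()
    yield src, tgt
    src.close()
    tgt.close()

def test_row_counts_match(connections):
    # The most basic reconciliation: no rows lost or duplicated in the load
    src, tgt = connections
    src_count = src.execute(sa.text("SELECT COUNT(*) FROM orders")).scalar()
    tgt_count = tgt.execute(sa.text("SELECT COUNT(*) FROM ORDERS")).scalar()
    assert src_count == tgt_count

def test_no_null_business_keys(connections):
    # Transformed data should never land with a missing business key
    _, tgt = connections
    nulls = tgt.execute(
        sa.text("SELECT COUNT(*) FROM ORDERS WHERE ORDER_ID IS NULL")
    ).scalar()
    assert nulls == 0
```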
Job Description: .NET
Should have worked on a .NET development/implementation/support project.
Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
Must have experience in web services, Web API, REST services, HTML, and CSS3.
Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration & assessment will not be allowed into the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared with the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension, and WriteX (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top-right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: Walk-in.

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry: please have a note of your Candidate ID & AMCAT ID along with the registered email ID.
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/Bluetooth for the interview.

Pointers to note:
An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Test Engineer
Location: Hyderabad (Onsite)
Experience Required: 5 Years

Job Description: We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines. The ideal candidate should have a solid background in SAS programming, data validation, and test automation within enterprise data environments.

Key Responsibilities:
Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance.
Write and execute test cases/scripts using Base SAS, Macros, and SQL.
Perform SQL query validation and data reconciliation using industry-standard practices.
Validate ETL pipelines developed using tools like Talend, IBM Data Replicator, and Qlik Replicate.
Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms.
Utilize test automation frameworks using Selenium, Python, or shell scripting to increase test coverage and reduce manual effort.
Identify, document, and track bugs through resolution, ensuring high-quality deliverables.

Required Skills:
Strong experience in SAS programming (Base SAS, Macro).
Expertise in writing and validating SQL queries.
Working knowledge of data testing frameworks and reconciliation tools.
Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, and Qlik Replicate.
Proficiency in test automation using Selenium, Python, or shell scripts.
Solid understanding of data pipelines and data integration testing practices.
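As a hedged illustration of the data reconciliation work this posting describes: a minimal pandas sketch that compares a source extract (for example, one exported from SAS) against a target extract produced by the pipeline. The file paths and the customer_id key column are hypothetical.

```python
# A hedged, minimal sketch of dataset reconciliation with pandas; the
# file paths and the customer_id key column are hypothetical.
import pandas as pd
from pandas.testing import assert_frame_equal

# Extracts from the source system (e.g., exported from SAS) and from the
# target pipeline, normalised to the same row order before comparison.
source = pd.read_csv("source_extract.csv").sort_values("customer_id").reset_index(drop=True)
target = pd.read_csv("target_extract.csv").sort_values("customer_id").reset_index(drop=True)

# Raises with a detailed cell-level diff if the two frames disagree
# beyond a small floating-point tolerance.
assert_frame_equal(source, target, check_dtype=False, rtol=1e-6)
print("Reconciliation passed: source and target extracts match.")
```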
Posted 1 month ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: We are looking for a skilled ETL Tester with hands-on experience in validating data pipelines and data transformations in an AWS-based ecosystem. The ideal candidate should have a strong background in ETL testing, a solid understanding of data warehousing concepts, and proficiency with AWS tools and services such as S3, Redshift, Glue, Athena, and Lambda.

Key Responsibilities:
Design and execute ETL test cases for data ingestion, transformation, and loading processes.
Perform data validation and reconciliation across source systems, staging, and target layers (e.g., S3, Redshift, RDS).
Understand data mappings and business rules; write SQL queries to validate transformation logic.
Conduct end-to-end testing, including functional, regression, and performance testing of ETL jobs.
Work closely with developers, data engineers, and business analysts to identify and troubleshoot defects.
Validate data pipelines orchestrated through AWS Glue, Step Functions, and Lambda functions.
Utilize Athena and Redshift Spectrum for testing data stored in S3.
Collaborate using tools like JIRA, Confluence, Git, and CI/CD pipelines.
Prepare detailed test documentation, including test plans, test cases, and test summary reports.

Required Skills:
3–4 years of experience in ETL/data warehouse testing.
Strong SQL skills for data validation across large datasets.
Working knowledge of AWS services such as S3, Redshift, Glue, Athena, Lambda, and CloudWatch.
Experience testing batch and streaming data pipelines.
Familiarity with Python or PySpark is a plus for data transformation or test automation.
Experience using ETL tools (e.g., Informatica, Talend, or AWS Glue ETL scripts).
Knowledge of Agile/Scrum methodology.
Understanding of data quality frameworks and test automation practices.

Good to Have:
Exposure to BI tools like QuickSight, Tableau, or Power BI.
Basic understanding of data lake and data lakehouse architectures.
Experience working with JSON, Parquet, and other semi-structured data formats.
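To make the Athena-based validation responsibility concrete, here is a hedged sketch that runs a data-quality query against S3-resident data through the standard boto3 Athena client (start_query_execution, get_query_execution, get_query_results). The database name, results bucket, and orders table are hypothetical placeholders.

```python
# Minimal sketch of validating S3-resident data through Athena with boto3.
# The database, results bucket, and orders table are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

def run_query(sql: str):
    """Start an Athena query, poll until it finishes, and return its rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "raw_zone"},                     # hypothetical
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # hypothetical
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)
    assert state == "SUCCEEDED", f"Athena query ended in state {state}"
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

# Example check: no rows in the landing table violate a NOT NULL business rule.
rows = run_query("SELECT COUNT(*) FROM orders WHERE order_id IS NULL")
null_count = int(rows[1]["Data"][0]["VarCharValue"])  # rows[0] is the header row
assert null_count == 0, f"{null_count} rows are missing order_id"
```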
Posted 1 month ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Infosys BPM Ltd.

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 18th & 19th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of your resume: https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview date: 18th & 19th June 2025
Interview time: 10 AM till 1 PM
Interview venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Work from office. Rotational shifts. A minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
Develop high-quality technical documents, including user manuals, guides, and release notes.
Collaborate with cross-functional teams to gather requirements and create accurate documentation.
Conduct functional testing and manual testing to ensure compliance with FDA regulations.
Ensure adherence to ISO standards and maintain a clean, organized document management system.
Strong understanding of the Infra domain.
A technical writer who can convert complex technical concepts into easy-to-consume documents for the target audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems, and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
Familiarity with defect tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.
Job Description: .NET
Should have worked on a .NET development/implementation/support project.
Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
Must have experience in web services, Web API, REST services, HTML, and CSS3.
Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration & assessment will not be allowed into the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared with the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension, and WriteX (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top-right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: Walk-in.

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry: please have a note of your Candidate ID & AMCAT ID along with the registered email ID.
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/Bluetooth for the interview.

Pointers to note:
An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Proficiency in Talend ETL development and integration with Snowflake.
Hands-on experience with IBM Data Replicator and Qlik Replicate.
Strong knowledge of Snowflake database architecture and Type 2 SCD modeling.
Expertise in containerized DB2, DB2, Oracle, and Hadoop data sources.
Understanding of Change Data Capture (CDC) processes and real-time data replication patterns.
Experience with SQL, Python, or shell scripting for data transformations and automation.

Tools/Skills: Talend, IBM Data Replicator, Qlik Replicate, SQL, Python
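As a hedged sketch of the Type 2 SCD pattern this posting references, the following uses the Snowflake Python connector to close out changed dimension rows and insert fresh current versions. The connection parameters, table names, and columns are hypothetical; in practice a CDC feed (e.g., from Qlik Replicate or IBM Data Replicator) would populate the staging table.

```python
# Hedged sketch of a two-step Type 2 SCD load in Snowflake via the
# Python connector; connection details, tables, and columns are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # hypothetical
    warehouse="ETL_WH", database="DW", schema="DIM",
)
cur = conn.cursor()

# Step 1: close out current rows whose tracked attributes have changed.
cur.execute("""
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.address <> s.address OR d.segment <> s.segment) THEN
      UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
""")

# Step 2: insert a fresh current version for new keys and for keys whose
# previous version was just closed (they no longer have a current row).
cur.execute("""
    INSERT INTO dim_customer
      (customer_id, address, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.address, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
""")
conn.commit()
conn.close()
```

The two-step form keeps full history: the old row survives with a closed valid_to window, while queries filtering on is_current = TRUE always see exactly one version per key.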
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience Level: 5–8 years in testing SAS applications and data pipelines.
Proficiency in SAS programming (Base SAS, Macro, SQL) and SQL query validation.
Experience with data testing frameworks and tools for data validation and reconciliation.
Knowledge of Snowflake and explicit pass-through SQL for data integration testing.
Familiarity with Talend, IBM Data Replicator, and Qlik Replicate for ETL pipeline validation.
Hands-on experience with test automation tools (e.g., Selenium, Python, or shell scripts).
Posted 1 month ago
0.0 - 7.0 years
0 Lacs
Karnataka
Remote
Role: Senior Analyst
Experience: 2 to 7 years
Location: Remote, Bengaluru

Job Description: We are looking for a Data Analytics and Visualization Engineer who can help our team in the area of business intelligence. The candidate will deliver solutions that drive desired business outcomes using KPIs, scorecards, metrics, and dashboards. The ideal candidate has a strong software background, thrives in an entrepreneurial atmosphere, communicates efficiently, and can effectively architect, design, and develop enterprise-level dashboards and data warehouse solutions.

Job Responsibilities:
Work closely with stakeholders to understand requirements and apply analytics and visualization to achieve business objectives.
Engineer end-to-end solutions, including data acquisition, data modeling, table creation, and building dashboards and reports.
Design and develop Tableau reports and dashboards that yield actionable insights and answer business questions.
Code and modify SQL/ETL based on dashboard requirements.
Analyze unstructured and semi-structured data and derive insights.
Run ad-hoc analyses for product and business managers using standard query languages and operationalize them for repeatable use via the Tableau reporting suite.
Ensure designs support corporate IT strategy, established technical standards, and industry best practices.
Provide technical guidance to projects; develop and present design artifacts to business and development teams.
Identify project issues and risks and present alternatives to alleviate or resolve them.

Core Competencies:
Strong quantitative and analytical skills: the ability to quickly analyze data to identify key insights and apply them to the business.
Strong visualization design and development experience with Tableau (and other business intelligence tools like Power BI).
Experience leading the analysis, architecture, design, and development of business intelligence solutions using next-generation data platforms.
Experience developing test strategies for data-centric applications, working in Agile methodologies, and diagnosing complex technical issues.
Strong understanding of architectural standards and software development methodologies; expertise in industry best practices in data architecture and design.
Excellent communication skills, including the ability to present effectively to both business and technical audiences at all levels of the organization.

Who You Are: Qualifications
Bachelor's degree in Engineering, Computer Science, or a related field.
6+ years of experience using business intelligence reporting tools, developing data visualizations, and mastery of Tableau for the creation and automation of enterprise-scale dashboards.
6+ years of experience writing advanced SQL, performance tuning of BI queries, data modeling, and data mining from multiple sources (SQL, ETL, data warehousing).
Experience performance-tuning Tableau Server dashboards to minimize rendering time.
Experience with data preparation/blending and ETL tools such as Alteryx, Talend, or Tableau Prep.
Good knowledge of Tableau metadata tables and PostgreSQL for Tableau Server reporting.
Experience in any one programming language such as Python or R would be a plus.
Exposure to Snowflake or any cloud data warehouse architecture would be an added advantage.
Possess a strong foundation in data analytics, with an understanding of and exposure to data science.
Strong knowledge of data visualization and data warehouse best practices.
Certification in Tableau, Alteryx, or Snowflake would be a plus.

Job Snapshot
Updated Date: 17-06-2025
Job ID: J_3750
Location: Remote, Karnataka, India
Experience: 3 - 7 Years
Employee Type: Permanent
Posted 1 month ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Key Responsibilities
Develop and execute test cases for ETL processes and data migration across large datasets.
Perform data validation and validate source-to-target data mappings using advanced SQL.
Collaborate with developers, business analysts (BAs), and quality assurance (QA) teams to ensure data quality and integrity.
Report and track defects, ensuring timely resolution to maintain data accuracy and quality.
Automate and optimize data workflows and ETL pipelines.
Monitor the performance of data pipelines and troubleshoot data issues as needed.
Maintain detailed documentation for data processes, workflows, and system architecture.
Ensure data quality, integrity, and security across all systems.

Skills & Qualifications:
Experience in ETL/data warehouse testing or a similar role.
Strong proficiency in SQL with a solid understanding of database concepts.
Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
Experience with data warehousing platforms (e.g., Snowflake) and performance tuning.
Experience in defect tracking and issue management using tools like JIRA.
Familiarity with version control systems (e.g., Git) and CI/CD practices.
Good communication, collaboration, and documentation skills.
Solid understanding of data warehousing principles and ETL process design.
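To make the data-quality responsibilities above concrete, here is a minimal, hedged sketch that runs a set of named SQL checks against a warehouse and fails if any return offending rows. The warehouse URL, tables, and columns are hypothetical placeholders.

```python
# Minimal sketch of automated data-quality checks of the kind described
# above; the warehouse URL and table/column names are hypothetical.
from sqlalchemy import create_engine, text

engine = create_engine("snowflake://user:pwd@my_account/DW/PUBLIC")  # hypothetical

# Each query returns the number of offending rows; zero means healthy.
CHECKS = {
    "no duplicate business keys":
        "SELECT COUNT(*) FROM (SELECT order_id FROM fact_orders "
        "GROUP BY order_id HAVING COUNT(*) > 1) AS dup",
    "no orphaned customer keys":
        "SELECT COUNT(*) FROM fact_orders f LEFT JOIN dim_customer d "
        "ON f.customer_sk = d.customer_sk WHERE d.customer_sk IS NULL",
    "no null amounts":
        "SELECT COUNT(*) FROM fact_orders WHERE amount IS NULL",
}

failures = []
with engine.connect() as conn:
    for name, sql in CHECKS.items():
        offending = conn.execute(text(sql)).scalar()
        if offending:
            failures.append(f"{name}: {offending} offending rows")

if failures:
    raise AssertionError("Data-quality checks failed:\n" + "\n".join(failures))
print("All data-quality checks passed.")
```

A dictionary of named checks like this is easy to extend per table and slots naturally into a CI/CD pipeline step or a JIRA-linked nightly job.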
Posted 1 month ago