3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
- You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise.
- You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
- You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
- You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Technology -> Data Management - Data Integration -> Talend
Preferred Skills: Technology -> Data Management - Data Integration -> Talend
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
We are looking for a talented and motivated Data Engineer with 3 to 6 years of experience to join our data engineering team. The ideal candidate will have strong SQL skills, hands-on experience with Snowflake, ETL tools like Talend, DBT for transformation workflows, and a solid foundation in AWS cloud services.

Key Responsibilities
- Design, build, and maintain efficient and reliable data pipelines using SQL, Talend, and DBT
- Develop and optimize complex SQL queries for data extraction and transformation
- Manage and administer Snowflake data warehouse environments
- Collaborate with analytics, product, and engineering teams to understand data requirements
- Implement scalable data solutions on AWS (e.g., S3, Lambda, Glue, Redshift, EC2)
- Monitor and troubleshoot data workflows and ensure data quality and accuracy (see the sketch after this listing)
- Support deployment and version control of data models and transformations
- Write clear documentation and contribute to best practices

Required Skills and Qualifications
- 3 to 6 years of experience in data engineering or related fields
- Strong expertise in SQL and performance tuning of queries
- Hands-on experience with Snowflake (data modeling, security, performance tuning)
- Proficiency with Talend for ETL development
- Experience with DBT (Data Build Tool) for transformation workflows
- Good knowledge of AWS services, especially data-centric services (S3, Lambda, Glue, etc.)
- Familiarity with Git-based version control and CI/CD practices
- Strong analytical and problem-solving skills
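As a rough illustration of the data-quality monitoring this posting describes (an editorial sketch, not part of the listing), a Python script using the snowflake-connector-python package could reconcile row counts between a raw table and its transformed target. The account, credentials, and table names below are hypothetical.

```python
# Minimal sketch: row-count reconciliation in Snowflake.
# All identifiers (account, warehouse, RAW_ORDERS, FCT_ORDERS) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="...",         # fetch from a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM RAW_ORDERS")
    raw_count = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM FCT_ORDERS")
    fct_count = cur.fetchone()[0]
    # A mismatch here would flag a pipeline or transformation defect.
    if raw_count != fct_count:
        print(f"Row count mismatch: raw={raw_count}, transformed={fct_count}")
finally:
    conn.close()
```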
Posted 1 month ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Are you a data-driven professional with a knack for business intelligence and sales analytics? Do you excel at transforming complex datasets into actionable insights that drive business success? If yes, Marconix is looking for a Business Analyst (Sales Analyst) to join our team!

Location: Hyderabad
Salary: Up to ₹3 LPA (CTC)
Work Mode: Work from Office

Why Join Us?
- Work with a fast-growing and innovative sales solutions company
- Hands-on experience in business intelligence and sales analytics
- Opportunity to work with top-tier clients and industry leaders

Sales Data Management & Reporting
- Transform raw sales data into valuable business insights using BI tools (Tableau, Power BI, etc.)
- Develop and deploy robust reporting dashboards for tracking performance metrics
- Manage ETL (Extract, Transform, Load) processes to streamline data flow
- Analyze large datasets to identify market trends and business growth opportunities

Business Intelligence & Analytics
- Develop data-driven strategies to optimize sales performance
- Build predictive models to forecast sales trends and customer behavior
- Conduct deep-dive analysis of business performance and suggest data-backed improvements
- Work closely with stakeholders to understand their requirements and provide customized analytical solutions

Client & Team Management
- Act as the primary liaison between business and technical teams
- Gather and analyze business requirements to enhance operational efficiency
- Provide strategic recommendations to clients and internal teams based on data insights

What We Expect from You:
Educational Background: B.Tech / BE / BCA / BSc in Computer Science, Engineering, or a related field
Experience: 2+ years as a Business Analyst focusing on sales reporting & analytics

Must-Have Skills:
- Strong expertise in BI tools (Tableau, Power BI, Oracle BI)
- Hands-on experience with ETL processes (Informatica, Talend, Teradata, Jasper, etc.)
- Solid understanding of data modeling, data analytics, and business reporting
- Excellent client management & stakeholder communication skills
- Strong analytical and problem-solving mindset

Bonus Skills (Preferred but Not Mandatory):
- Experience in sales process automation & CRM analytics
- Exposure to AI & Machine Learning in sales analytics
Posted 1 month ago
3.0 - 5.0 years
10 - 20 Lacs
Pune
Work from Office
PharmaACE is a growing global healthcare consulting firm, headquartered in Princeton, New Jersey. Our expert teams of Business Analysts, based across the US, Canada, Europe, and India, provide analytics and business solutions using our worldwide delivery models for a wide range of clients. Our clients include established, multinational BioPharma leaders and innovators, as well as entrepreneurial firms on the cutting edge of science. We have deep expertise in Forecasting, Business Analytics, Competitive Intelligence, Sales Analytics, and the Analytics Center of Excellence Model. Our wealth of therapeutic area experience cuts across Oncology, Immuno-science, CNS, CV-Met, and Rare Diseases. We support our clients' needs in Primary Care, Specialty Care, and Hospital business units, and we have managed portfolios in the Biologics space, Branded Pharmaceuticals, Generics, APIs, Diagnostics, and Packaging & Delivery Systems.

Responsibilities:
- Work closely with business teams/stakeholders across the pharmaceutical value chain and develop reports and dashboards that tell a story.
- Recommend KPIs and help generate custom analysis and insights.
- Propose new visualization ideas for our customers, keeping the audience type in mind.
- Design Tableau dashboards and reports that are self-explanatory; keep the user at the center while designing, thereby enhancing the user experience.
- Gather requirements while working closely with our global clients.
- Mentor other developers in the team on Tableau-related technical challenges.
- Propagate Tableau best practices within and across the team.
- Set up reports that can be maintained with ease and are scalable to other use cases.
- Interact with the AI/ML team and incorporate new ideas into the final deliverables for the client.
- Work closely with cross-functional teams such as Advanced Analytics, Competitive Intelligence, and Forecasting.
- Develop and foster client relationships and serve as a point of contact for projects.

Qualifications and Areas of Expertise:
- Educational qualification: BE/BTech/MTech/MCA from a reputed institute.
- Minimum 3-5 years of experience.
- Proficient with tools including Tableau Desktop, Tableau Server, MySQL, MS Excel, and ETL tools (Alteryx, Tableau Prep, or Talend).
- Knowledge of SQL.
- Experience in advanced LOD calculations, custom visualizations, and data cleaning and restructuring.
- Strong analytical and problem-solving skills with the ability to question facts.
- Excellent written and oral communication skills.

Nice to have:
- A valid US business visa.
- Hands-on experience in Tableau, Python, and R.
- Hands-on experience with Qlik Sense and Power BI.
- Experience with pharma/healthcare data.
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview
Viraaj HR Solutions is a dynamic HR consulting firm dedicated to optimizing human resource management and development. Our mission is to bridge the gap between talent and opportunity, driving growth and success for both our clients and candidates. We foster a culture of collaboration, innovation, and integrity, consistently striving to deliver exceptional service in the evolving landscape of human resources.

Role Responsibilities
- Design, develop, and implement ETL processes using Talend.
- Collaborate with data analysts and stakeholders to understand data requirements.
- Perform data cleansing and transformation tasks.
- Optimize and automate existing data integration workflows.
- Monitor ETL jobs and troubleshoot issues as they arise.
- Conduct performance tuning of Talend jobs for efficiency.
- Document ETL processes and maintain technical documentation.
- Work closely with cross-functional teams to support data needs.
- Ensure data integrity and accuracy throughout the ETL process.
- Stay updated on Talend best practices and upcoming features.
- Assist in the migration of data from legacy systems to new platforms.
- Participate in code reviews to ensure code quality and adherence to standards.
- Engage in user training and support as necessary.
- Provide post-implementation support for deployed solutions.
- Evaluate and implement new data tools and technologies.

Qualifications
- 3+ years of experience as a Talend Developer.
- Strong understanding of ETL principles and practices.
- Proficiency in Talend Open Studio.
- Hands-on experience with SQL and database management.
- Familiarity with data warehousing concepts.
- Experience using Java for Talend scripting.
- Knowledge of APIs and web services.
- Effective problem-solving skills.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team.
- Attention to detail and accuracy in data handling.
- Experience with job scheduling tools.
- Ability to manage multiple priorities and deadlines.
- Knowledge of data modeling concepts.
- Experience in documentation and process mapping.

Skills: data cleansing, data warehousing, job scheduling tools, problem solving, team collaboration, SQL, documentation, Talend Open Studio, Talend, data transformation, data modeling, performance tuning, web services, API development, Java, APIs, data integration, ETL processes
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Job Title: ETL Testing – Python & SQL
Candidate Specification: 5+ years; open to the 1 PM to 10 PM shift. ETL (Python): all 5 days work from office; ETL (SQL): hybrid.
Location: Chennai

- Experience in ETL testing or data warehouse testing.
- Strong in SQL Server, MySQL, or Snowflake.
- Strong in scripting languages, particularly Python.
- Strong understanding of data warehousing concepts, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling.
- Proficient in writing SQL queries for data validation and reconciliation.
- Experience with testing tools such as HP ALM, JIRA, TestRail, or similar.
- Excellent problem-solving skills and attention to detail.

Role: ETL Testing
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Bachelor's Degree
Employment Type: Full Time, Permanent
Key Skills: ETL, Python, SQL
Job Code: GO/JC/185/2025
Recruiter Name: Sheena Rakesh
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Job Title: Data Engineer
Candidate Specification: 5+ years; immediate to 30 days notice. All 5 days work from office, 9 hours.

- Experience with any modern ETL tools (PySpark, EMR, Glue, or others).
- Experience in AWS; programming knowledge in Python, Java, and Snowflake.
- Experience in DBT and StreamSets (or similar tools like Informatica or Talend), with migration work done in the past.
- Agile experience is required, with VersionOne or Jira tool expertise.
- Provide hands-on technical solutions to business challenges and translate them into process/technical solutions.
- Good knowledge of CI/CD and DevOps principles.
- Experience in data technologies: Hadoop, plus PySpark or Scala (any one).

Role: Data Engineer
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: B Tech
Employment Type: Full Time, Permanent
Key Skills: PySpark, EMR, Glue, ETL tools, AWS, CI/CD, DevOps
Job Code: GO/JC/102/2025
Recruiter Name: Sheena Rakesh
Posted 1 month ago
5.0 - 7.0 years
8 - 14 Lacs
Chennai
Work from Office
Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. This role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Teamcenter, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives.

Key Responsibilities:
- Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of materials, service parts, supplier and dealer master data).
- Process Mapping & Design: Document and design master data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility.
- Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy.
- Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives.
- Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences.
- Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components.
- Project Support: Assist MDM leadership in planning, tracking, and executing project milestones, adapting to evolving client needs.

Required Skills & Qualifications:
- 5+ years of experience as a Business Analyst, with a focus on data governance and master data management (MDM) platforms such as Talend, Informatica, Reltio, etc.
- Proven track record of working on automotive/manufacturing industry projects, ideally with exposure to systems like 3DS, Teamcenter, SAP S/4HANA, MDG, or Blue Yonder.

Technical Knowledge:
- Strong understanding of MDM concepts, data flows, and governance frameworks.
- Familiarity with automotive-specific data domains (e.g., ECCMA/E-Class schema).
- Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards.

Soft Skills:
- Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops.
- Methodical and structured approach to problem-solving and project delivery.
- Ability to summarize complex information into clear, concise recommendations.

Education: Bachelor's degree in Business, Information Systems, or a related field (or equivalent experience).
Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required.

Preferred Qualifications:
- Prior consulting experience in a client-facing role.
- Hands-on experience with MDG, Talend, Informatica, Reltio, or similar MDM platforms.
- Exposure to data quality analysis or profiling (not required at a Data Analyst level).
Posted 1 month ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us
At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team—comprising solution architects, data scientists, engineers, product managers, and designers—collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead, letting you focus on your core while we accelerate your growth.

Responsibilities & Qualifications
- Data Architecture Design: Develop and implement scalable and efficient data architectures for batch and real-time data processing. Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.
- ETL/ELT Pipelines: Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources. Ensure pipelines are highly performant, secure, and resilient enough to handle large volumes of structured and semi-structured data.
- Data Quality and Governance: Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets. Implement data cataloging and lineage tracking for enterprise-wide data transparency.
- Collaboration with Teams: Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting. Partner with software engineering teams to integrate data pipelines into applications and services.
- Cloud Data Solutions: Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake. Optimize cloud infrastructure costs while maintaining high performance.
- Data Automation and Workflow Orchestration: Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs (see the sketch after this listing). Develop monitoring systems to proactively detect and resolve pipeline failures.
- Innovation and Leadership: Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency. Mentor junior engineers, fostering a culture of excellence and innovation.

Required Skills:
- Experience: 7+ years of overall experience in data engineering roles, with at least 2+ years in a leadership capacity. Proven expertise in designing and deploying large-scale data systems and pipelines.
- Technical Skills: Proficiency in Python, Java, or Scala for data engineering tasks. Strong SQL skills for querying and optimizing large datasets. Experience with data processing frameworks like Apache Spark, Beam, or Flink. Hands-on experience with ETL tools like Apache NiFi, dbt, or Talend. Experience in pub/sub and stream processing using Kafka, Kinesis, or the like.
- Cloud Platforms: Expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services.
- Data Modeling: Strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas).
- Collaboration: Proven ability to work with cross-functional teams and translate business requirements into technical solutions.

Preferred Skills:
- Familiarity with data visualization tools like Tableau or Power BI to support reporting teams.
- Knowledge of MLOps pipelines and collaboration with data scientists.
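The workflow-orchestration bullet above refers to the sketch below: a minimal Apache Airflow 2.x DAG that schedules a daily extract-transform-load sequence. This is an editorial illustration, not Particleblack code; the DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow 2.x sketch: three dependent tasks run daily.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")         # placeholder logic

def transform():
    print("clean and reshape the extracted data")  # placeholder logic

def load():
    print("write results to the warehouse")        # placeholder logic

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # scheduling argument in Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load  # enforce run order
```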
Posted 1 month ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Infosys BPM Limited is hiring for an Analyst role at the Pune location.

Analyst - Supplier Master Data
Location: Pune
Shift: Rotational
Experience: JL4A - 6+ yrs

Role/Responsibilities:
- Due diligence (kdPrevent): vendor creation and item creation, including validation of vendor particulars.
- Compliance/maintenance of vendor particulars and risk assessment: technical risk, business continuity, limited sourcing risk, dependency risk, financial risk, sustainability risk, cyber security risk.
- Bank detail verification and update, compliance check (with kdPrevent), financial check (BvD database).
- Supplier code of conduct and anti-bribery questionnaire.
- Run the kdPrevent report, manage the result, store on ERP.
- Receive vendor registration supporting documents (including code of conduct and anti-bribery questionnaire responses), review, confirm, and store on Shared Google at entity level.
- Create the vendor on ERP, using information supplied and validated.
- MDM strategy and data cleansing strategy development; experience in classification, master data management (material, vendor, and customer master enrichment/cleansing), de-duplication, etc.
- SAP functional knowledge: understanding of the overall SAP system structure; multi-tasking master data expert.
- Developing/participating in new solutions, tools, methodologies, and strategies in a growing MDM practice.
- Perform master data audits and validation to ensure conformance to business rules, standards, and metrics that meet business requirements.
- Data design documentation preparation, e.g., data model, data standards, CRUD matrix, IT system integration matrix.
- Drive a project implementation from due diligence to the final signoff from the client, maintaining the agreed SLAs.
- Perform pre-analysis activities such as classifying invoice and PO records to a specified hierarchy; conduct data reviews to ensure high quality and accuracy of reports on which analysis is performed.
- Project collaboration: work effectively both independently and as a member of cross-functional teams.

Skillset:
- Good understanding and work experience of master data and its impact on downstream processes.
- Minimum 8 years of professional experience in Master Data Management (key data objects: customer, vendor, material, product).
- Fluent verbal and written communication skills; presentation skills.
- Excellent data analysis and interpretation skills; proven skills in Excel modeling and analysis.
- Strong storytelling skills to deliver recommendations from data analysis; proven PowerPoint skills in a business-case presentation context.
- Knowledge and experience of key features of Master Data Management platforms (SAP ECC, SAP MDG, Talend, Stibo, Collibra, Informatica, Winshuttle, etc.).
- Participation in or experience with AI/ML-based projects will be an asset.
- Self-motivated and takes ownership.
- Project and team management skills.
- Skills in ERP, SQL, and data visualization.
- Interpersonal skills and thought leadership.
- Effective communication and maintaining professional relations with the client and Infosys.

If interested, please share your updated resume with the details below to merlin.varghese@infosys.com:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
Willing to Work from Office:
Flexible with night shifts:
Flexible to Relocate to Pune (if any):

Regards,
Infosys BPM Team
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Title: Azure Data Engineer
Location: Remote
Employment Type: Full-time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For
- 3+ years of experience in data engineering or analytics engineering
- Hands-on experience with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience with modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling
- Solid knowledge of data warehouse best practices, development standards, and methodologies
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL (see the sketch after this listing)
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment
- Excellent communication and teamwork abilities

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge
- SAP ECC / S/4HANA knowledge
- Intermediate knowledge of Power BI
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
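The PySpark bullet above points to this sketch: a minimal batch transformation of the sort such pipelines perform. It is an editorial illustration under assumed inputs, not BayOne code; the storage paths and column names are hypothetical.

```python
# Minimal PySpark sketch: dedupe raw orders and build a daily aggregate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Read raw data from a landing zone (hypothetical path).
orders = spark.read.parquet("/mnt/landing/orders")

daily = (
    orders
    .dropDuplicates(["order_id"])                     # basic cleanup
    .withColumn("order_date", F.to_date("order_ts"))  # derive a partition key
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("order_id").alias("order_count"),
    )
)

# Write a partitioned result to the curated zone (hypothetical path).
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "/mnt/curated/daily_orders"
)
```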
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview
Team Geek Solutions is a forward-thinking technology services provider dedicated to delivering innovative solutions that drive business success. Situated at the intersection of technology and creativity, we believe in fostering a collaborative environment that encourages growth and development. Our mission is to empower organizations through data-driven strategies and state-of-the-art technology solutions. We value teamwork, integrity, and excellence, ensuring our culture nurtures talent and inspires collective achievement.

Job Title: Talend ETL Developer
Job Location: Mumbai, Pune

Role Responsibilities
- Design, develop, and maintain ETL processes using Talend.
- Implement data integration solutions to consolidate data from various systems.
- Collaborate with business stakeholders to understand data requirements.
- Develop and optimize SQL queries to extract and manipulate data.
- Perform data profiling and analysis to ensure data accuracy and quality.
- Monitor and troubleshoot ETL jobs to ensure smooth data flow.
- Maintain documentation for ETL processes and data model designs.
- Work with team members to design and enhance data warehouses.
- Develop data transformation logic to meet business needs.
- Ensure compliance with data governance and security policies.
- Participate in code reviews and contribute to team knowledge sharing.
- Support data migration initiatives during system upgrades.
- Utilize Agile methodology for project management and delivery.
- Manage workflow scheduling and execution of ETL tasks.
- Provide technical support and training to team members as needed.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer; Talend experience is mandatory.
- Strong understanding of ETL frameworks and data integration principles.
- Proficient in writing and troubleshooting SQL queries.
- Experience in data modeling and database design.
- Familiarity with data quality assessment methodologies.
- Ability to analyze complex data sets and provide actionable insights.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Ability to work collaboratively in a team-oriented environment.
- Knowledge of data warehousing concepts and best practices.
- Experience with Agile development methodologies is a plus.
- Willingness to learn new technologies and methodologies.
- Detail-oriented with a commitment to delivering high-quality solutions.
- Ability to manage multiple tasks and deadlines effectively.
- Experience with performance tuning and optimization of ETL jobs.

Skills: data warehousing, troubleshooting, ETL processes, workflow management, ETL, data modeling, SQL, performance tuning, data profiling and analysis, data governance, data integration, Agile methodology, Talend Open Studio, Talend
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Coimbatore
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines for efficient data processing.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement ETL processes to extract, transform, and load data.
- Collaborate with cross-functional teams to optimize data solutions.
- Conduct data analysis to identify trends and insights.

Professional & Technical Skills:
- Must-have skills: proficiency in Talend ETL.
- Strong understanding of data integration and ETL processes.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying languages.
- Hands-on experience with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Posted 1 month ago
2.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement ETL processes using the Talend ETL tool.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Optimize and troubleshoot ETL processes for performance and efficiency.
- Create and maintain technical documentation for ETL processes.
- Assist in testing and debugging ETL processes to ensure data accuracy.

Professional & Technical Skills:
- Must-have skills: proficiency in Talend ETL.
- Strong understanding of data integration concepts.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying.
- Familiarity with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: An Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: Overall 7+ years of industry experience, including 4 years as a developer using big data technologies like Databricks/Spark and Hadoop ecosystems. Hands-on experience with Unified Data Analytics with Databricks: the Databricks workspace user interface, managing Databricks notebooks, Delta Lake with Python, and Delta Lake with Spark SQL. Good understanding of Spark architecture with Databricks and Structured Streaming; setting up a cloud platform with Databricks and the Databricks workspace. Working knowledge of distributed processing, data warehouse concepts, NoSQL, processing of very large data volumes, RDBMS, testing, data management principles, data mining, and data modelling. As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data pipelines and data platform components.
- Ensure data quality and integrity by implementing data validation and testing procedures.

Professional & Technical Skills:
- Must-have skills: experience with Databricks Unified Data Analytics Platform; strong understanding of data modeling and database design principles.
- Good-to-have skills: experience with Apache Spark and Hadoop; experience with cloud-based data platforms such as AWS or Azure.
- Proficiency in programming languages such as Python or Java.
- Experience with data integration and ETL tools such as Apache NiFi or Talend.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai, Bengaluru, Hyderabad, and Pune offices.
Posted 1 month ago
5.0 - 10.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Quality Engineer (Tester), you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve continuous testing for security, API, and regression suites. You will create the automation strategy and automated scripts, and support data and environment configuration. Additionally, you will participate in code reviews, and monitor and report defects to support continuous improvement activities for the end-to-end testing process.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Conduct thorough testing of data warehouse ETL processes.
- Develop and execute test cases, test plans, and test scripts.
- Identify and document defects, issues, and risks.
- Collaborate with cross-functional teams to ensure quality standards are met.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and methodologies.
- Experience in testing data integration, data migration, and data transformation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- 15 years full time education is required.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: PySpark, Apache Spark, Talend ETL
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams, make team decisions, and provide solutions to problems. Engaging with multiple teams, you will contribute to key decisions and provide solutions for your immediate team and across multiple teams. In this role, you will have the opportunity to showcase your creativity and technical expertise in designing and building applications.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with cross-functional teams to gather and define application requirements.
- Develop and implement software solutions using the Databricks Unified Data Analytics Platform.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug applications to identify and resolve issues.
- Optimize application performance and ensure scalability.
- Document technical specifications and user manuals for applications.
- Stay updated with emerging technologies and industry trends.
- Train and mentor junior developers to enhance their technical skills.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with PySpark, Apache Spark, and Talend ETL.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Hyderabad
Work from Office
We are hiring a Data Engineer for a US-based IT company located in Hyderabad. Candidates with a minimum of 5 years of experience in data engineering can apply. This is a 1-year contract role.

Job Title: Data Engineer
Location: Hyderabad
CTC: Up to 20 LPA
Experience: 5+ Years

Job Overview:
We are looking for a seasoned Senior Data Engineer with deep hands-on experience in Talend and IBM DataStage to join our growing enterprise data team. This role will focus on designing and optimizing complex data integration solutions that support enterprise-wide analytics, reporting, and compliance initiatives. In this senior-level position, you will collaborate with data architects, analysts, and key stakeholders to facilitate large-scale data movement, enhance data quality, and uphold governance and security protocols.

Key Responsibilities:
- Develop, maintain, and enhance scalable ETL pipelines using Talend and IBM DataStage
- Partner with data architects and analysts to deliver efficient and reliable data integration solutions
- Review and optimize existing ETL workflows for performance, scalability, and reliability
- Consolidate data from multiple sources, both structured and unstructured, into data lakes and enterprise platforms
- Implement rigorous data validation and quality assurance procedures to ensure data accuracy and integrity
- Adhere to best practices for ETL development, including source control and automated deployment
- Maintain clear and comprehensive documentation of data processes, mappings, and transformation rules
- Support enterprise initiatives around data migration, modernization, and cloud transformation
- Mentor junior engineers and participate in code reviews and team learning sessions

Required Qualifications:
- Minimum 5 years of experience in data engineering or ETL development
- Proficient with Talend (Open Studio and/or Talend Cloud) and IBM DataStage
- Strong skills in SQL, data profiling, and performance tuning
- Experience handling large datasets and complex data workflows
- Solid understanding of data warehousing, data modeling, and data lake architecture
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines
- Strong analytical and troubleshooting skills
- Effective verbal and written communication, with strong documentation habits

Preferred Qualifications:
- Prior experience in banking or financial services
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud
- Knowledge of data governance tools (e.g., Collibra, Alation)
- Awareness of data privacy regulations (e.g., GDPR, CCPA)
- Experience working in Agile/Scrum environments

For further assistance, contact/WhatsApp 9354909518 or write to priya@gist.org.in.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Tableau Desktop Specialist; strong understanding of SQL for querying databases.
- Good to have: Python, Snowflake, statistics, ETL experience.
- Extensive knowledge of creating impactful visualizations using Tableau.
- Thorough understanding of SQL and advanced SQL (joins and relationships).

Preferred Technical and Professional Experience
- Experience working with different databases and blending and creating relationships in Tableau.
- Extensive knowledge of creating custom SQL to pull desired data from databases.
- Troubleshooting capabilities to debug data controls.
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Posted 1 month ago
5.0 - 8.0 years
11 - 18 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar.
- Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements.
- Load transformed data into target systems while ensuring data integrity and accuracy.
- Collaborate with data analysts and business stakeholders to understand data needs and requirements.
- Optimize ETL processes for enhanced performance and efficiency.
- Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues.
- Document ETL processes, data models, and workflows for future reference and team collaboration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in ETL development and data integration.
- Experience with big data technologies such as Hadoop or Spark.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services.
- Familiarity with data visualization tools such as Tableau or Power BI.
- Hands-on experience with Snowflake for data warehousing and analytics.
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Saras Analytics:
We are an ecommerce-focused, end-to-end data analytics firm assisting enterprises and brands in data-driven decision making to maximize business value. Our suite of work spans extraction, transformation, visualization, and analysis of data, delivered via industry-leading products, solutions, and services. Our flagship product is Daton, an ETL tool. We have now ventured into building exciting, easy-to-use data visualization solutions on top of Daton. And lastly, we have a world-class data team which understands the story the numbers are telling and articulates the same to CXOs, thereby creating value.

Where We Are Today:
We are a bootstrapped, profitable, and fast-growing (2x y-o-y) startup with old-school value systems. We play in a very exciting space, the intersection of data analytics and ecommerce, both of which are game changers. Today, the global economy faces headwinds forcing companies to downsize, outsource, and offshore, creating strong tailwinds for us. We are an employee-first company valuing and encouraging talent, and we live by those values at all stages of our work without compromising on the value we create for our customers. We strive to make Saras a career and not a job for talented folks who have chosen to work with us.

The Role:
We are seeking an accomplished Lead Data Engineer with strong programming skills, cloud expertise, and in-depth knowledge of BigQuery/Snowflake data warehousing technologies. As a key leader in our data engineering team, you will play a critical role in designing, implementing, and optimizing data pipelines, leveraging your expertise in programming, cloud platforms, and modern data warehousing solutions.

Responsibilities:
- Data Pipeline Architecture: Lead the design and architecture of scalable and efficient data pipelines, ensuring optimal performance and reliability.
- Programming and Scripting: Utilize strong programming skills, particularly in languages like Python, to develop robust and maintainable data engineering solutions.
- Cloud Platform Expertise: Apply extensive experience with cloud platforms (e.g., AWS, Azure, Google Cloud) to design, deploy, and optimize data engineering solutions in a cloud environment.
- BigQuery/Snowflake Knowledge: Demonstrate deep understanding and hands-on experience with BigQuery/Snowflake for efficient data storage, processing, and analysis.
- ETL Processes: Lead the development of Extract, Transform, Load (ETL) processes, ensuring seamless integration of data from various sources into the data warehouse.
- Data Modeling and Optimization: Design and implement effective data models to support ETL processes and ensure data integrity and efficiency.
- Collaboration and Leadership: Collaborate with cross-functional teams, providing technical leadership and guidance to junior data engineers. Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions.
- Quality Assurance: Implement comprehensive data quality checks and validation processes to ensure the accuracy and completeness of data.
- Documentation: Create and maintain detailed documentation for data engineering processes, data models, and cloud configurations.

Technical Skills:
- Programming Languages: Expertise in programming languages, with a strong emphasis on Python.
- Cloud Platforms: Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Big Data Technologies: Proficiency in big data technologies and frameworks for distributed computing.
- Data Warehousing: In-depth knowledge of modern data warehousing solutions, with specific expertise in BigQuery/Snowflake.
- ETL Tools: Experience with ETL tools like Apache NiFi, Talend, or similar.
- SQL: Strong proficiency in writing and optimizing SQL queries for data extraction, transformation, and loading.
- Collaboration Tools: Experience using collaboration and project management tools for effective communication and project tracking.

Soft Skills:
- Strong leadership and mentoring capabilities.
- Excellent communication and presentation skills.
- Strategic thinking and problem-solving abilities.
- Ability to work collaboratively in a cross-functional team environment.

Educational Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 8+ years of experience in data engineering roles with a focus on programming, cloud platforms, and data warehousing.

If you are an experienced Lead Data Engineer with a strong programming background, cloud expertise, and specific knowledge of BigQuery/Snowflake, we encourage you to apply. Please submit your resume and a cover letter highlighting your technical skills, leadership experience, and contributions to data engineering projects.
Posted 1 month ago
4.0 - 7.0 years
20 - 30 Lacs
Noida
Hybrid
Role & responsibilities
Collaborate with customers' Business and IT teams to define and gather solution requirements for custom development, B2B/ETL/EAI, and cloud integration initiatives using the Adeptia Integration Platform.
Analyze, interpret, and translate customer business needs into scalable and maintainable technical solution designs, aligned with best practices and the capabilities of the Adeptia platform.
Storyboard and present solutions to customers and prospects, ensuring a clear understanding of proposed designs and technical workflows.
Provide end-to-end project leadership, including planning, tracking deliverables, and coordinating efforts with offshore development teams as required.
Review implementation designs and provide architectural guidance and best practices to the implementation team to ensure high-quality execution.
Actively assist and mentor customers in configuring and implementing the Adeptia platform, ensuring alignment with technical and business objectives.
Offer expert recommendations on design and configuration to ensure successful deployment and long-term maintainability of customer solutions.
Define clear project requirements, create work breakdown structures, and establish realistic delivery timelines. Delegate tasks effectively, and manage progress against daily, weekly, and monthly targets, ensuring the team remains focused and productive.
Serve as a liaison among customers, internal stakeholders, and offshore teams to maintain alignment, track progress, and ensure delivery meets both quality and timeline expectations.
Monitor project baselines, identify and mitigate risks, and lead participation in all Agile ceremonies, including sprint grooming, planning, reviews, and retrospectives.
Maintain a hands-on technical role, contributing to development activities and conducting detailed code reviews to ensure technical soundness and optimal performance.
Take full ownership of assigned projects, driving them to successful, on-time delivery with high quality standards.
Preferred candidate profile
Proven experience in designing and developing integration solutions involving Cloud/SaaS applications, APIs, SDKs, and legacy systems.
Skilled in implementing SOA/EAI principles and integration patterns in B2B, ETL, EAI, and Cloud Integration using platforms such as Adeptia, Talend, MuleSoft, or similar tools.
Good hands-on experience with Core Java (version 8+) and widely used Java frameworks including Spring (version 6+) and Hibernate (version 6+).
Proficient in SOA, RESTful and SOAP web services, and related technologies including JMS, SAAJ, JAXP, and XML technologies (XSD, XPath, XSLT, parsing).
Strong command over SQL and RDBMS (e.g., Oracle, MySQL, PostgreSQL).
Solid understanding of Enterprise Service Bus (ESB) concepts and messaging technologies such as Kafka and RabbitMQ.
Familiar with transport protocols including HTTPS, Secure FTP, POP/IMAP/SMTP, and JDBC.
Skilled in working with Windows and Linux operating systems, and experienced with application servers such as JBoss, Jetty, and Tomcat.
Solid understanding of security best practices, including authentication, authorization, data encryption, and compliance frameworks relevant to enterprise integrations.
Basic understanding of modern JavaScript frameworks such as React, with the ability to collaborate effectively on front-end and full-stack development scenarios.
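On the Adeptia platform, integrations like those above are configured largely declaratively, so the sketch below is not Adeptia's API; it is a generic, hand-rolled Python illustration of the underlying B2B/EAI pattern: poll a source system over REST, map fields onto the target's schema, and deliver with retries. Both endpoints, the field mapping, and the backoff policy are assumptions.

import time

import requests

SOURCE_URL = "https://erp.example.com/api/invoices"  # hypothetical source system
TARGET_URL = "https://crm.example.com/api/invoices"  # hypothetical target system


def fetch_pending_invoices(session: requests.Session) -> list[dict]:
    # Extract: read unprocessed records from the source application.
    resp = session.get(SOURCE_URL, params={"status": "pending"}, timeout=30)
    resp.raise_for_status()
    return resp.json()


def push_with_retry(session: requests.Session, record: dict, retries: int = 3) -> None:
    # Load: deliver a record to the target, retrying transient failures.
    for attempt in range(1, retries + 1):
        try:
            resp = session.post(TARGET_URL, json=record, timeout=30)
            resp.raise_for_status()
            return
        except requests.RequestException:
            if attempt == retries:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff


def run_once() -> None:
    with requests.Session() as session:
        for invoice in fetch_pending_invoices(session):
            # Transform: map source field names onto the target's schema.
            payload = {"externalId": invoice["id"], "total": invoice["amount"]}
            push_with_retry(session, payload)


if __name__ == "__main__":
    run_once()

In higher-volume, loosely coupled designs, a message broker such as Kafka or RabbitMQ would replace the direct POST, which is why the profile above calls those technologies out.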
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Talend:
Design, develop, and document existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
AWS / Snowflake:
Design, develop, and maintain data models using SQL and Snowflake/AWS Redshift-specific features.
Collaborate with stakeholders to understand the requirements of the data warehouse.
Implement data security, privacy, and compliance measures.
Perform data analysis, troubleshoot data issues, and provide technical support to end users.
Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
Stay current with new AWS/Snowflake services and features and recommend improvements to the existing architecture.
Design and implement scalable, secure, and cost-effective cloud solutions using AWS/Snowflake services.
Collaborate with cross-functional teams to understand requirements and provide technical guidance.
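As a hedged sketch of the Snowflake side of this work, the Python snippet below uses the snowflake-connector-python package to create a simple dimension table and bulk-load staged files into it. The credentials, warehouse/database/schema names, table definition, and the @customer_stage stage are all placeholders, not details from this posting.

import os

import snowflake.connector

# Connection parameters come from the environment; the names are placeholders.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # A simple dimension table; the columns are illustrative only.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS dim_customer (
            customer_id   NUMBER        NOT NULL,
            customer_name VARCHAR       NOT NULL,
            region        VARCHAR,
            loaded_at     TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
        )
    """)
    # Bulk-load staged files (e.g., landed there by a Talend job) into the table.
    cur.execute("""
        COPY INTO dim_customer (customer_id, customer_name, region)
        FROM @customer_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
finally:
    conn.close()

On AWS Redshift the equivalent load would be a COPY from S3; the data modeling and security responsibilities above apply either way.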
Posted 1 month ago
1.0 years
0 Lacs
India
On-site
About Kinaxis:
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it’s really our people who give us the passion to always seek ways to do things better. As such, we’re serious about your career growth and professional development, because People matter at Kinaxis.
In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown to become a global organization with over 2000 employees around the world, and support 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers.
Our journey in India began in 2020 and we have been growing steadily since then! Building a high-trust and high-performance culture is important to us and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration.
About the team:
Location: Chennai, India (Hybrid)
This team is responsible for supporting data integration activities throughout the deployment of Kinaxis solutions. The job incumbent has a foundational level of technical and domain knowledge and can navigate Kinaxis’ technology solutions and processes for data management and integration. The team understands Kinaxis customers’ most pressing supply chain product offerings so that our customers can start to experience the immediate value of Kinaxis solutions.
About the role:
What you will do
Participate in deep-dive business requirements discovery sessions and develop integration requirements specifications, with guidance from senior consultants.
Demonstrate knowledge and proficiency in both the Kinaxis Maestro (RapidResponse) data model and REST-based API integration capabilities, and support identifying and implementing solutions best suited to individual data flows, under the guidance of senior consultants.
Assist in integration-related activities, including validation and testing of the solutions.
What we are looking for
Bachelor’s degree in Industrial Engineering, Supply Chain, Operations Research, Computer Science, Computer Engineering, Statistics, Information Technology, or a related field.
Passion for working in a collaborative team environment and able to demonstrate strong interpersonal, communication, and presentation skills.
1-3 years of experience in implementing or deploying software applications in the supply chain management space, or experience in data integration activities for enterprise-level systems.
Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment.
Self-starter who shows initiative in their work and learning and can excel in a fast-paced work environment.
Excellent problem-solving and critical thinking skills, able to synthesize a high volume of complex information to determine the best course of action.
Works well in a team environment and can work effectively with people at all levels in an organization.
Ability to communicate complex ideas effectively in English, both verbally and in writing.
Ability to work virtually.
Technologies we use
Technical skills such as SQL, R, JavaScript, Python, etc.
Experience working with relational databases and TypeScript, an asset.
Experience working with Maestro authoring, an asset.
Experience working with supply chain processes and manufacturing planning solutions such as Maestro, SAP, Oracle, or Blue Yonder applications to support supply chain activities.
Progressive experience with ETL tools such as Talend, OWB, SSIS, SAP Data Services, etc.
Some database-level experience extracting data from enterprise-class ERP systems, including SAP/APO, Oracle, and JDE.
#Intermediate #LI-RJ1 #Hybrid
Why join Kinaxis?:
Work With Impact: Our platform directly helps companies power the world’s supply chains. We see the results of what we do out in the world every day: when we see store shelves stocked, when medications are available for our loved ones, and so much more.
Work with Fortune 500 Brands: Companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed-Martin, and more.
Social Responsibility at Kinaxis: Our Diversity, Equity, and Inclusion Committee weighs in on hiring practices, talent assessment training materials, and mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do and we’re committed to a net-zero operations strategy for the long term. We are involved in our communities and support causes where we can make the most impact.
People matter at Kinaxis and these are some of the perks and benefits we created for our team:
Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
Flexible work options
Physical and mental well-being programs
Regularly scheduled virtual fitness classes
Mentorship programs and training and career development
Recognition programs and referral rewards
Hackathons
For more information, visit the Kinaxis web site at www.kinaxis.com or the company’s blog at http://blog.kinaxis.com.
Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses.
Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience, and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
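To make the data-integration skills above concrete, here is a small, self-contained Python sketch of a typical shaping step: extract order data from a relational source with SQL, aggregate demand by part and due date, and emit a flat file for downstream import. The tables, columns, and file layout are invented for illustration; an in-memory SQLite database stands in for an ERP source, and the actual Maestro import mechanism is not shown.

import csv
import io
import sqlite3

# Stand-in for an ERP source; in practice this would be SAP, Oracle, or JDE.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, part_no TEXT, qty INTEGER, due_date TEXT);
    INSERT INTO orders VALUES
        (1, 'P-100', 40, '2025-07-01'),
        (2, 'P-100', 10, '2025-07-01'),
        (3, 'P-200', 25, '2025-07-02');
""")

# Aggregate demand by part and due date, a typical shaping step before
# handing data to a supply chain planning tool.
rows = conn.execute("""
    SELECT part_no, due_date, SUM(qty) AS total_qty
    FROM orders
    GROUP BY part_no, due_date
    ORDER BY part_no, due_date
""").fetchall()

# Emit a CSV payload of the kind an integration job would pass along for import.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["PartNo", "DueDate", "Quantity"])
writer.writerows(rows)
print(buf.getvalue())

Running this collapses the two P-100 order lines into a single 50-unit row alongside the P-200 row, showing how duplicate demand lines are consolidated before import.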
Posted 1 month ago