8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
2+ years of implementation experience with Adobe Experience Cloud products, especially Adobe Experience Platform and Journey Optimizer
Expertise in deploying, configuring, and optimizing all major Experience Platform services and Journey Optimizer features
Strong SQL skills for querying datasets, implementing data transformations, cleansing data, etc.
Hands-on experience developing custom applications, workflows, and integrations using Experience Platform APIs (see the sketch after this listing)
Deep familiarity with Adobe Experience Platform and Journey Optimizer technical implementations, including:
- Setting up source connectors and ingesting data using the API and UI
- Configuring Experience Events (XDM schemas) for capturing data from sources
- Creating audience segments using custom and AI/ML-based segments
- Triggering journeys and activations based on segments and profiles
- Implementing journey automations, actions, and messages
- Integrating with destinations like CRM, email, etc.
Hands-on experience developing with the Platform APIs and SDKs in a language such as:
- JavaScript for API calls
- Java for extending functionality with custom connectors or applications
Expertise in data modelling for multi-channel needs using the Experience Data Model (XDM)
Familiarity with configuring IMS authentication to connect external systems to Platform APIs
Experience developing, debugging, and optimizing custom server-side applications
Proficiency in JavaScript, JSON, REST APIs, and SQL
Expertise ingesting data from sources like Adobe Analytics, CRM, ad servers, web, and mobile
Strong understanding of concepts like multi-channel data management, segmentation, orchestration, automation, and personalization
Understanding of data structures like JSON for representing data in APIs, data tables for processing tabular data, and streaming data flows
Experience automating technical tasks using tools like API integration tests, Postman for API testing, and Git for source control
Excellent documentation and communication skills with the ability to clearly present technical recommendations to customers
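The posting names JavaScript for API calls; for consistency with the other sketches on this page, here is the same idea in Python. It is a minimal sketch of an IMS-token-authenticated call that lists Catalog datasets. The endpoint path and header names follow Adobe's published Platform API conventions, but all credentials are placeholders and the exact response shape should be verified against current Adobe documentation.

```python
import requests

# Placeholder credentials, assumed to come from an Adobe IMS server-to-server flow.
ACCESS_TOKEN = "<ims-access-token>"
API_KEY = "<client-id>"
ORG_ID = "<ims-org-id>@AdobeOrg"

def list_datasets(limit: int = 5) -> dict:
    """Fetch a page of Catalog datasets from Adobe Experience Platform."""
    resp = requests.get(
        "https://platform.adobe.io/data/foundation/catalog/dataSets",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "x-api-key": API_KEY,
            "x-gw-ims-org-id": ORG_ID,
        },
        params={"limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    # Catalog returns a JSON object keyed by dataset ID (an assumption to verify).
    return resp.json()

if __name__ == "__main__":
    for dataset_id, meta in list_datasets().items():
        print(dataset_id, meta.get("name"))
```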
Posted 1 hour ago
2.0 - 5.0 years
13 - 17 Lacs
Gurugram
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Strong MS SQL and Azure Databricks experience
- Implement and manage data models in dbt, keeping data transformations aligned with business requirements
- Ingest raw, unstructured data into structured datasets in a cloud object store
- Utilize dbt to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting
- Write and optimize SQL queries within dbt to enhance data transformation processes and improve overall performance
Preferred technical and professional experience:
- Establish dbt best practices to improve performance, scalability, and reliability
- Design, develop, and maintain scalable data models and transformations using dbt in conjunction with Databricks
- Proven interpersonal skills while contributing to team effort by accomplishing related results as required
Posted 1 hour ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
The candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. He/she must be able to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, he/she must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Process Manager role and responsibilities:
- Process mapping and identifying non-value-add steps / friction points in the process
- Discover, monitor and improve processes by extracting and analysing knowledge from the event logs in a Process Mining/Celonis tool (an illustrative event-log analysis follows this listing)
- Work alongside both technical and non-technical stakeholders to understand business challenges, help design process mining initiatives and prioritize requests
- Act as the customer's key contact and guide them through revealing process trends, inefficiencies and bottlenecks in the business process
- Support validation of data (counts, values between source systems and Celonis)
- Work on process insights by creating KPIs and actions, identify process inefficiencies, and understand the root causes
- Develop workflows to monitor processes, detect anomalies and turn those insights into real-time automated preventive or corrective actions using Action Engine, Action Flows and other capabilities
Technical and functional skills:
- Bachelor's degree in Computer Science with 3+ years of work experience in data analytics, data mining and data transformation
- Very proficient in Celonis; should be able to build, manage, and extract value from Celonis models for various use cases: adding or modifying data sources, creating automated alerts, Action Engine, Transformation Center, Celonis ML Workbench
- Experience in SQL/PQL scripting and knowledge of data mining; should be able to apply complex queries to build transformations, e.g. joins, unions, window functions, etc.
- Knowledge of process improvement techniques/tools and Process Mining/Analytics
- Basic knowledge of Python scripting, including libraries such as NumPy, Pandas, Seaborn, Matplotlib, scikit-learn, etc.
- Experience in BI tools (e.g., Tableau, Power BI)
Nice to have:
- Strong communication and presentation skills
- Understanding of business processes
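The kind of event-log analysis this role describes can be illustrated outside Celonis with plain pandas. This is a minimal sketch, not Celonis-specific: the event-log shape (case ID, activity, timestamp) and column names are illustrative assumptions, and the two checks (case throughput time, skipped-approval cases) stand in for the bottleneck and friction analyses the posting mentions.

```python
import pandas as pd

# Toy event log in the shape process-mining tools expect:
# one row per activity execution, keyed by case ID.
events = pd.DataFrame({
    "case_id":   ["A", "A", "A", "B", "B"],
    "activity":  ["Create Order", "Approve", "Ship", "Create Order", "Ship"],
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-02 10:00", "2024-01-05 08:00",
        "2024-01-01 11:00", "2024-01-03 16:00",
    ]),
})

# Throughput time per case: last event minus first event.
throughput = (
    events.groupby("case_id")["timestamp"]
    .agg(start="min", end="max")
    .assign(days=lambda df: (df["end"] - df["start"]).dt.total_seconds() / 86400)
)
print(throughput)

# Cases that skipped the approval step: a simple conformance/friction check.
approved = events.loc[events["activity"] == "Approve", "case_id"]
print("Skipped approval:", set(events["case_id"]) - set(approved))
```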
Posted 2 hours ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Skills: Data Testing, GenRocket
Experience: 6+ years
Location: Mumbai, Pune, Bangalore
Job type: Contract to hire
- 4-5 years of hands-on experience with the GenRocket tool
- Strong SQL knowledge with the ability to write complex queries (e.g. left and right joins)
- Strong knowledge of SQL Server, MSSQL, Microsoft Azure, ADF and Synapse for database validation
- Intermediate knowledge of ETL transformations, workflows and STTM mappings (source-to-target data mappings)
- Strong knowledge of PowerShell scripting
- Ability to test data validation and data transformation from source to target (a reconciliation sketch follows this listing)
- Data validation: validating data sources, extracting data, and applying transformation logic
- Test planning and execution: defining testing scope, preparing test cases and test conditions, test data preparation
- Coordinating test activities with Dev, BA and DBA teams and conducting defect triage for resolution of issues
- Test quality: ensuring the quality of their own work and the work of the development team
- QA test documentation: creating and maintaining test plans and test deliverables such as QA estimates, RTM (requirement traceability matrix), peer reviews and QA sign-off documents
- Hands-on experience working with ADO/JIRA for test management, reporting defects and dashboard creation
- Ability to identify and report risks and provide mitigation plans; coordinate with internal and external teams for completion of activities
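Source-to-target validation of the kind described here usually starts with paired reconciliation queries. Below is a minimal sketch using SQLAlchemy; the DSNs, table names, and check queries are all placeholders to be driven from the actual STTM mappings.

```python
from sqlalchemy import create_engine, text

# Connection strings are placeholders; swap in real source/target DSNs.
source = create_engine("mssql+pyodbc://user:pass@source_dsn")
target = create_engine("mssql+pyodbc://user:pass@target_dsn")

CHECKS = [
    # (friendly name, source query, target query) pairs, derived from the STTM.
    ("orders row count",
     "SELECT COUNT(*) FROM dbo.orders",
     "SELECT COUNT(*) FROM staging.orders"),
    ("orders total amount",
     "SELECT SUM(amount) FROM dbo.orders",
     "SELECT SUM(amount) FROM staging.orders"),
]

def scalar(engine, sql):
    """Run a single-value query and return the scalar result."""
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()

for name, src_sql, tgt_sql in CHECKS:
    src, tgt = scalar(source, src_sql), scalar(target, tgt_sql)
    status = "PASS" if src == tgt else "FAIL"
    print(f"{status}: {name} (source={src}, target={tgt})")
```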
Posted 2 hours ago
6.0 - 9.0 years
27 - 42 Lacs
Pune
Work from Office
About the role: As a Big Data Engineer, you will make an impact by identifying and closing consulting services in the major UK banks. You will be a valued member of the BFSI team and work collaboratively with your manager, primary team and other stakeholders in the unit.
In this role, you will:
- Collaborate with cross-functional teams to improve data ingestion, transformation, and validation workflows
- Work closely with Data Engineers, Architects, and Analysts to understand data reconciliation requirements
- Develop and implement PySpark programs to process large datasets in Big Data platforms
- Analyze and comprehend existing data ingestion and reconciliation frameworks
- Perform complex transformations including reconciliation and advanced data manipulations
- Fine-tune Spark jobs for performance optimization, ensuring efficient data processing at scale
Work model: We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3 days a week in a client or Cognizant office in the Pune/Hyderabad location. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
What you must have to be considered:
- Design and implement data pipelines, ETL processes, and data storage solutions that support data-intensive applications
- Extensive hands-on experience with Python and PySpark
- Strong grasp of data warehousing concepts; well versed in processing structured and semi-structured data (JSON, XML, Avro, Parquet) with Spark/PySpark data pipelines
- Experience working with large-scale distributed data processing, and a solid understanding of Big Data architecture and distributed computing frameworks
- Proficiency in Python and the Spark DataFrame API, and strong experience in complex data transformations using PySpark
These will help you stand out:
- Able to leverage Python libraries such as cryptography or pycryptodome along with PySpark's user-defined functions (UDFs) to encrypt and decrypt data within your Spark workflows (see the sketch after this listing)
- Experience with data risk metrics in PySpark; excellent at data partitioning, Z-value generation, query optimization, spatial data processing and optimization
- Experience with CI/CD for data pipelines
- Working experience in any of the cloud environments: AWS/Azure/GCP
- Proven experience in an Agile/Scrum team environment
- Experience in development of loosely coupled API-based systems
We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
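The cryptography-plus-UDF item above can be made concrete with a small PySpark sketch using Fernet from the cryptography library. The DataFrame contents and column names are invented for illustration, and a real pipeline would pull the key from a secrets manager rather than generate it inline.

```python
from cryptography.fernet import Fernet
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("encrypt-demo").getOrCreate()

# Demo-only key; in production this would come from a secrets manager.
KEY = Fernet.generate_key()

# Construct the Fernet object inside the UDF from the captured key bytes so the
# closure ships cleanly to executors; fine for a sketch, cache it for speed.
encrypt_udf = F.udf(
    lambda v: Fernet(KEY).encrypt(v.encode()).decode() if v is not None else None,
    StringType(),
)
decrypt_udf = F.udf(
    lambda v: Fernet(KEY).decrypt(v.encode()).decode() if v is not None else None,
    StringType(),
)

df = spark.createDataFrame([("alice@example.com",), ("bob@example.com",)], ["email"])
enc = df.withColumn("email_enc", encrypt_udf(F.col("email")))
enc.withColumn("email_roundtrip", decrypt_udf(F.col("email_enc"))).show(truncate=False)
```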
Posted 3 hours ago
6.0 - 9.0 years
1 - 5 Lacs
Hyderabad
Hybrid
Immediate openings for Business Analyst / Data Analyst (Financial Crime) - Pune - contract to hire
Experience: 6+ years
Skill: Business Analyst, Financial Crime
Location: PAN India
Notice period: Immediate
Employment mode: Contract to hire
Working mode: Hybrid
Job description:
- Financial Crime knowledge is highly desirable but less critical to the role
- Knowledge of managing customer data across a complex environment with multiple customer masters is critical
- Understanding of the challenges of a distributed data environment, how to navigate it and how to effectively consolidate it
- Some experience in implementation planning (via solution design iterations) in relation to new complex data models
- Ability to communicate well with senior stakeholders, including influencing and driving adoption of ideas
- Hands-on DA: good with data systems, data quality and data transformation; willing to dig in and investigate
- Understanding of lineage, data quality, data criticality and governance
- Data model implementation strategy
Posted 3 hours ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad, Bengaluru
Work from Office
Skill: Snowflake Developer with dbt (Data Build Tool), ADF and Python
Job Description: We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting will be an added advantage.
Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration
- Implement data transformations, testing, and documentation using dbt
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security
- Troubleshoot data-related issues
- (Optional) Utilize Python for scripting, automation, and data processing tasks (a sketch follows this listing)
Required Skills & Qualifications:
- Experience in data warehousing with a strong understanding of best practices
- Hands-on experience with Snowflake (data modeling, query optimization)
- Proficiency in Azure Data Factory (ADF) for data pipeline development
- Strong working knowledge of dbt (Data Build Tool) for data transformations
- (Optional) Experience in Python scripting for automation and data manipulation
- Good understanding of SQL and query optimization techniques
- Experience in cloud-based data solutions (Azure)
- Strong problem-solving skills and ability to work in a fast-paced environment
- Experience with CI/CD pipelines for data engineering
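A typical Python automation task in a Snowflake/dbt pipeline is a pre-flight data check before transformations run. This is a minimal sketch using the snowflake-connector-python package; the connection parameters, table, and load-date column are placeholders.

```python
import snowflake.connector

# Placeholder credentials; in a pipeline these would come from a vault or env vars.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

try:
    cur = conn.cursor()
    # Verify yesterday's load landed before kicking off downstream dbt models.
    cur.execute(
        "SELECT COUNT(*) FROM raw.orders WHERE load_date = CURRENT_DATE - 1"
    )
    (row_count,) = cur.fetchone()
    print(f"Rows loaded yesterday: {row_count}")
    if row_count == 0:
        raise RuntimeError("No rows landed; halting downstream transformations")
finally:
    conn.close()
```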
Posted 3 hours ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP CPI for Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Facilitate knowledge sharing sessions to enhance team capabilities
- Monitor project progress and ensure timely delivery of milestones
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP CPI for Data Services
- Strong understanding of application development methodologies
- Experience with integration tools and techniques
- Familiarity with data transformation and migration processes
- Ability to troubleshoot and resolve application issues efficiently
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP CPI for Data Services
- This position is based at our Bengaluru office
- A 15 years full time education is required
Posted 21 hours ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse, Manual Testing
Good to have skills: Data Engineering
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.
Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on dbt (Core and Cloud)
- Played a key role in dbt-related discussions with teams and clients to understand business problems and solutioning requirements
- As a dbt SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
- Spearhead teams to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities
Technical Experience:
- Strong experience working as a Snowflake-on-cloud dbt Data Architect with thorough knowledge of the different services
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using dbt
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- dbt (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, dbt modeling (.sql or .py files) and dbt job scheduling on at least 2 projects
- Knowledge of the Jinja template language (macros) would be an added advantage
- Knowledge of special features like dbt documentation, semantic layer creation, webhooks, etc.
- dbt and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI (a sketch follows this listing)
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear
Professional Attributes:
- Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop oneself on an ongoing basis
- Structured communication: written, verbal and presentational
Educational Qualification: MBA (Technology/Data-related specializations)/MCA/Advanced degrees in STEM
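One way the Cortex item above can look in practice is calling Snowflake's hosted LLM functions from Snowpark. This is a minimal sketch under stated assumptions: the connection parameters, table, and column are placeholders, the model name is illustrative, and the SNOWFLAKE.CORTEX.COMPLETE SQL function should be checked against your account's Cortex availability.

```python
from snowflake.snowpark import Session

# Connection parameters are placeholders.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<wh>", "database": "<db>", "schema": "<schema>",
}).create()

# CORTEX.COMPLETE is Snowflake's SQL entry point to hosted LLMs.
df = session.sql(
    """
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
    FROM support_tickets
    LIMIT 5
    """
)
df.show()
```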
Posted 21 hours ago
4.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Minimum 3.5 years of experience with end-to-end MuleSoft integration (Anypoint Platform) across various systems/applications: SaaS, legacy systems, databases, and web services (SOAP and REST)
- Knowledge of integration design patterns
- Hands-on experience using Mule connectors like Salesforce, FTP, FILE, SFTP, IMAP, Database, HTTP, etc.
- Experience in developing middle-tier applications using ESB Mule (API and batch processing)
- Experience in RDBMS SQL queries, functions and stored procedures
- Strong knowledge of data transformations using MuleSoft DataWeave and exception handling
- Hands-on experience with Mule 4, RAML 1.0, Maven, MUnits, and the current version of MuleSoft Anypoint Studio and Anypoint Platform in a cloud, on-prem or Runtime Fabric implementation
- Security, logging, auditing, policy management, and performance monitoring and KPIs for end-to-end process execution
- Experience with MuleSoft and Java integration; basic knowledge of Java
- Intermediate-level knowledge of web services technologies (XML, SOAP, REST, XSLT) and cloud APIs
- Basic knowledge of Salesforce in a cloud implementation
Other Qualifications:
- Familiarity with Agile (Scrum) project management methodology is nice to have
- Familiarity with Salesforce in a cloud implementation
- Familiarity with the Microsoft Office suite including Visio and draw.io
If you are interested, please share the below details and your updated resume:
First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding any offer
Posted 22 hours ago
3.0 - 7.0 years
11 - 16 Lacs
Gurugram
Work from Office
Project description: We are looking for a star Python Developer who is not afraid of work and challenges! Gladly becoming a partner with a famous financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client.
Responsibilities:
- Analyse existing SAS DI pipelines and SQL-based transformations
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark (see the sketch after this listing)
- Develop and maintain scalable ETL pipelines using Python on AWS EMR
- Implement data transformation, cleansing, and aggregation logic to support business requirements
- Design modular and reusable code for distributed data processing tasks on EMR clusters
- Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau
- Develop Tableau reports for business reporting
Skills
Must have:
- 6+ years of experience in ETL development, with at least 5 years working with AWS EMR
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field
- Proficiency in Python for data processing and scripting
- Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica)
- Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue
- Familiarity with data storage systems such as Snowflake or RDS
- Excellent communication skills and ability to work collaboratively in a team environment
- Strong problem-solving skills and ability to work independently
Nice to have: N/A
Other Languages: English B2 Upper Intermediate
Seniority: Senior
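To illustrate the SAS-to-PySpark translation this role centers on, here is a minimal sketch. The SAS PROC SQL block in the comment is a hypothetical example of the legacy logic, and the S3 paths, table, and column names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration-demo").getOrCreate()

# Hypothetical SAS logic being replaced:
#   PROC SQL;
#     CREATE TABLE summary AS
#     SELECT region, SUM(amount) AS total_amount, COUNT(*) AS n_txn
#     FROM transactions
#     WHERE status = 'POSTED'
#     GROUP BY region;
#   QUIT;
transactions = spark.read.parquet("s3://bucket/transactions/")  # placeholder path

summary = (
    transactions
    .filter(F.col("status") == "POSTED")   # WHERE clause
    .groupBy("region")                     # GROUP BY
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("n_txn"),
    )
)
summary.write.mode("overwrite").parquet("s3://bucket/summary/")
```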
Posted 1 day ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
6+ years of experience with solution scoping, development and delivery with Winshuttle Studio, Composer and Foundation
- Should have architect-level experience in data management (integration/migration and governance)
- Should have experience in SAP implementations using Winshuttle as the data load tool into SAP S/4HANA
- Should have extensively worked on Winshuttle Transaction, Winshuttle Query and Winshuttle Direct
- Responsible throughout design, development, testing, loading and reconciliation
- Experience with SAP data migration tools, ETL and other data migration tools is an advantage
- Should be hands-on with development techniques as well
- Knowledge of data structures in SAP FI/SD/MM modules and their dependencies is a must
- Responsible for data migration documentation and cut-over planning
- Experience leading a team across different process areas, geographies, workforces and technical expertise
- Coordination with project team stakeholders: business/data owners, SAP functional teams, SAP ABAP/Basis teams, ETL teams, legacy data teams, etc.
- Should have rich experience in data transformation procedures, post-migration data validation procedures, data retention and data security procedures
Posted 1 day ago
5.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a skilled and motivated Workday Developer with 3+ years of experience and a strong background in Workday Studio, Core Connectors, EIBs, and web service APIs. The ideal candidate will be responsible for designing, building, and reviewing Workday integrations in collaboration with Workday Functional Leads and business SMEs. This role is critical in delivering scalable, secure, and efficient solutions aligned with evolving business needs.
Key Responsibilities:
- Understand business and functional requirements; design, build, test, and deploy Workday integrations using Studio, EIB, Core Connectors, and Workday APIs
- Analyze and translate functional specifications and change requests into detailed technical specifications
- Develop and maintain calculated fields, condition rules, business processes, custom reports, and security configurations
- Collaborate with functional leads and stakeholders to deliver end-to-end technical solutions
- Troubleshoot integration issues, perform root cause analysis, and provide ongoing production support
- Document system design, integration maps, deployment strategies, and change logs
- Assess the impact of Workday updates on existing integrations and configurations
- Support data migrations, audits, and compliance reporting activities
- Review and validate reports to ensure accuracy, security, and data privacy compliance
- Train end-users on Workday reporting tools and available dashboards
- Adhere to established standards, documentation protocols, and change management processes
- Ensure compliance with Workday security and governance policies in all development activities
Required Qualifications and Experience:
- Bachelor's degree in Computer Science, Information Systems, MIS, or a related field (preferred)
- 3+ years of hands-on Workday integration development experience
- Proficiency in Workday Studio, EIB, Core Connectors, Web Services (SOAP/REST), and XSLT
- Strong understanding of Workday HCM modules such as Payroll, Benefits, Time Tracking, and Compensation
- Experience with Workday Report Writer and calculated fields
- Proficiency in XML, XSLT, JSON, and data transformation technologies
- Ability to translate business needs into technical solutions
- Strong analytical and problem-solving skills with the ability to thrive in a fast-paced environment
Preferred Qualifications:
- Workday Integration Certification (Studio or Core Connectors)
- Experience working in Agile or Scrum-based development environments
- Exposure to Workday Prism, Extend, or People Experience
- Knowledge of HR business processes and compliance standards such as SOX and GDPR
Technical / Soft Skills:
- Quick learner with a strong aptitude for mastering report writing tools
- Familiarity with finance-related reporting
- Proficiency in Microsoft Office tools (Word, Excel, PowerPoint)
- Ability to consolidate data from multiple sources into comprehensive reports
- Strong problem-solving, troubleshooting, and analytical abilities
- Self-motivated with the ability to prioritize and execute tasks independently
- Demonstrated ability to meet deadlines and manage multiple priorities
- Excellent communication, presentation, and stakeholder management skills
Work Environment:
- Hybrid work setup with 3-4 days required in the office each week
- Collaborative culture with continuous professional development and learning opportunities
Posted 1 day ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.
We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.
Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance
Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts
Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity
NOTE: Pay will vary by project and typically is up to Rs.
Shape the future of AI with Soul AI!
Posted 1 day ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.
We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.
Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance
Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts
Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity
NOTE: Pay will vary by project and typically is up to Rs.
Shape the future of AI with Soul AI!
Posted 1 day ago
2.0 - 5.0 years
5 - 9 Lacs
Mumbai
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.
We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.
Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance
Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts
Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity
NOTE: Pay will vary by project and typically is up to Rs.
Shape the future of AI with Soul AI!
Posted 1 day ago
2.0 - 5.0 years
5 - 9 Lacs
Kolkata
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.
We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.
Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance
Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts
Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity
NOTE: Pay will vary by project and typically is up to Rs.
Shape the future of AI with Soul AI!
Posted 1 day ago
9.0 - 14.0 years
17 - 30 Lacs
Noida
Hybrid
We are looking for a skilled Telecom Billing Mediation Specialist to manage and optimize the mediation process between network elements and the postpaid billing system. Connect with me over LinkedIn: https://www.linkedin.com/in/nitin-tushir-abc0048/
The ideal candidate will have a strong background in telecom mediation platforms, CDR (Call Detail Record) processing, billing integration, and data transformation. This role involves ensuring seamless data collection, processing, and delivery to downstream billing and revenue management systems.
What you will do:
Mediation System Management:
- Configure, monitor, and troubleshoot mediation systems for postpaid billing
- Ensure accurate and timely collection, aggregation, and transformation of CDRs from multiple network elements
- Implement rules for data filtering, deduplication, and enrichment before sending to the billing system (a sketch follows this listing)
Integration & Optimization:
- Work with network, IT, and billing teams to ensure smooth integration between mediation and billing platforms
- Optimize mediation rules to handle high-volume CDR processing efficiently
- Perform data reconciliation between network elements, mediation, and billing systems
Issue Resolution & Performance Monitoring:
- Investigate and resolve discrepancies in mediation and billing data
- Monitor system health, troubleshoot issues, and ensure high availability of mediation services
- Conduct root cause analysis (RCA) for mediation-related issues and implement corrective actions
Compliance & Reporting:
- Ensure adherence to regulatory, audit, and revenue assurance requirements
- Generate reports on mediation performance, errors, and processed CDR volumes
- Support fraud management and revenue assurance teams by providing mediation-related insights
You will bring:
Technical Skills:
- Hands-on experience with billing mediation platforms (e.g. Amdocs Mediation, IBM, HP, Openet, etc.)
- Proficiency in SQL, Linux/Unix scripting, and data transformation tools
- Strong understanding of CDR structures and mediation rules configuration
- Familiarity with ETL processes, data parsing, and API integrations
Domain Knowledge:
- Solid understanding of telecom postpaid billing systems (e.g., Amdocs, HP, Oracle BRM)
- Knowledge of network elements (MSC, MME, SGSN, GGSN, PCRF, OCS, IN) and their impact on mediation
- Awareness of revenue assurance and fraud detection in telecom billing
Soft Skills:
- Strong problem-solving and analytical skills
- Ability to work in a cross-functional team and communicate effectively with stakeholders
Key Qualifications:
- Bachelor's degree in Computer Science, ECE, or Telecommunications
- 10+ years of experience in telecom billing mediation
- Experience in cloud-based mediation solutions (AWS, Azure, GCP) is a plus
- Knowledge of 5G mediation and real-time charging architectures is an advantage
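The filter/deduplicate/enrich pipeline stage described above can be sketched in a few lines of pandas. This is illustrative only: real mediation platforms apply these rules in streaming engines, and the CDR field names and rating rule (per-minute ceiling billing) are invented assumptions.

```python
import pandas as pd

# Toy CDR batch; field names follow common CDR conventions but are illustrative.
cdrs = pd.DataFrame({
    "call_id":    ["c1", "c1", "c2", "c3"],
    "msisdn":     ["9100000001", "9100000001", "9100000002", "9100000003"],
    "start_time": pd.to_datetime(
        ["2024-05-01 10:00:00"] * 2
        + ["2024-05-01 10:05:00", "2024-05-01 10:07:00"]
    ),
    "duration_s": [120, 120, 30, 0],
})

# Deduplication: network elements often emit the same CDR twice after failover.
deduped = cdrs.drop_duplicates(subset=["call_id", "start_time", "duration_s"])

# Filtering: drop zero-duration records that should not be rated.
billable = deduped[deduped["duration_s"] > 0].copy()

# Enrichment: derive a rating unit (billed minutes, rounded up) before handoff to billing.
billable["billed_minutes"] = -(-billable["duration_s"] // 60)  # ceiling division

print(billable)
```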
Posted 1 day ago
4.0 - 6.0 years
3 - 5 Lacs
Bengaluru
Work from Office
Job Overview: We are looking for an individual to join the Middle Office - Calypso Platform team as a System Analyst. The person should have a clear understanding of the system and its functionality, with real hands-on experience performing setups to onboard clients/funds for trading, settlement and PnL report generation (non-operations). This role requires a solid understanding of capital markets across both exchange-traded and OTC markets. The ideal candidate should have at least 4 years of hands-on experience with platforms, preferably Calypso, and a total experience not exceeding 12 years. The role involves supporting various Middle Office functions, particularly funds onboarding, while collaborating closely with Client Service Managers in regions including Europe, Asia, and the US.
Key Responsibilities:
1. Provide daily support for the Calypso platform, assisting Middle Office (MO) users.
2. Manage fund onboarding tasks and related responsibilities.
3. Help develop Standard Operating Procedures (SOPs) for Operations.
4. Track and resolve issues related to the platform in a timely manner.
5. Assist with tasks related to trade booking, allocations, matching, settlement, and related activities.
6. Engage with both Front Office and Back Office setup, ensuring smooth workflow and message handling.
7. Collaborate with the team to ensure timely and accurate resolution of market data, trade processing, and scheduled tasks.
8. Work with tools such as Jira, Confluence, and Excel (including VBA, macros, etc.) for process optimization and tracking.
Required Skills & Experience:
1. 4-12 years of experience in financial services, with direct experience using the Calypso platform or a similar system for FO and BO setups.
2. Expertise in asset classes such as commodities, equity derivatives, credit derivatives, exotic structures, fixed income, futures, FX (foreign exchange), FX options, interest rate derivatives, and money markets.
3. Knowledge of financial products including bonds, repo, TRS, equity swaps, CDS, CDX, futures, options, and equities, along with their confirmation, settlement, and P&L tracking.
4. Strong data transformation and analysis skills.
5. Proficiency in advanced Excel functions, including VBA and macros.
6. Excellent problem-solving abilities with critical and objective thinking.
7. Outstanding communication and interpersonal skills to work effectively with teams across different regions.
Qualifications:
1. Postgraduate degree in Commerce, MBA in Finance, or professional qualifications like CA/CMA/CFA.
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values diverse voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from diverse backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
Pyramid overview: A role with Target Data Science & Engineering means the chance to help develop and manage state-of-the-art predictive algorithms that use data at scale to automate and optimize decisions. Whether you join our Statistics, Optimization or Machine Learning teams, you'll be challenged to harness Target's impressive data breadth to build the algorithms that power solutions our partners in Marketing, Supply Chain Optimization, Network Security and Personalization rely on.
Team Overview: Global Supply Chain and Logistics at Target is evolving at an incredible pace. We are constantly reimagining how we get the right product to the guests better, faster and more cost effectively than before across 1,900 locations. Our Supply Chain Data Science team oversees the development of state-of-the-art mathematical techniques to help solve important problems for Target's Supply Chain, e.g. identifying the optimal quantities and positioning of inventory across multiple channels and locations, planning for the right mix of inventory investments vs guest experience, digital order fulfillment planning, transportation resource planning, etc.
Position Overview: As a Senior Data Scientist in the Digital Fulfillment Planning space, you will get the opportunity to design, develop, deploy, and maintain data science models and tools. You'll work closely with applied data scientists, data analysts and business partners to continuously learn and understand evolving business needs. You'll also collaborate with engineers and data scientists on peer teams to build and productionize fulfillment solutions for our supply chain/logistics needs. In this role, you will:
- Develop a strong understanding of business and operational processes within Target's supply chain.
- Develop an in-depth understanding of the various systems and processes that influence digital order fulfillment speed and costs.
- Develop optimization-based solutions, approximate mathematical models (probabilistic/deterministic) of real-world phenomena, and predictive models, and implement them in real production systems with measurable impact.
- Analyze large datasets for insights leading to business process improvements or solution development.
- Work with the team to build and maintain complex software systems and tools.
- Add new capabilities and features to the simulation framework to reflect the complexities of an evolving digital fulfillment network.
- Develop and deploy modules to run simulations for testing and validating multiple scenarios to evaluate the impact of various fulfillment strategies.
- Enhance and maintain the simulation environment to enable testing/deploying new features for running custom, user-defined scenarios through the simulator.
- Adopt modular architecture and good software development/engineering practices to enhance overall product performance and guide other team members.
- Produce clean, efficient code based on specifications.
- Coordinate the analysis, troubleshooting and resolution of issues in the models and software.
Ensure your Target Career Profile is up to date before applying for a position.
About You:
Must Haves:
- Bachelor's/Master's/PhD in Mathematics, Statistics, Operations Research, Industrial Engineering, Computer Science, or a related quantitative field.
- 3+ years of direct, hands-on experience building optimization models (e.g., linear, mixed-integer, network flow, or combinatorial); a minimal example follows this listing.
- Strong proficiency in Python and Spark for developing and deploying optimization-based solutions.
- Solid understanding of operations research techniques and their application to real-world business problems.
- Demonstrated experience working on end-to-end solution delivery, preferably with production-grade implementation.
- Strong verbal and written communication skills, able to translate technical detail into business insight, and vice versa.
- Strong analytical thinking, data transformation, and problem-solving ability, especially under ambiguity.
- Team player with the ability to collaborate effectively across cross-functional, geographically distributed teams.
Preferred Experience:
- Experience building large-scale optimization models / ability to build models with smart heuristic solutions.
- Familiarity with supply chain or e-commerce fulfillment data and business processes.
- Experience working with large datasets.
- Experience deploying solutions at scale for business impact.
- Background in clean code practices, version control (Git), and collaborative development environments.
Know More here: Life at Target - https://india.target.com/ Benefits - https://india.target.com/life-at-target/workplace/benefits Culture - https://india.target.com/life-at-target/belonging
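As a minimal illustration of the optimization-model requirement above, here is a toy fulfillment assignment solved as a linear program with scipy.optimize.linprog. The costs and capacities are invented for the example; real fulfillment models are far larger and typically mixed-integer.

```python
from scipy.optimize import linprog

# Decision variables x[i][j]: fraction of order i fulfilled from node j,
# flattened as [x00, x01, x10, x11]. Per-pair shipping costs are invented.
cost = [4.0, 7.0, 6.0, 3.0]

# Each order must be fully fulfilled: x_i0 + x_i1 == 1.
A_eq = [
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]
b_eq = [1, 1]

# Capacity: node 0 can take at most 1 order-equivalent in this toy instance.
A_ub = [[1, 0, 1, 0]]
b_ub = [1]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
print("total cost:", res.fun)          # expect 4 + 3 = 7
print("assignment:", res.x.round(2))   # order 0 -> node 0, order 1 -> node 1
```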
Posted 2 days ago
3.0 - 7.0 years
5 - 9 Lacs
Kolkata, Hyderabad, Pune
Work from Office
Job Title: ETL QA Tester
Job Summary: We are looking for an experienced ETL tester to ensure the quality and integrity of our data processing and reporting systems. The ideal candidate will have a strong background in ETL processes and data warehousing, and experience with Snowflake and Tableau. This role involves designing and executing test plans, identifying and resolving data quality issues, and collaborating with development teams to enhance data processing systems.
Key Responsibilities:
- Design, develop, and execute comprehensive test plans and test cases for ETL processes
- Validate data transformation, extraction, and loading processes to ensure accuracy and integrity (a pytest-style sketch follows this listing)
- Perform data validation and data quality checks using Snowflake and Tableau
- Identify, document, and track defects and data quality issues
- Collaborate with developers, business analysts, and stakeholders to understand requirements and provide feedback on data-related issues
- Create and maintain test data, test scripts, and test environments
- Generate and analyze reports using Tableau to validate data accuracy and completeness
- Conduct performance testing and optimization of ETL processes
- Develop and maintain automated testing scripts and frameworks for ETL testing
- Ensure compliance with data governance and security standards
Location: Pune, Hyderabad, Kolkata, Chandigarh
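Automated ETL checks of the kind this role asks for are often expressed as a pytest suite. This is a hedged sketch: the snowflake-sqlalchemy URL format, table names, and the three rules (non-null keys, row-count parity, amount reconciliation) are illustrative assumptions to adapt to the actual warehouse.

```python
import pytest
from sqlalchemy import create_engine, text

# Placeholder Snowflake connection; credentials would come from env vars or a vault.
engine = create_engine("snowflake://user:pass@account/db/schema?warehouse=wh")

def scalar(sql: str):
    """Run a single-value query and return the scalar result."""
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()

def test_no_null_keys():
    # Transformation rule: every fact row must carry a surrogate key.
    assert scalar("SELECT COUNT(*) FROM fact_sales WHERE sale_key IS NULL") == 0

def test_row_counts_match_staging():
    # Load completeness: the target should carry every staged row.
    assert scalar("SELECT COUNT(*) FROM fact_sales") == scalar("SELECT COUNT(*) FROM stg_sales")

def test_amounts_reconcile():
    # Value-level check: totals agree to the cent after transformation.
    diff = scalar("""
        SELECT ABS((SELECT SUM(amount) FROM fact_sales)
                 - (SELECT SUM(amount) FROM stg_sales))
    """)
    assert diff < 0.01
```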
Posted 2 days ago
3.0 - 8.0 years
8 - 11 Lacs
Bengaluru
Work from Office
Education Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.
Position Description: Experience in the design and implementation of data pipelines using DataStage, with strong SQL and decent Unix knowledge. Should be able to:
- create, review and understand technical and requirement documents
- participate in walkthrough reviews of technical specifications, programs, code and unit testing
- plan design and implementation, ensuring solution quality
- estimate the tasks/time required to perform design, coding and unit testing
- ensure the highest quality standards are followed in ETL pipeline designs and follow best practices
- perform detailed technical analysis, coding, validation and documentation of new features
- perform data transformations, analysis, performance tuning and optimization, etc.
Required qualifications to be successful in this role:
Must-Have Skills:
- Strong analytical and problem-solving skills
- ETL developer (IBM DataStage)
- Additional skills: Teradata, DB2, SQL and Unix (optional: SAS SIS)
- Excellent communication and collaboration abilities
- Ability to work in a fast-paced, dynamic environment with minimal supervision
- Attention to detail and a commitment to data accuracy
Good-to-Have Skills:
- Knowledge of SDLC (mostly Agile), data warehousing techniques, data modeling, Git, CI/CD pipelines, etc.
CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.
Life at CGI: It is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Skills: Data Migration, ETL, Teradata, Unix
Posted 2 days ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Microsoft Azure Data Services, Microsoft Dynamics AX (Technical)
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: We are looking for a Team Lead - Migration Engineer with deep expertise in Microsoft Dynamics 365 Finance & Operations (D365 F&O) to drive successful data migration, system upgrades, and platform transitions. The ideal candidate will be responsible for leading migration projects, ensuring data integrity, optimizing migration processes, and collaborating with teams to ensure seamless transitions.
Key Responsibilities:
- Lead end-to-end migration projects for transitioning legacy systems to Microsoft Dynamics 365 F&O
- Develop migration strategies, roadmaps, and execution plans for seamless data transfer
- Design and implement ETL processes to ensure accurate and efficient data migration
- Collaborate with technical teams to configure data mappings, transformations, and validation rules
- Ensure compliance with Microsoft best practices for data migration and security
- Conduct data integrity checks, validation, and reconciliation post-migration
- Provide guidance and mentorship to a team of migration engineers, developers, and consultants
- Troubleshoot migration-related issues and optimize performance for large-scale data transfers
- Stay updated with the latest D365 F&O upgrades, tools, and methodologies related to data migration
Required Skills & Qualifications:
- Proven experience in Microsoft Dynamics 365 F&O migration projects
- Strong knowledge of data architecture, ETL tools, and integration frameworks
- Expertise in SQL, Azure Data Factory, and other data transformation tools
- Hands-on experience with data cleansing, mapping, validation, and reconciliation
- Ability to lead and manage teams in large-scale migration projects
- Excellent analytical and problem-solving skills
- Strong communication and stakeholder management abilities
Preferred Certifications:
- Microsoft Certified: Dynamics 365 Finance Functional Consultant Associate
- Microsoft Certified: Dynamics 365 Data Migration & Integration Specialist
- Microsoft Certified: Azure Data Engineer Associate
Posted 3 days ago
8.0 - 13.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must have skills: SAS Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
- Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework (a sketch of such a translation follows this listing)
- Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices
- Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies
- Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines
- Collaborate with data analysts, scientists, and engineers to validate and test converted workflows
- Optimize the performance of new Python workflows and ensure data quality and consistency
- Document migration processes, coding standards, and pipeline configurations
- Integrate new pipelines with Google Cloud Platform as required
- Provide guidance and support for testing, validation, and production deployment
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering
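The SAS-to-Python migration this role describes often means re-expressing DATA-step logic in pandas. Below is a minimal sketch; the SAS block in the comment is a hypothetical example of the legacy job, and the file paths, columns, and threshold are placeholders.

```python
import pandas as pd

# Hypothetical SAS data-preparation step being migrated:
#   DATA work.scored;
#     SET raw.customers;
#     WHERE active = 1;
#     IF balance > 50000 THEN tier = 'GOLD';
#     ELSE tier = 'STANDARD';
#   RUN;
customers = pd.read_parquet("customers.parquet")  # source path is a placeholder

scored = (
    customers
    .loc[customers["active"] == 1]                      # WHERE clause
    .assign(tier=lambda df: df["balance"].gt(50_000)    # IF/THEN tiering
            .map({True: "GOLD", False: "STANDARD"}))
)
scored.to_parquet("scored.parquet")
```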
Posted 3 days ago
8.0 - 13.0 years
14 - 19 Lacs
Coimbatore
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must have skills: SAS Base & Macros
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
- Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework
- Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices
- Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies
- Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines
- Collaborate with data analysts, scientists, and engineers to validate and test converted workflows
- Optimize the performance of new Python workflows and ensure data quality and consistency
- Document migration processes, coding standards, and pipeline configurations
- Integrate new pipelines with Google Cloud Platform as required
- Provide guidance and support for testing, validation, and production deployment
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering
Posted 3 days ago