
16305 Flow Jobs - Page 18

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Us
We are brand builders who focus our passion and creativity to build Calvin Klein and TOMMY HILFIGER into the most desirable lifestyle brands in the world, and at the same time position PVH as one of the best-performing brand groups in our sector. Guided by our values and enabled by our scale and global reach, we are driving fashion forward for good, as one team with one vision and one plan. That's the Power of Us, that's the Power of PVH+. One of PVH's greatest strengths is our people. Our collective desire is to create a workplace environment where every individual is valued and every voice is heard, and we are committed to fostering an inclusive and diverse community of associates with a strong sense of belonging. Learn more about Inclusion & Diversity at PVH.

Position Summary
The Billings and Collections Analyst is responsible for supporting the end-to-end billing process and performing collection activities to ensure timely receipt of customer payments. This role combines accurate and timely invoice approval/release with proactive collection follow-up to support cash flow and customer satisfaction. The analyst works closely with internal teams and customers to resolve billing discrepancies and reduce past-due balances.

Primary Responsibilities/Accountabilities of the Job
- Review and release customer orders based on credit limits.
- Request pro-forma invoices to send to cash-in-advance customer accounts.
- Perform collection activities on assigned customer accounts via email communication.
- Track, document, and follow up on past-due balances and payment commitments.
- Investigate and resolve billing issues and short payments in coordination with internal teams.
- Participate in month-end and quarter-end reporting for billing status and aging analysis.
- Assist with customer account reconciliations and respond to external and internal queries.
- Adhere to company billing and AR policies, including compliance with internal controls.

Supervisory Responsibilities
Direct: n/a
Indirect: n/a

Budgetary Responsibilities
No direct budget ownership. Indirect impact on working capital and Days Sales Outstanding (DSO) through effective collections performance.

Decision Making
- Decide when to initiate reminder communications or escalate unresolved invoices to the appropriate parties.
- Determine whether billing discrepancies require corrections or escalation to leadership.
- Recommend prioritization of follow-up actions based on aging and risk indicators.

Resourcefulness/Creativity
- Demonstrate initiative in identifying missing documentation or system discrepancies.
- Leverage ERP tools and customer systems to investigate invoice or payment issues.
- Adapt flexibly to dynamic customer behaviors and internal requests.
- Manage multiple billing formats and customer requirements in a high-volume environment.

Environment
Hybrid work model. Flexibility to work outside standard hours periodically to support global counterparts.

Qualifications & Experience
Experience: 0-3 years of experience in collections or accounts receivable. Familiarity with ERP systems such as SAP, Oracle, or equivalent is preferred.
Education: Bachelor's degree in Accounting, Finance, or a related field.
Skills:
- Working knowledge of Microsoft Excel.
- Ability to work effectively across time zones and international teams.
- Strong communication skills.
- Attention to detail and ability to work independently as well as in a team.

PVH Corp. or its subsidiary ("PVH") is an equal opportunity employer and considers all applicants for employment on the basis of their individual capabilities and qualifications without regard to race, ethnicity, color, sex, gender identity or expression, age, religion, national origin, citizenship status, sexual orientation, genetic information, physical or mental disability, military status or any other characteristic protected under federal, state or local law. In addition to complying with all applicable laws, PVH is also committed to ensuring that all current and future PVH associates are compensated solely on job-related factors such as skill, ability, educational background, work quality, experience and potential.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Data Engineer
Location: Bangalore

About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.

The Opportunity
As a Data Engineer on our newly formed Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. You will play a critical role in the implementation of Data Warehousing and Data Lake solutions. You will have the opportunity to make a meaningful impact on FICO's platform by infusing it with next-generation AI capabilities. You'll work with a dedicated team, leveraging your data engineering skills to build solutions and drive innovation forward.

What You'll Contribute
- Perform hands-on analysis, technical design, solution architecture, prototyping, proofs-of-concept, development, unit and integration testing, debugging, documentation, deployment/migration, updates, maintenance, and support on Data Platform technologies.
- Design, develop, and maintain robust, scalable data pipelines for batch and real-time processing using modern tools like Apache Spark, Kafka, Airflow, or similar.
- Build efficient ETL/ELT workflows to ingest, clean, and transform structured and unstructured data from various sources into a well-organized data lake or warehouse.
- Manage and optimize cloud-based data infrastructure on platforms such as AWS (e.g., S3, Glue, Redshift, RDS) or Snowflake.
- Collaborate with cross-functional teams to understand data needs and deliver reliable datasets that support analytics, reporting, and machine learning use cases.
- Implement and monitor data quality, validation, and profiling processes to ensure the accuracy and reliability of downstream data.
- Design and enforce data models, schemas, and partitioning strategies that support performance and cost-efficiency.
- Develop and maintain data catalogs and documentation, ensuring data assets are discoverable and governed.
- Support DevOps/DataOps practices by automating deployments, tests, and monitoring for data pipelines using CI/CD tools.
- Proactively identify data-related issues and drive continuous improvements in pipeline reliability and scalability.
- Contribute to data security, privacy, and compliance efforts, implementing role-based access controls and encryption best practices.
- Design scalable architectures that support FICO's analytics and decisioning solutions.
- Partner with Data Science, Analytics, and DevOps teams to align architecture with business needs.

What We're Seeking
- 7+ years of hands-on experience as a Data Engineer working on production-grade systems.
- Proficiency in programming languages such as Python or Scala for data processing.
- Strong SQL skills, including complex joins, window functions, and query optimization techniques.
- Experience with cloud platforms such as AWS, GCP, or Azure, and relevant services (e.g., S3, Glue, BigQuery, Azure Data Lake).
- Familiarity with data orchestration tools like Airflow, Dagster, or Prefect.
- Hands-on experience with data warehousing technologies like Redshift, Snowflake, BigQuery, or Delta Lake.
- Understanding of stream processing frameworks such as Apache Kafka, Kinesis, or Flink is a plus.
- Knowledge of data modeling concepts (e.g., star schema, normalization, denormalization).
- Comfortable working in version-controlled environments using Git and managing workflows with GitHub Actions or similar tools.
- Strong analytical and problem-solving skills, with the ability to debug and resolve pipeline and performance issues.
- Excellent written and verbal communication skills, with an ability to collaborate across engineering, analytics, and business teams.
- Demonstrated technical curiosity and passion for learning, with the ability to quickly adapt to new technologies, development platforms, and programming languages as needed.
- Bachelor's degree in Computer Science or a related field.
- Exposure to MLOps pipelines (MLflow, Kubeflow, feature stores) is a plus but not mandatory.
- Engineers with certifications will be preferred.

Our Offer to You
- An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
- The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
- Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
- An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us
Othain Group is a global IT and BP solutions and services company. The Group's main focus is the business process and technology management space, offering a broad portfolio of industry-specific services. With deep process knowledge and insights, focused IT capabilities, targeted analytics and pragmatic re-engineering, the company delivers a comprehensive client solution. Othain Group believes in delivering extraordinary customer care and solutions to customers and clients. Each contact with the customer is seen as an opportunity to enhance the relationship and create value for the customer.

We are looking for a competent Collection Agent to contact clients and collect outstanding payments. You will strike a balance between maintaining trustful relationships and ensuring timely payments. Our Collection Agent should exhibit professionalism and trustworthiness. You should have excellent communication and negotiation skills, as well as an ability to work independently.

Job Location: Hyderabad (work from office)
Job Timing: 5:30 PM to 2:30 AM IST
Experience: 2-6 years

Responsibilities
- Ensure that all billings are received by clients in a timely and proper manner.
- Maintain current credit information, current collection status, and follow-up notes on all assigned accounts.
- Escalate disputed invoices or issues in a timely manner to the appropriate individual.
- Ensure cash flow through efficient collections and by performing credit analysis, including analysis of the relevant reports and financial statements, to maximise profits.
- Analyse customer payments for cash application personnel and ensure that cash received from payments is correctly allocated.
- Monitor the aging of clients' transactions.
- Determine accounts with accumulating overdues and take the necessary steps to collect dues immediately.
- Maintain an accurate and up-to-date record of clients' open accounts.
- Identify the reason for a credit hold and the justification for lifting it.
- Ensure that all necessary follow-ups and coordination with the client and other departments in the company have been done to resolve collection issues (wrong billing, no PO, wrong discount, long-overdue transactions, lost invoices, and the like).
- Perform other collection-related functions as the need arises.
- Issue weekly reminder emails and phone customers as required, and maintain the customer contact database.
- Communicate effectively with customers for collections and recovery.

Job Skills & Qualifications
- Graduate/Postgraduate in Commerce/Finance.
- A minimum of 2-3 years of working experience in a Credit Control & Collections environment in a service organization with customer bases.
- Working knowledge of ERP (Dynamics 365, Bectran, GetPaid) will be an added advantage.
- Knowledgeable in basic accounting.
- Computer skills and familiarity with Excel and MS Office applications.
- Good communication skills, both written and verbal.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview
In scope of position-based promotions (INTERNAL only)
Job Title: Accounting Control Senior Analyst
Corporate Title: Associate
Location: Pune, India

Role Description
Deutsche Bank's Finance division oversees the financial performance of the Bank. We advise senior management on the financial performance of all areas of the bank. Finance is also involved in initiatives to help lower costs, manage risk and improve performance. We deliver information to our shareholders, creditors, tax authorities, regulatory authorities, and auditors.

The Associate - Accounting Close plays a pivotal role between their team, Senior Management and internal stakeholders. This role, specifically for the India region, has oversight responsibility for the integrity (completeness and accuracy) of the financial statements of identified legal entities of DB. This requires collaboration with the Regional Finance teams and lines-of-business controllers that have activity in each division / legal entity, to ensure the implications of transactions for those divisions / legal entities (e.g., accounting policy, regulatory reporting, and capital) are considered.

Overview
The Accounting Close role encompasses the following key functions:
- Closing and financial reporting, including complex disclosures
- Understanding of legal entity financial information
- Consolidation of financials as per Group Reporting policy
- Maintaining SOX documentation
- Performance of the Management Review Process (MRP)
- Understanding the complete end-to-end process flow from month-end journals onwards and identifying exceptions
- Financial analysis
- Managing the day-to-day relationship with key stakeholders across locations to ensure a strong working partnership and build a collaborative model
- Providing updates to Senior Management on the production process and highlighting key risks in a timely manner
- Support for regulatory reviews and audits
- Driving change, i.e. engagement in redesign of systems and processes
- Looking at simplification & standardization of processes
- Knowledge of IFRS and upcoming changes in IFRS and the implications thereof
- Managing internal & external audits
- Working closely with Regional Finance and Sourcing teams

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your Key Responsibilities
Functional responsibilities:
- Independently manage the accounting and reporting for certain legal entities (including the validation and control function for monthly financials and IFRS reporting to Head Office)
- Responsible for balance sheet account substantiation on a monthly basis
- Support external audit work on an annual basis; also assist the regulatory filings / assessment process related to income tax, transfer pricing and service tax / GST
- Support internal audit work as and when required
- Support other financial reporting requirements such as risk forums, ALCO and Board meetings by understanding stakeholders' requirements, preparing applicable decks, etc.
- Coordination, preparation and submission of cash flows at regular frequency for various requirements
- Lead, from a Finance perspective, any process review / re-design pertaining to such entities necessitated by a process improvement idea or regulatory requirement
- Work in close coordination with Finance Service Centres at Pune and Manila
- Ability to develop and leverage relationships with multiple teams within the organization to deliver on business goals
- Working knowledge of SAP and ability to understand the complex system architecture and platforms
- Prior experience of process migrations would be an added advantage
- The position is based at Pune and requires travel within India

Your Skills and Experience
The candidate must be a highly motivated and high-performing individual. The candidate must be able to handle all levels of complexity in their product coverage or area under control, be able to multi-task with relative ease, be flexible in shifting workload in accordance with changing priorities, and be comfortable dealing with a sometimes stressful, fast-paced, month-end priority-driven environment. The candidate is expected to have demonstrated experience of working with multiple teams in a matrix organization.
- 5+ years of working experience (preferably in Finance teams of banks, securities firms, investment banks or professional accounting / audit firms, or in a similar capacity in a BPO / KPO centre)
- Knowledge of trading products, their valuations and control processes is preferred
- Prior experience in a controllership role would be highly valuable

Soft Skills
Communication
- Ability to communicate effectively (oral & written)
Analytical Abilities
- Displays a high degree of control awareness
- Attention to detail and big-picture view
- Strong analytical / business problem-solving skills
Time Management Skills
- Well organized and able to logically present results of work
- Ability to work under pressure and to deadlines
- Ability to manage own time
Drive and Motivation
- Passion for change
- Drives process improvement
- Diligent, thorough, shows initiative and is proactive
- Ability to challenge the status quo
People Management
- Ability to coach and mentor the team

Education / Certification
Qualified Accountant - CA/CPA/ACCA/MBA or Postgraduate in Commerce

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 day ago

Apply

7.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview
Job Title: QA - Manual Tester, AS
Location: Pune, India

Role Description
This role is within the DWS Global Technology team and will interact with the various business groups globally, e.g. Global Client Group, Trading, Risk, Compliance and Finance. The role holder will be responsible for delivering technology projects, with a focus on DWS strategic projects and changes driven by upcoming regulatory milestones.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your Key Responsibilities
- As a Tester, understand the business requirements and create comprehensive test cases reflecting the end user's experience; conduct functional, regression and end-to-end testing.
- Execute the full SDLC process.
- Develop software verification plans and quality assurance procedures.
- Develop and execute test scenarios, scripts and procedures for unit, system, integration, functional and acceptance testing.
- Drive continuous improvement in testing processes and methodologies.
- Monitor testing progress and provide regular status updates to stakeholders.

Your Skills and Experience
- 7-8 years of QA and testing experience.
- Asset Management domain knowledge.
- Good understanding of QA principles.
- Experience in running effective QA reviews.
- Knowledge of the banking domain, especially Asset Management, preferred.
- Ability to partner with Business Analysts to interpret functional specifications.
- Ability to multi-task in a high-pressure environment.
- Ability to pick up new product and technical knowledge quickly.
- Ability to quickly learn new and complex processes.
- Responsible for identifying, escalating and resolving project issues to achieve smooth process flow.
- Excellent communication skills; fluent in English (written and verbal).

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

Anklesvar, Gujarat, India

On-site


Key Responsibilities
A Plant Operator in a chemical plant plays a crucial role in ensuring the safety and reliability of industrial processes. Here is a detailed job description for this role:
- Operate the plant as per SOP.
- Adjust controls and equipment to regulate temperature, pressure, and flow rate, and to manage chemical reactions.
- Conduct routine inspections of equipment and systems to detect any malfunctions and perform necessary maintenance.
- Test samples of raw materials or finished products to ensure they meet quality and safety standards.
- Record data from operations, process conditions, and laboratory results for production logs and compliance reports.
- Collaborate with other team members, such as chemists, engineers, and quality control, to optimize production efficiency and safety.
- Maintain a disciplined and safe working environment by ensuring that all safety procedures are followed and practiced.

Qualifications
Education: Diploma in Chemical Engineering
Experience: 3 to 5 years
Skills: Communication skills, English language skills, and the ability to operate a basic computer

Posted 1 day ago

Apply

7.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Key Responsibilities
Automation & Functional Testing:
- Design, develop, and maintain automated test scripts using Selenium and other relevant tools.
- Ensure test coverage across all layers: UI, API, database, and backend systems.
- Execute functional, regression, and integration tests to validate application features.
Performance Testing:
- Utilize JMeter to simulate load, stress, and scalability testing for applications.
- Analyze performance bottlenecks and collaborate with development teams to optimize system efficiency.
API Testing:
- Leverage Postman for testing RESTful APIs and validating JSON responses.
- Develop reusable scripts for automated API testing using tools like RestAssured or similar frameworks.
Database Validation:
- Write SQL queries to validate data integrity, perform database testing, and ensure proper data flow across systems.
Unix Shell Scripting:
- Create and execute Unix shell scripts for log analysis, data processing, and automation tasks.
Test Planning & Strategy:
- Develop comprehensive test plans, strategies, and scenarios to ensure high-quality deliverables.
- Identify gaps in test coverage and proactively address them with innovative solutions.
Defect Management & Reporting:
- Troubleshoot defects, perform root cause analysis (RCA), and manage the defect lifecycle within tools like Jira.
- Provide detailed and actionable bug reports to development teams.
Collaboration & Leadership:
- Work closely with product managers, developers, and stakeholders to understand requirements and define test objectives.
- Mentor junior QA engineers, fostering a culture of quality and continuous improvement.
Documentation:
- Maintain clear documentation for test cases, test results, and automation scripts.
- Ensure traceability between requirements, test cases, and defects.

Required Qualifications
Experience:
- Minimum 7-10 years of experience in QA engineering, with strong expertise in automation and performance testing.
- Proven experience with the following tools and technologies: Selenium for UI automation, JMeter for performance testing, Postman for API testing, SQL for database validation, and Unix shell scripting for automation and debugging.
Skills:
- Proficiency in JSON for validating API responses and data structures.
- Strong knowledge of QA methodologies, testing techniques, and SDLC phases.
- Experience with version control systems like Git.
- Familiarity with CI/CD pipelines and tools like Jenkins.
- Understanding of Agile frameworks and DevOps principles.
- Ability to debug code and identify defects in collaboration with developers.

Preferred Skills (Nice to Have)
- Familiarity with cloud platforms (e.g., Azure) for deploying and testing applications.
- Knowledge of containerization tools like Docker and orchestration systems like Kubernetes.
- Experience with performance monitoring tools (e.g., New Relic, Dynatrace).
- Exposure to advanced scripting languages like Python or Java for test automation.
- Experience with Kafka for messaging systems and data validation.

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Telangana, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Data Engineer

About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.

The Opportunity
As a Data Engineer on our newly formed Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. You will play a critical role in the implementation of Data Warehousing and Data Lake solutions. You will have the opportunity to make a meaningful impact on FICO's platform by infusing it with next-generation AI capabilities. You'll work with a dedicated team, leveraging your data engineering skills to build solutions and drive innovation forward.

What You'll Contribute
- Perform hands-on analysis, technical design, solution architecture, prototyping, proofs-of-concept, development, unit and integration testing, debugging, documentation, deployment/migration, updates, maintenance, and support on Data Platform technologies.
- Design, develop, and maintain robust, scalable data pipelines for batch and real-time processing using modern tools like Apache Spark, Kafka, Airflow, or similar.
- Build efficient ETL/ELT workflows to ingest, clean, and transform structured and unstructured data from various sources into a well-organized data lake or warehouse.
- Manage and optimize cloud-based data infrastructure on platforms such as AWS (e.g., S3, Glue, Redshift, RDS) or Snowflake.
- Collaborate with cross-functional teams to understand data needs and deliver reliable datasets that support analytics, reporting, and machine learning use cases.
- Implement and monitor data quality, validation, and profiling processes to ensure the accuracy and reliability of downstream data.
- Design and enforce data models, schemas, and partitioning strategies that support performance and cost-efficiency.
- Develop and maintain data catalogs and documentation, ensuring data assets are discoverable and governed.
- Support DevOps/DataOps practices by automating deployments, tests, and monitoring for data pipelines using CI/CD tools.
- Proactively identify data-related issues and drive continuous improvements in pipeline reliability and scalability.
- Contribute to data security, privacy, and compliance efforts, implementing role-based access controls and encryption best practices.
- Design scalable architectures that support FICO's analytics and decisioning solutions.
- Partner with Data Science, Analytics, and DevOps teams to align architecture with business needs.

What We're Seeking
- 7+ years of hands-on experience as a Data Engineer working on production-grade systems.
- Proficiency in programming languages such as Python or Scala for data processing.
- Strong SQL skills, including complex joins, window functions, and query optimization techniques.
- Experience with cloud platforms such as AWS, GCP, or Azure, and relevant services (e.g., S3, Glue, BigQuery, Azure Data Lake).
- Familiarity with data orchestration tools like Airflow, Dagster, or Prefect.
- Hands-on experience with data warehousing technologies like Redshift, Snowflake, BigQuery, or Delta Lake.
- Understanding of stream processing frameworks such as Apache Kafka, Kinesis, or Flink is a plus.
- Knowledge of data modeling concepts (e.g., star schema, normalization, denormalization).
- Comfortable working in version-controlled environments using Git and managing workflows with GitHub Actions or similar tools.
- Strong analytical and problem-solving skills, with the ability to debug and resolve pipeline and performance issues.
- Excellent written and verbal communication skills, with an ability to collaborate across engineering, analytics, and business teams.
- Demonstrated technical curiosity and passion for learning, with the ability to quickly adapt to new technologies, development platforms, and programming languages as needed.
- Bachelor's degree in Computer Science or a related field.
- Exposure to MLOps pipelines (MLflow, Kubeflow, feature stores) is a plus but not mandatory.
- Engineers with certifications will be preferred.

Our Offer to You
- An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
- The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
- Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
- An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.

Posted 1 day ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu

On-site


Responsibilities
- Prepare job descriptions, advertise vacant positions, and manage the employment process.
- Orient new employees and train existing employees.
- Monitor employee performance.
- Ensure that all employees are organized and satisfied in their work environment.
- Oversee the health and safety of all employees.
- Implement systematic staff development procedures.
- Provide counseling on policies and procedures.
- Ensure meticulous implementation of payroll and benefits administration.
- Communicate with staff about issues affecting their performance.
- Ensure accurate and proper record-keeping of employee information in electronic and digital format.
- Identify trends and insights.
- Allocate marketing investments.
- Plan and direct marketing campaigns.
- Manage and maintain the organization's website, keeping best practices in mind.
- Optimize content for the website and social media platforms.
- Work with various content formats such as blogs, videos, audio podcasts, etc.
- Track website traffic flow.
- Implement and analyze performance metrics.
- Measure and assess goals vis-à-vis ROI.
- Devise experiments and conversion tests.
- Provide internal reports on a regular basis.
- Execute new and creative collaborations among technologies and platforms.

Job Types: Full-time, Permanent
Pay: ₹5,000.00 - ₹6,000.00 per month
Schedule: Day shift / Morning shift
Ability to commute/relocate: Chennai, Tamil Nadu: Reliably commute or plan to relocate before starting work (Required)
Education: Master's (Required)
Language: English (Required)

Posted 1 day ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote


About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About the Role
This is a remote, full-time contractual position in the Travel & Hospitality industry, responsible for designing, building, testing, optimizing and maintaining the infrastructure and code required for data integration, storage, processing, pipelines and analytics (BI, visualization and advanced analytics) from ingestion to consumption, implementing data flow controls, and ensuring high data quality and accessibility for analytics and business intelligence purposes. This role requires a strong foundation in programming and a keen understanding of how to integrate and manage data effectively across various storage systems and technologies. We're looking for someone who can quickly ramp up, contribute right away and work independently, as well as with junior team members, with minimal oversight.

We are looking for a skilled Sr. Data Engineer with a strong background in Python, SQL, PySpark, Redshift and AWS cloud-based large-scale data solutions, with a passion for data quality, performance and cost optimization. The ideal candidate will develop in an Agile environment. This role is perfect for an individual passionate about leveraging data to drive insights, improve decision-making, and support the strategic goals of the organization through innovative data engineering solutions.

Qualification / Skill Set Requirements
- Must have a full-time Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 5+ years of real-world data engineering development experience in AWS (certifications preferred).
- Strong expertise in Python, SQL, PySpark and AWS in an Agile environment, with a proven track record of building and optimizing data pipelines, architectures, and datasets, and proven experience in data storage, modelling, management, lake, warehousing, processing/transformation, integration, cleansing, validation and analytics.
- A senior person who can understand requirements and design end-to-end solutions with minimal oversight.
- Strong programming skills in one or more languages such as Python or Scala, with proficiency in writing efficient and optimized code for data integration, storage, processing and manipulation.
- Strong knowledge of SDLC tools and technologies, including project management software (Jira or similar), source code management (GitHub or similar), CI/CD systems (GitHub Actions, AWS CodeBuild or similar) and binary repository managers (AWS CodeArtifact or similar).
- Good understanding of data modelling and database design principles; able to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions.
- Strong SQL skills and experience working with complex data sets, enterprise data warehouses and writing advanced SQL queries.
- Proficient with relational databases (RDS, MySQL, Postgres, or similar) and NoSQL databases (Cassandra, MongoDB, Neo4j, etc.).
- Skilled in data integration from different sources such as APIs, databases, flat files and event streaming.
- Strong experience in implementing data pipelines and efficient ELT/ETL processes, batch and real-time, in AWS and using open-source solutions, with the ability to develop custom integration solutions as needed, including data integration from different sources such as APIs (PoS integrations are a plus), ERP (Oracle and Allegra are a plus), databases, flat files, Apache Parquet and event streaming, including cleansing, transformation and validation of the data.
- Strong experience with scalable and distributed data technologies such as Spark/PySpark, DBT and Kafka, to handle large volumes of data.
- Experience with stream-processing systems (Storm, Spark Streaming, etc.) is a plus.
- Strong experience in designing and implementing data warehousing solutions in AWS with Redshift. Demonstrated experience in designing and implementing efficient ELT/ETL processes that extract data from source systems, transform it (DBT), and load it into the data warehouse.
- Strong experience in orchestration using Apache Airflow.
- Expert in cloud computing on AWS, including deep knowledge of a variety of AWS services like Lambda, Kinesis, S3, Lake Formation, EC2, EMR, ECS/ECR, IAM, CloudWatch, etc.
- Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent.
- Good understanding of BI solutions, including Looker and LookML (Looker Modelling Language).
- Strong knowledge and hands-on experience of DevOps principles, tools and technologies (GitHub and AWS DevOps), including continuous integration, continuous delivery (CI/CD), infrastructure as code (IaC – Terraform), configuration management, automated testing, performance tuning, and cost management and optimization.
- Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues.
- Strong leadership skills, with a willingness to lead, create ideas, and be assertive.
- Strong project management and organizational skills.
- Excellent communication skills to collaborate with cross-functional teams, including business users, data architects, DevOps/DataOps/MLOps engineers, data analysts, data scientists, developers, and operations teams; essential to convey complex technical concepts and insights to non-technical stakeholders effectively.
- Ability to document processes, procedures, and deployment configurations.

Responsibilities
- Design, implement, deploy, test and maintain highly scalable and efficient data architectures, defining and maintaining standards and best practices for data management, independently with minimal guidance.
- Ensure the scalability, reliability, quality and performance of data systems.
- Mentor and guide junior/mid-level data engineers.
- Collaborate with Product, Engineering, Data Scientists and Analysts to understand data requirements and develop data solutions, including reusable components.
- Evaluate and implement new technologies and tools to improve data integration, data processing and analysis.
- Design architecture, observability and testing strategies, and build reliable infrastructure and data pipelines.
- Take ownership of the storage layer and data management tasks, including schema design, indexing, and performance tuning.
- Swiftly address and resolve complex data engineering issues and incidents, and resolve bottlenecks in SQL queries and database operations.
- Conduct discovery on the existing data infrastructure and proposed architecture.
- Evaluate and implement cutting-edge technologies and methodologies, and continue learning and expanding skills in data engineering and cloud platforms, to improve and modernize existing data systems.
- Evaluate, design, and implement data governance solutions: cataloguing, lineage, quality and data governance frameworks suitable for a modern analytics solution, considering industry-standard best practices and patterns.
- Define and document data engineering architectures, processes and data flows.
- Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive).
- Be an active member of our Agile team, participating in all ceremonies and continuous improvement activities.

Fusemachines is an equal opportunity employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Lead Backend Engineer

About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.

The Opportunity
As a Lead Backend Engineer on our Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. We seek a highly skilled engineer with a strong foundation in digital product development and a zeal for innovation, responsible for deploying product updates, identifying production issues and implementing integrations. The backend engineer should thrive in agile, fast-paced environments, champion DevOps and CI/CD best practices, and consistently deliver scalable, customer-focused backend solutions. You will have the opportunity to make a meaningful impact on FICO's platform by infusing it with next-generation AI capabilities. You'll work with a team, leveraging your skills to build solutions and drive innovation forward.

What You'll Contribute
- Design, develop, and maintain high-performance, scalable Python-based backend systems powering ML and Generative AI products.
- Collaborate closely with ML engineers, data scientists, and product managers to build reusable APIs and services that support the full ML lifecycle, from data ingestion to inference and monitoring.
- Take end-to-end ownership of backend services, including design, implementation, testing, deployment, and maintenance.
- Implement product changes across the SDLC: detailed design, unit/integration testing, documentation, deployment, and support.
- Contribute to architecture discussions and enforce coding best practices and design patterns across the engineering team.
- Participate in peer code reviews and PR approvals, and mentor junior developers by removing technical blockers and sharing expertise.
- Work with the QA and DevOps teams to enable CI/CD, build pipelines, and ensure product quality through automated testing and performance monitoring.
- Translate business and product requirements into robust engineering deliverables and detailed technical documentation.
- Build backend infrastructure that supports ML pipelines, model versioning, performance monitoring, and retraining loops.
- Engage in prototyping efforts, collaborating with internal and external stakeholders to design PoVs and pilot solutions.

What We're Seeking
- 8+ years of software development experience, with at least 3 years in a technical or team leadership role.
- Deep expertise in Python, including design and development of reusable, modular API packages for ML and data science use cases.
- Strong understanding of REST and gRPC APIs, including schema design, authentication, and versioning.
- Familiarity with ML workflows, MLOps, and tools such as MLflow, FastAPI, TensorFlow, PyTorch, or similar.
- Strong experience building and maintaining microservices and distributed backend systems in production environments.
- Solid knowledge of cloud-native development and experience with platforms like AWS, GCP, or Azure.
- Familiarity with Kubernetes, Docker, Helm, and deployment strategies for scalable AI systems.
- Proficiency in SQL and NoSQL databases and experience designing performant database schemas.
- Experience with messaging and streaming platforms like Kafka is a plus.
- Understanding of software engineering best practices, including unit testing, integration testing, TDD, code reviews, and performance tuning.
- Exposure to frontend technologies such as React or Angular is a bonus, though not mandatory.
- Experience integrating with LLM APIs and an understanding of prompt engineering and vector databases.
- Exposure to Java or Spring Boot in hybrid technology environments is a bonus.
- Excellent collaboration and communication skills, with a proven ability to work effectively in cross-functional, globally distributed teams.
- A bachelor's degree in Computer Science, Engineering, or a related discipline, or equivalent hands-on industry experience.

Our Offer to You
- An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
- The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
- Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
- An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.

Posted 1 day ago

Apply

1.0 - 3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Sikich is seeking an Assurance Experienced Associate (EBP) with 1-3 years of experience to join our team. The candidate will oversee and execute financial statement audits, ensuring compliance with US GAAP. Experience in US GAAP general accounting and financial statement review is preferable.

About the Firm
Sikich is a global company specializing in Accounting, Advisory, and Technical professional services. With employees across the globe, Sikich ranks as one of the largest professional services companies in the United States. Our comprehensive skillsets, obtained over decades of experience as entrepreneurs, business owners and industry innovators, allow us to provide insights and transformative strategies to help strengthen every dimension of our clients' businesses.

Job Responsibilities
- Work on a variety of different auditing projects.
- Coordinate daily client interactions and ensure efficient information flow with the US teams for timely completion of assigned tasks.
- Simultaneously run multiple engagements of varying size and complexity.
- Excel in a dynamic work environment servicing a variety of EBP clients.
- Demonstrate a working knowledge of the general aspects of the regulatory environment surrounding employee benefit plans.
- Supervise audit associates and interns on engagements, providing coaching and timely feedback and reviewing their audit documentation.
- Identify and communicate potential issues and opportunities for audit efficiencies and process improvement to Managers and Principals.
- Consult with US Leadership/clients on various internal accounting-related transactions, as needed.
- Develop one-on-one relationships with US-based audit leads.
- Drive quality project deliverables.
- Participate in training and development to hone the skills of peers and self.
- Prepare audit reports and statements for review.
- Knowledgeably answer client audit queries in good time.

Requirements for a Successful Candidate
- Any graduate with a minimum of 1 year of experience in performing EBP audits, or CA/ACCA/CPA (qualified or pursuing).
- Self-motivated with a strong work ethic.
- Organizational skills to provide client reports within scheduled time frames.
- Proactive approach to accuracy and attention to detail.
- Knowledge of QuickBooks™ and other US accounting systems.
- Proficiency in intermediate Microsoft Excel and MS Office.
- Strong interpersonal and exceptional communication skills.
- A combination of problem-solving and innovation skills to attend to several technical production challenges.

Benefits of Being Part of the Team
- Family health insurance, including parents
- Life & accident insurance
- Maternity/paternity leave
- Performance-based incentives
- Referral bonus program
- Exam fee reimbursement policy
- Indian festival holidays
- 5-day working week
- Meals facility
- Doctor's consultation

Website: www.sikich.com

Posted 1 day ago

Apply

12.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


About the Company
A fast-growing, consumer-focused brand in the apparel and lifestyle segment, known for its strong direct-to-consumer (D2C) presence and trend-driven product offerings. The company operates at the intersection of fashion, innovation, and digital retail, delivering high-quality lifestyle products that resonate with modern consumers.

Key Responsibilities

Cash Flow & Financial Discipline
- Manage daily, weekly, and monthly cash flow planning to ensure optimal liquidity and working capital management.
- Oversee all banking, receivables, vendor payments, and reconciliation activities.
- Build and manage robust cash forecasting models integrated with inventory cycles and sales targets.

Department-Level Budgeting & Attribution
- Own annual and quarterly budgeting processes across all business functions (marketing, operations, tech, HR, retail, etc.).
- Ensure each department operates within approved budgets, with proactive variance analysis.
- Establish accurate cost attribution to departments for P&L reporting and unit economics analysis.
- Partner with department heads to align financial goals with operational strategies.

P&L Management & Cost Optimization
- Lead preparation and analysis of the full P&L across sales channels and product categories.
- Identify cost leakages and margin improvement opportunities.
- Track logistics costs, returns, platform commissions, warehousing, and overheads to ensure profitability targets are met.

Reconciliation & Compliance
- Drive comprehensive reconciliation across all revenue streams (marketplace, D2C, retail), payment gateways, vendor accounts, and taxes.
- Ensure timely filing and accuracy in GST, TDS, income tax, and ROC compliance.
- Implement SOPs for financial accuracy, closing cadence, and documentation.

Audits & Group Collaboration
- Serve as the primary point of contact for external audits; ensure clean and timely audit closures.
- Work closely with Pnfinance teams for reporting, compliance alignment, and shared systems.
- Regularly interface with external auditors, consultants, and legal teams for process and statutory reviews.

Team Building & Systems
- Build and lead a high-performance finance, accounts, and compliance team.
- Guide ERP implementation, dashboarding, and internal reporting frameworks for scalability.

Ideal Candidate
- Chartered Accountant (CA) is mandatory; MBA in Finance with 12+ years of experience in finance leadership. Company Secretary is also accepted.
- Proven expertise in managing departmental budgets, cost attribution, reconciliations, and audits.
- Prior experience in fashion, retail, or consumer brands is highly preferred.
- Strong familiarity with marketplace settlements, payment gateways, and D2C accounting.
- High attention to detail, strong communication skills, and comfort with a high-velocity, high-accountability work culture.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


SDET Engineer – Backend and Data-Driven Applications
Location: Bangalore
About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to the ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.
The Opportunity
“As an SDET engineer on our Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. We seek a highly skilled engineer with a strong foundation in digital product development and a zeal for innovation, who will deploy product updates, identify production issues, and implement integrations. The SDET should thrive in agile, fast-paced environments, champion test automation and CI/CD quality gates, and consistently deliver robust, customer-centric validation for data-driven and backend systems. You will have the opportunity to make a meaningful impact on FICO’s platform by infusing it with next-generation AI capabilities. You’ll work with a team, leveraging your skills to build solutions and drive innovation forward.”
What You’ll Contribute
Design and implement robust test plans and strategies to validate APIs, backend services, and data-intensive workflows across ML and GenAI product stacks.
Perform hands-on testing (manual and automated) across functional, regression, usability, and performance layers, including both black-box and grey-box testing techniques.
Build and maintain automation frameworks for both API and UI layers, enabling continuous and reliable validation across environments.
Collaborate closely with Data Engineers, Backend Engineers, and MLOps teams to test ETL pipelines, data transformations, and model deployment workflows.
Write and execute automated tests using tools such as Selenium, RestAssured, Pytest, or Postman to validate both synchronous and asynchronous system behaviors (an illustrative Pytest sketch follows this listing).
Execute complex SQL queries and data validations across RDBMS and NoSQL stores to ensure data accuracy and integrity in production-like environments.
Integrate tests with CI/CD pipelines (e.g., GitHub Actions, Jenkins, Argo Workflows) and enable shift-left testing practices as part of the engineering workflow.
Evaluate test results, identify root causes, and log issues in defect-tracking tools such as JIRA; drive continuous quality improvements and regression stability.
Partner with QA leadership and development teams to assess test coverage, identify quality gaps, and champion testability and observability as core design principles.
Participate in release planning and sprint ceremonies, and provide quality signals and product-readiness assessments throughout the SDLC.
What We’re Seeking
6+ years of experience in software quality engineering, preferably validating backend and data-heavy systems.
Deep understanding of QA methodologies, the software testing life cycle, and test automation design patterns.
Proficient in Java or Python for test automation and scripting.
Hands-on experience building automation frameworks for REST APIs, web services, and microservices.
Strong SQL skills and experience validating data pipelines, relational databases, and NoSQL databases.
Familiarity with cloud platforms (AWS preferred), containerization (Docker), and CI/CD tools like GitHub Actions or Jenkins.
Solid understanding of Agile and Scrum methodologies; experience working in fast-paced, iterative development cycles.
Proficiency with test management and defect-tracking tools (e.g., JIRA, QTest, TestRail, Quality Center).
Strong debugging and triaging skills, with a knack for identifying edge cases and performance bottlenecks.
Strong communication, problem-solving, and collaboration skills, with a proven ability to work effectively in cross-functional, globally distributed teams that include backend, ML, and DevOps stakeholders.
A bachelor’s degree in Computer Science, Engineering, or a related discipline, or equivalent hands-on industry experience.
Our Offer to You
An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.
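The posting above is a job description rather than a tutorial, but to make the API-validation duties concrete, here is a minimal, hypothetical sketch of an automated REST API check written with pytest and the requests library. The base URL, endpoints, and response fields are placeholders invented for illustration and are not part of any FICO system.

```python
# Illustrative sketch only: validating a hypothetical backend REST endpoint
# with pytest + requests. Endpoint, fields, and URL are assumed placeholders.
import pytest
import requests

BASE_URL = "https://api.example.internal"  # hypothetical service under test


@pytest.fixture(scope="session")
def session():
    # Shared HTTP session so connection setup is not repeated per test.
    with requests.Session() as s:
        s.headers.update({"Accept": "application/json"})
        yield s


def test_health_endpoint_returns_ok(session):
    # Basic availability check: the service should answer quickly and report healthy.
    resp = session.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"


@pytest.mark.parametrize("case_id, expected_status", [("case-001", 200), ("missing-case", 404)])
def test_case_lookup_contract(session, case_id, expected_status):
    # Contract check: known records return 200 with required fields,
    # unknown records return 404 rather than a silent empty payload.
    resp = session.get(f"{BASE_URL}/cases/{case_id}", timeout=5)
    assert resp.status_code == expected_status
    if expected_status == 200:
        body = resp.json()
        assert {"id", "score", "created_at"} <= body.keys()
```

Run with `pytest -q`; wired into a CI pipeline such as GitHub Actions or Jenkins, the same command can act as the kind of quality gate the role describes.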

Posted 1 day ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Lead Backend Engineer
About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to the ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.
The Opportunity
“As a Lead Backend Engineer on our Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. We seek a highly skilled engineer with a strong foundation in digital product development and a zeal for innovation, who will deploy product updates, identify production issues, and implement integrations. The backend engineer should thrive in agile, fast-paced environments, champion DevOps and CI/CD best practices, and consistently deliver scalable, customer-focused backend solutions. You will have the opportunity to make a meaningful impact on FICO’s platform by infusing it with next-generation AI capabilities. You’ll work with a team, leveraging your skills to build solutions and drive innovation forward.”
What You’ll Contribute
Design, develop, and maintain high-performance, scalable Python-based backend systems powering ML and Generative AI products (a minimal illustrative service sketch follows this listing).
Collaborate closely with ML engineers, data scientists, and product managers to build reusable APIs and services that support the full ML lifecycle—from data ingestion to inference and monitoring.
Take end-to-end ownership of backend services, including design, implementation, testing, deployment, and maintenance.
Implement product changes across the SDLC: detailed design, unit/integration testing, documentation, deployment, and support.
Contribute to architecture discussions and enforce coding best practices and design patterns across the engineering team.
Participate in peer code reviews and PR approvals, and mentor junior developers by removing technical blockers and sharing expertise.
Work with the QA and DevOps teams to enable CI/CD, build pipelines, and ensure product quality through automated testing and performance monitoring.
Translate business and product requirements into robust engineering deliverables and detailed technical documentation.
Build backend infrastructure that supports ML pipelines, model versioning, performance monitoring, and retraining loops.
Engage in prototyping efforts, collaborating with internal and external stakeholders to design PoVs and pilot solutions.
What We’re Seeking
8+ years of software development experience, with at least 3 years in a technical or team leadership role.
Deep expertise in Python, including design and development of reusable, modular API packages for ML and data science use cases.
Strong understanding of REST and gRPC APIs, including schema design, authentication, and versioning.
Familiarity with ML workflows, MLOps, and tools such as MLflow, FastAPI, TensorFlow, PyTorch, or similar.
Strong experience building and maintaining microservices and distributed backend systems in production environments.
Solid knowledge of cloud-native development and experience with platforms like AWS, GCP, or Azure.
Familiarity with Kubernetes, Docker, Helm, and deployment strategies for scalable AI systems.
Proficiency in SQL and NoSQL databases and experience designing performant database schemas.
Experience with messaging and streaming platforms like Kafka is a plus.
Understanding of software engineering best practices, including unit testing, integration testing, TDD, code reviews, and performance tuning.
Exposure to frontend technologies such as React or Angular is a bonus, though not mandatory.
Experience integrating with LLM APIs and an understanding of prompt engineering and vector databases.
Exposure to Java or Spring Boot in hybrid technology environments is a bonus.
Excellent collaboration and communication skills, with a proven ability to work effectively in cross-functional, globally distributed teams.
A bachelor’s degree in Computer Science, Engineering, or a related discipline, or equivalent hands-on industry experience.
Our Offer to You
An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.
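As a rough, hypothetical illustration of the Python backend services this listing describes (a small, reusable API in front of a model), the sketch below uses FastAPI with typed request and response schemas. The endpoint, fields, and stand-in "model" are assumptions made for the example, not FICO's actual design.

```python
# Illustrative sketch only: a minimal FastAPI service exposing a hypothetical
# scoring model behind a typed REST endpoint. Names and fields are assumptions.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-scoring-service")


class ScoreRequest(BaseModel):
    # Input features for the hypothetical model.
    customer_id: str
    amount: float


class ScoreResponse(BaseModel):
    customer_id: str
    score: float
    model_version: str


def predict(amount: float) -> float:
    # Placeholder for a real model call (e.g., an MLflow-loaded model);
    # a trivial rule stands in so the sketch stays self-contained.
    return min(1.0, max(0.0, amount / 10_000.0))


@app.post("/v1/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    if req.amount < 0:
        raise HTTPException(status_code=422, detail="amount must be non-negative")
    return ScoreResponse(customer_id=req.customer_id,
                         score=predict(req.amount),
                         model_version="demo-0.1")
```

Assuming the file is saved as app.py, it can be served locally with `uvicorn app:app --reload`; versioned paths and typed schemas are one common way such API packages stay reusable across products.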

Posted 1 day ago

Apply

6.0 years

0 Lacs

Andaman and Nicobar Islands, India

On-site

Linkedin logo

Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you we would love to have you join us!
Summary
Job Description
Design, program, debug, and modify software enhancements and new products used in local, networked, or Internet-related computer programmes. Code may be used in commercial or end-user applications, such as materials management, financial management, HRIS or desktop applications products. Using current programming languages and technologies, write code, complete programming, and perform testing and debugging of applications. Complete documentation and procedures for installation and maintenance. May work with users to define system requirements and necessary modifications. You will report to the Cyber Security Manager, and work in a hybrid capacity from our Hinjewadi - Pune, India office.
Your Responsibilities
Take end-to-end ownership of customer issues, including initial troubleshooting, identification of cause, and issue resolution.
Support Identity Access Management cases and the Directory Services group in ServiceNow, including the customer side when needed.
Monitor scheduled jobs, account aggregations, and active workflows, as defined in the IAM Operations Manual.
Acknowledge and review incident/request tickets assigned by the Incident Management Tool.
Build test cases as defined in the IAM Operations Manual.
Collect role data for disconnected applications from application teams for annual access certification.
Handle certificate renewals and the creation and renewal of shared keys.
Onboard new applications.
Work on customer cases for login issues, MFA, or any disconnected application on the customer side.
Maintain existing and create new SOPs, flow charts, and other technical documentation.
Help train end-users and colleagues.
Respond to and resolve SSO and MFA related incidents, including login issues, authentication errors, and access problems.
Help with user provisioning and deprovisioning related to SSO and MFA access levels.
Provide technical assistance to users regarding SSO and MFA login procedures, password reset, and device registration.
Resolve complex SSO and MFA integration issues across several applications.
Maintain SSO and MFA configurations within the chosen platform, ensuring security policies are followed.
Monitor SSO and MFA systems for performance issues, potential security threats, and user activity.
Configure and stage certifications; provide end-user support for questions related to the access certification campaign process for annual and quarterly access certification.
Follow up with users who have open access certification tasks to remind them of outstanding items and assist with tool navigation and questions.
Kick off and close certification data for audit.
Create access certification reports and forward them to HR for quarterly contractor certification or to disconnected application owners for annual certifications.
Be a subject matter expert within support for OpsIAM.
The Essentials - You Will Have
6+ years' experience supporting and troubleshooting cloud-based directory services such as Active Directory Services, Azure, SSO, MFA, ADFS, Okta or Auth0.
Experience with REST API integration and a working knowledge of Microsoft Word, Excel, PowerPoint, Power Apps, Power BI, ServiceNow ITSM, GitLab, DevOps, SharePoint, Postman, and SQL Server management.
Professional experience with one or more scripting/programming languages such as SQL, Bash, PowerShell, C++, Java, Python, JavaScript, C#, JSON, or .NET to integrate solutions, increase capabilities, identify opportunities, and ease administration (a small illustrative script follows this listing).
Experience with manual and automated testing principles, methodologies, techniques, and tools, such as Selenium, JUnit or similar.
Comfortable leading change in areas outside of subject matter expertise.
Good customer-facing and communication skills.
The Preferred - You Might Also Have
Bachelor's degree in management information systems, Computer Science, a related IT field, or another field with IT experience.
Experience administering and supporting SailPoint or similar solutions.
Working knowledge of SSO, PAM, AD/AAD, and MFA.
Experience with Jira/Kanban methodologies.
What We Offer
Our benefits package includes …
Comprehensive mindfulness programmes with a premium membership to Calm
Volunteer paid time off available after 6 months of employment for eligible employees
Company volunteer and donation matching programme – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
Employee Assistance Program
Personalised wellbeing programmes through our OnTrack programme
On-demand digital course library for professional development and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
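To illustrate the "scripting to ease administration" idea referenced above, here is a small, hypothetical Python sketch that pulls open tickets from a generic REST endpoint and flags stale ones. The URL, authentication scheme, and field names are invented placeholders, not the actual ServiceNow or Rockwell APIs.

```python
# Illustrative sketch only: flag stale open incidents from a hypothetical
# ticketing REST API. Endpoint, auth, and field names are assumptions.
import os
from datetime import datetime, timedelta, timezone

import requests

API_URL = "https://tickets.example.internal/api/incidents"  # placeholder URL
STALE_AFTER = timedelta(days=3)


def fetch_open_incidents() -> list[dict]:
    # Token is read from the environment so no secret is hard-coded.
    headers = {"Authorization": f"Bearer {os.environ.get('TICKET_API_TOKEN', '')}"}
    resp = requests.get(API_URL, params={"state": "open"}, headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json().get("incidents", [])


def find_stale(incidents: list[dict]) -> list[dict]:
    # An incident is "stale" if it has not been updated within STALE_AFTER.
    # Timestamps are assumed to be ISO-8601 strings with a timezone offset.
    cutoff = datetime.now(timezone.utc) - STALE_AFTER
    return [i for i in incidents
            if datetime.fromisoformat(i["updated_at"]) < cutoff]


if __name__ == "__main__":
    for incident in find_stale(fetch_open_incidents()):
        print(f"{incident['id']}: no update since {incident['updated_at']}")
```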

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

We are hiring Freshers - US International Voice Process, who are enthusiastic and ready to kickstart their career in a dynamic work environment.
JOB DESCRIPTION
Learn and perform tasks related to collecting payment from insurance companies
Contact insurance companies and patients to resolve billing issues
Maintain accurate records of interactions and claim statuses
Work with billing, insurance verification and finance teams to ensure accurate billing and revenue flow
Skills and Qualities
Excellent verbal and written communication skills are essential for interacting with insurance companies and patients
Must be detail-oriented to ensure accuracy throughout the process
Should be motivated to achieve positive outcomes
JOB DETAILS
Shift: Fixed night shift
Experience: Freshers - Immediate Joiners only
Location: Chennai (Vadapalani)
Transportation: Two-way cab will be provided
Mode of interview: Walk-in
If you're interested, kindly walk in to the below mentioned location only during weekdays (Saturday & Sunday - non-working days)

Posted 1 day ago

Apply

175.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role?
The Global Treasury Controllership (GTC) team is part of the Regulatory Reporting and Treasury Controllership team and is responsible for providing comprehensive Treasury and regulatory reporting support covering all Treasury transactions, including cash, debt, equity, derivatives, and investments. The Treasury Reporting COE is responsible for the timely and accurate delivery of all the SEC, FED, and LE reporting deliverables for various Treasury products: Debt, Investments, Derivatives, Reverse Repos and AOCI. This is an exciting opportunity to establish and lead the SEC and FED reporting for American Express. The position will involve close interactions with a diverse stakeholder group on Treasury products, including concluding on accounting and reporting implications and daily monitoring of results owing to AXP’s category change.
Responsibilities include:
Assist with the timely completion of the quarterly and annual consolidated financial statement filings with the SEC (10-K, 10-Q, etc.) and various other statutory reporting requirements of AXP and its subsidiaries.
Assist with the quarterly analytics forming part of the Financial Analysis Book (FAB) shared with senior leadership and other key stakeholders.
Prepare submissions/supporting information used for Reg reports, footnotes, cash flow submissions and MD&A.
Work with business partners (GTC, Regulatory Reporting team and Treasury) and support monitoring of daily results and their impact on the liquidity ratios.
Maintain effective controls to ensure compliance with SOX, Bank Holding Company regulations and numerous internal guidelines.
Support audit queries and look for opportunities to drive process efficiencies via automation, etc.
Additional responsibilities include participation in internal and other business initiatives.
Minimum Qualifications
CA/CPA or equivalent plus 0-2 years of experience in finance and reporting
US GAAP knowledge in the areas of financial instruments will be a plus
Analytical and problem-solving skills
Strong communication skills
Preferred Qualifications
CA/CPA or equivalent plus 0-2 years of experience in finance and reporting
High level of proficiency with Microsoft Office; excellent Excel skills
Advanced MS Office suite (Word, PowerPoint)
Power BIEE usage and Tableau dashboard skills will be a plus
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Title: Data Engineer
Location: Gurugram (On-site)
Company: Darwix AI
Employment Type: Full-Time
About the Role: Darwix AI is looking for a Data Engineer with strong experience in Python, CI/CD pipeline setup, and system architecture design. You will be responsible for building and optimizing data pipelines, ensuring seamless integration of data across systems, and supporting scalable infrastructure to power our AI-driven solutions.
Key Responsibilities:
Design, implement, and maintain efficient data pipelines and workflows (a minimal illustrative sketch follows this listing)
Set up and manage CI/CD pipelines for automated deployment and testing
Develop scalable system architecture for real-time and batch data processing
Collaborate with cross-functional teams to deliver production-ready data solutions
Monitor data flow, ensure data integrity, and optimize pipeline performance
Requirements:
Proficiency in Python with a strong understanding of data structures and algorithms
Experience in setting up CI/CD pipelines using tools like GitLab CI, Jenkins, or similar
Solid understanding of system architecture and distributed data processing
Familiarity with cloud platforms (AWS/GCP/Azure) and containerization tools like Docker
3–5 years of experience in a similar data engineering role
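To give the pipeline responsibilities listed above a concrete shape, here is a minimal, hypothetical extract-transform-load sketch in plain Python. The file name, columns, and SQLite target are assumptions for illustration only; a production pipeline would typically run under an orchestrator and the CI/CD setup the posting mentions.

```python
# Illustrative sketch only: a tiny extract-transform-load flow using the
# standard library. File name, columns, and target table are assumptions.
import csv
import sqlite3
from pathlib import Path

SOURCE = Path("events.csv")          # expected columns: user_id, amount, currency
TARGET_DB = Path("warehouse.db")


def extract(path: Path) -> list[dict]:
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))


def transform(rows: list[dict]) -> list[tuple]:
    # Basic cleansing: drop rows missing a user, normalise currency codes,
    # and cast amounts to float so downstream aggregation is reliable.
    clean = []
    for row in rows:
        if not row.get("user_id"):
            continue
        clean.append((row["user_id"], float(row["amount"]), row["currency"].upper()))
    return clean


def load(records: list[tuple], db_path: Path) -> None:
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS events (user_id TEXT, amount REAL, currency TEXT)"
        )
        conn.executemany("INSERT INTO events VALUES (?, ?, ?)", records)


if __name__ == "__main__":
    load(transform(extract(SOURCE)), TARGET_DB)
```

Keeping extract, transform, and load as separate functions is one way each stage stays independently testable inside the kind of CI pipeline the role calls for.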

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Rotork is the market-leading global flow control and instrumentation company, helping our customers manage the flow of liquids, gases and powders across many industries worldwide.
Job Description
Purpose of the Job
We are seeking a highly skilled and experienced Configuration Lead to join the team responsible for maintaining our Dynamics 365 platform. The ideal candidate will be responsible for governing the configuration of D365 CE and FinOps to ensure there is control across systems and that changes made in BAU and in projects are aligned to prevent conflicts.
Key Responsibilities and Outcomes
Lead the configuration governance of Dynamics 365 solutions to meet business needs.
Collaborate with stakeholders to ensure that required configuration changes are made in the right environments to support BAU and project requests.
Provide technical guidance to stakeholders on requirements for configuration to be completed, e.g. specifics of configuration files to ensure smooth uploads.
Troubleshoot and resolve issues related to Dynamics 365 configuration uploads and movements between environments.
Conduct training sessions and workshops for end-users to ensure effective ways of working on configuration management.
Maintain documentation of configurations and input into solution design documentation where required.
Communicate with key stakeholders including Project Managers, Architects and Developers to minimise conflicts in requests for configuration changes in different environments.
Manage the interactions between the BAU and D365 rollout programme deployment tracks in terms of configuration upload requests.
Qualifications
Qualifications & Technical Knowledge:
Proven experience in configuring Dynamics 365 solutions.
Strong understanding of Dynamics 365 systems (CE, FinOps, HR) and applications (Sales, Customer Service, Field Service, Marketing, Finance, Supply Chain Management).
Excellent problem-solving skills and attention to detail.
Strong communication skills.
Ability to work independently and as part of a team.
Relevant certifications in Dynamics 365 are preferred.
Strong working knowledge of Microsoft Power Platform and related technologies.
Technical expertise in Microsoft Dynamics 365 ERP & CRM solutions.
Ability to create and update documentation of processes to be shared with technical and non-technical audiences.
Personal Specification:
Analytical thinker – resolves problems with a strong focus on attention to detail.
Stakeholder management – engages and motivates others. Customer and business focused at all times.
Adaptable – copes with the unexpected, manages problems.
Communication – excellent listening plus written and spoken skills. Empathises with and understands different cultures.
Organisation – self-motivated with good time management skills to manage own workload.
Additional Information
Our purpose is Keeping the World Flowing for Future Generations. For over sixty years, the world has relied on us to create the things that keep everything moving. From oil and gas to water and shipping, pharmaceuticals and food – these are the flows on which our modern world depends. Today we're respected and admired for our people, performance and products. Our success flows from our commitment to engineering excellence, and that's what we will always pursue, safely and sustainably. Rotork is going through an exciting period of change and growth, building on our existing market success. It's a great time to join us and make an impact in shaping the future of our business.

Posted 1 day ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Leadership Skills:
Should be able to manage a large team of 20+ professionals, demonstrating strategic leadership and team collaboration. Will be responsible for broader discretion over hiring and firing decisions, and for recommending ratings and promotions based on performance.
Technical Skills:
Advanced expertise in data analytics and data visualization techniques.
Proficient in data engineering, including data transformation, integration, acquisition, preparation, modeling, and master data management.
Extensive experience in implementing visualization projects using tools like MS Power BI, Tableau, Spotfire, and others.
Proficient with Microsoft BI technologies (SQL, SSIS, SSAS, SSRS) and Azure BI solutions such as ADF, Synapse, and Databricks.
Programming skills in SQL, DAX, MDX, Python, PowerShell scripting, Node.js, React.js, C++, with proficiency in databases like MS SQL Server, Access, and MySQL.
Proficiency in developing and managing Excel macro-based dashboards.
Proficiency in performance tuning and optimization for processing large datasets, ensuring efficient data retrieval and minimal response times.
Experience with partitioning strategies, indexing, and query optimization to manage and expedite data access in high-volume environments (a small illustrative sketch follows this listing).
Skilled in using big data technologies and distributed computing frameworks to address scalability and performance challenges in large-scale data processing.
Expertise in designing and optimizing data pipelines and ETL processes to improve data flow and reduce bottlenecks in extensive datasets.
Familiarity with advanced analytics and machine learning algorithms for efficient processing and analysis of massive datasets to derive actionable insights.
Knowledge of cloud-based data services and tools for scalable storage, analysis, and management of large volumes of data, including Azure Synapse, Snowflake, and Amazon Redshift.
Soft Skills:
Effective communication, analytical thinking, and problem-solving abilities.
Managerial Roles:
As a Manager at EY GDS, one should be capable of designing and delivering analytics foundations, managing a team, constructing dashboards, and employing critical thinking to resolve complex audit and non-audit issues. The role involves developing, reviewing, and analyzing solution architecture, gathering and defining requirements, leading project design, and overseeing implementation. He/She is responsible for owning the engagement economics of the team, updating key findings to the leadership, and assisting in alignment in case of discrepancies. It is essential to align and collaborate with the Service Delivery Manager (SDM) from various Digital delivery regions to perform project scoping and estimations and to strategically drive the deliveries to success. The Manager should identify the high and low outliers in the team and help align low outliers with the right learning paths to support their careers. Own the learning and development of the team, periodically revisit the learnings, and advise and align the team as per emerging market trends. Perform R&D and produce POCs that demonstrate the team’s capability to implement advanced visualization concepts, organize calls with various groups to explain the features and benefits, and apply these in engagements to drive their success. Periodically review the performance of resources deployed across engagements. Prioritise and assist the team in generating automation savings through unique ideas and cutting-edge implementations.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
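As a small, hedged illustration of the indexing and query-optimization experience referenced above, the sketch below uses Python's standard-library sqlite3 module to time the same aggregate query before and after adding an index. The table and data are invented for the example, and the measured gain depends heavily on data volume and column selectivity; real tuning would target the warehouse actually in use (e.g., Synapse, Snowflake, or SQL Server).

```python
# Illustrative sketch only: measuring the effect of an index on query time
# with sqlite3. Table, columns, and data are made up for the example.
import random
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
regions = ["north", "south", "east", "west"]
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [(random.choice(regions), random.random() * 1000) for _ in range(200_000)],
)
conn.commit()


def timed_total(region: str) -> float:
    # Time a simple aggregate filtered on the (possibly unindexed) column.
    start = time.perf_counter()
    conn.execute("SELECT SUM(amount) FROM sales WHERE region = ?", (region,)).fetchone()
    return time.perf_counter() - start


before = timed_total("east")
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = timed_total("east")
print(f"without index: {before:.4f}s, with index: {after:.4f}s")
```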

Posted 1 day ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

Remote

Linkedin logo

Sprinto is a leading platform that automates information security compliance. By raising the bar on information security, Sprinto ensures compliance, healthy operational practices, and the ability for businesses to grow and scale with unwavering confidence. We are a team of 300+ employees helping 2,000+ customers across 75+ countries. We are funded by top investment partners Accel, Elevation, and Blume Ventures and have raised 31.8 million USD in funding, including our latest Series B round.
Role Overview
As a Senior Associate - Legal at Sprinto, you’ll help keep our commercial and regulatory machinery humming. You’ll own day-to-day contract work across global jurisdictions, drive privacy and compliance initiatives, and—critically—deploy AI tooling to slash review cycles and free the business to move faster.
Key Responsibilities
AI-Driven Contract Velocity - Leverage generative-AI and CLM platforms (e.g., GPT-powered review assistants, automated clause libraries) to cut contract turnaround time and boost redline consistency. Continuously optimise prompts, playbooks, and workflows; track and report cycle-time improvements to leadership.
Contract Negotiation & Management - Draft, review, and negotiate SaaS/enterprise agreements (MSAs, DPAs, BAAs, NDAs, partner and reseller deals) in close collaboration with Sales and Finance.
Privacy & Security Compliance - Ensure Sprinto’s practices comply with GDPR, CCPA, HIPAA, and emerging regulations; coordinate DPIAs and vendor assessments.
Risk Mitigation & Advisory - Identify legal risks in product launches and GTM motions, proposing pragmatic, business-friendly solutions.
Legal Operations - Maintain clause banks, playbooks, and the contract-lifecycle system; refine processes to scale efficiently across time zones.
Cross-Functional Collaboration - Partner with Product, RevOps, and Engineering to align on data-flow maps, infosec controls, and customer commitments.
IP & Dispute Support - Support trademark filings, open-source software reviews, and pre-litigation matters, escalating to outside counsel when needed.
Key Requirements
2–4 years of post-qualification experience focused on SaaS, technology licensing, or enterprise software contracts
Demonstrated hands-on proficiency with AI-based legal tools (e.g., GPT contract reviewers, automated redlining, CLM analytics) and a track record of cutting contract cycle times
Deep knowledge of global data-privacy regimes (GDPR, CCPA, HIPAA) and related security standards
Polished negotiator who can translate dense legalese into crisp, business-oriented advice
Experience supporting global sales and product teams; comfortable juggling multiple priorities across time zones
Pragmatic mindset: you default to “yes-and-here’s-how” rather than “no”
Bonus: exposure to AI governance frameworks or security certifications (ISO 27001, SOC 2)
Benefits
Remote First Policy
5 Days Working With FLEXI Hours
Group Medical Insurance (Parents, Spouse, Children)
Group Accident Cover
Company Sponsored Device
Education Reimbursement Policy

Posted 1 day ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Welcome to Jio BLAST and the Commercial Team!
At BLAST we create live and digital experiences - from tournaments that pack out major arenas around the world to great content that's guaranteed to blow your socks off. BLAST have now combined forces with Jio to entertain India's youth and redefine Indian esports. Globally we work with the world's best game publishers and brands to elevate their properties into amazing esports experiences. Now it's time for India. Launched in 2025 and headquartered in Mumbai, Jio BLAST is assembling a passionate, ambitious, and creative team. This is your opportunity to be part of shaping BLAST's vision for esports in India.
The Commercial Team will be responsible for fostering brand partnerships and bringing commercial revenue into the business. Our vision is to create esports and gaming IPs and spectacles that exceed traditional sport in entertainment and attention. The Commercial Team's role will be to go to market and create meaningful, long-term and high value partnerships. This role will report to the leader of the Commercial Team and be the driving force behind identifying, pitching, and closing brand deals. You will work closely with agencies and BLAST's global teams in London, New York and Copenhagen to identify opportunities. We're here to build new entertainment products with the biggest games publishers and brands in the world with the number one goal of entertaining India's youth and creating new heroes.
At BLAST, our company values guide everything we do: We Make It Bold by experimenting and pushing boundaries. We Make It Together by empowering each other and solving challenges as a team. And we Make It Happen by taking ownership and delivering with precision.
Key Responsibilities
Own the sales process: Prospect, pitch, negotiate, and close brand sponsorships for esports IPs, tournaments, and content series
Work independently on identified opportunities, while collaborating closely with the Commercial Team leader and wider team on major deals
Build brand relationships: Cultivate a strong personal network with Indian brands, agencies, and marketing decision-makers
Package & pitch: Build compelling decks and proposals tailored to each brand's objectives and aligned with the IP's format and fanbase
Support rights development: Contribute to the shaping of commercial rights (what's being sold, how it's packaged) to ensure we're selling what the market wants and designing IPs in a commercially savvy way
Track performance: Maintain a strong pipeline and report progress on revenue targets, deal flow, and lead conversion
Requirements
4-6 years in brand partnerships, media sales, sponsorship sales, or related commercial roles
Experience working with or selling to Indian brands and agencies, ideally in sports, entertainment, media, or digital content
Proven track record of closing mid-to-high value deals and maintaining long-term client relationships
Exposure to building sponsorship solutions from scratch or selling new IPs is a strong plus
Strong communicator and storyteller - able to pitch creatively and with confidence
Prior experience in gaming/esports is helpful but not essential - curiosity and adaptability are more important
Benefits
Competitive compensation and employee benefits

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Rotork is the market-leading global flow control and instrumentation company, helping our customers manage the flow of liquids, gases and powders across many industries worldwide.
Job Description
Job Purpose
Transitioning QA processes to Business-as-Usual (BAU) by working closely with our System Integrator (SI) and Business Integrator (BI) partners. Responsible for managing the regression automation pack, maintaining an up-to-date test library, and verifying new Microsoft feature releases against current business processes.
Responsibilities
Work alongside the BI and SI partners to enhance and update the library of test cases, test scripts, and test scenarios in line with evolving business and technical requirements
Test and validate new Microsoft D365 feature releases, assessing and documenting their impact on existing functionality
Collaborate with SI and BI partners to align on the test automation framework and practices
Participate in design, planning, and refinement sessions to ensure QA considerations are built into upcoming work
Contribute to the transition of QA and testing assets into BAU, including documentation, processes, and tool handover
Support test data management and coordinate with relevant teams to ensure availability of clean and reusable data sets
Maintain and execute the automated regression test pack, ensuring it reflects the latest process and system updates (a small data-driven regression sketch follows this listing)
Carry out test execution and test defect reporting
Track and report on defects, issues, and test outcomes using agreed QA tools and reporting frameworks
Work closely with business analysts, developers, and support teams to ensure alignment and traceability from requirements to test execution
Ensure compliance with test governance standards and participate in continuous improvement of QA practice
Qualifications
Essential
Proven experience in Quality Assurance / Test Engineering, preferably within D365 or ERP environments
Strong understanding of test automation principles and hands-on experience with automation frameworks and tools (e.g., Tosca, Power Platform test tools)
Familiarity with Microsoft Dynamics 365 Finance & Operations and Customer Engagement
Experience managing and executing regression test suites in large-scale programs
Proficient in test case design, version control, and defect lifecycle management
Comfortable working in Agile or hybrid delivery environments, using Azure DevOps
Excellent analytical and problem-solving skills
Strong documentation and communication skills, including the ability to explain technical QA matters to non-technical stakeholders
Required Competencies
Detail-oriented with a methodical and consistent approach to testing
Strong communicator with the ability to collaborate across cross-functional and vendor teams
Proactive and solution-focused mindset
Ability to adapt to change and work under pressure in a dynamic programme environment
Commitment to quality and continuous improvement
Additional Information
Our purpose is Keeping the World Flowing for Future Generations. For over sixty years, the world has relied on us to create the things that keep everything moving. From oil and gas to water and shipping, pharmaceuticals and food – these are the flows on which our modern world depends. Today we're respected and admired for our people, performance and products. Our success flows from our commitment to engineering excellence, and that's what we will always pursue, safely and sustainably. Rotork is going through an exciting period of change and growth, building on our existing market success. It's a great time to join us and make an impact in shaping the future of our business.
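As a loose, hypothetical illustration of a data-driven regression pack like the one this role maintains, the sketch below parametrises expected outcomes over a table of cases with pytest. The discount rule is invented as a stand-in; in this programme the system under test would be a D365 process exercised through the chosen automation tooling (e.g., Tosca or Power Platform test tools).

```python
# Illustrative sketch only: a data-driven regression check with pytest.
# The business rule below is a stand-in for whatever D365 process the
# regression pack actually exercises.
import pytest


def order_discount(order_total: float, is_trade_customer: bool) -> float:
    """Hypothetical pricing rule standing in for the system under test."""
    discount = 0.10 if order_total >= 1000 else 0.0
    if is_trade_customer:
        discount += 0.05
    return round(discount, 2)


# Each row is one regression case: inputs plus the expected, previously
# agreed outcome. New rows are added as the process evolves.
REGRESSION_CASES = [
    (500.0, False, 0.00),
    (1500.0, False, 0.10),
    (1500.0, True, 0.15),
    (999.99, True, 0.05),
]


@pytest.mark.parametrize("total, trade, expected", REGRESSION_CASES)
def test_order_discount_regression(total, trade, expected):
    assert order_discount(total, trade) == expected
```

Each new business scenario becomes one more row in the case table, which keeps the regression pack growing alongside the process it protects.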

Posted 1 day ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
SAP Datasphere Senior
Job Description
Job Summary: The SAP Datasphere Specialist will be responsible for the design, development, and maintenance of data architecture within SAP Datasphere. The ideal candidate should have extensive hands-on knowledge of SAP Datasphere artifacts, including Spaces, Local Tables, Views, Data Flow, Replication Flow, Transformation Flow, DAC, Task Chain, and Intelligent Lookup. The role requires experience in data provisioning from both SAP and non-SAP sources, replication flow, and space management. Additionally, the candidate must be familiar with S/4HANA CDS views, BW/4HANA concepts, and possess advanced SQL skills.
Skills and Experience:
Develop and maintain Spaces, Local Tables, Views, Data Flow, Replication Flow, Transformation Flow, DAC, Task Chain, and Intelligent Lookup within SAP Datasphere.
Manage data provisioning from SAP and non-SAP sources, ensuring seamless integration and data consistency.
Design Replication Flows to move data out of Datasphere.
Oversee Space management within SAP Datasphere, optimizing storage and performance.
Collaborate with cross-functional teams to ensure that data architecture aligns with business requirements and IT standards.
Provide expertise on S/4HANA CDS views and BW/4HANA concepts to support data modeling and reporting needs.
Utilize reporting tools such as AFO, SAC, or Power BI to create insightful data visualizations and reports.
Troubleshoot and resolve data-related issues, applying strong problem-solving skills.
Work independently and as part of a team to deliver high-quality data solutions.
Qualifications
Minimum 5+ years of SAP Analytics/Business Intelligence/Business Warehouse (BI/BW/HANA) related experience leading and delivering full life cycle implementations.
At least two end-to-end implementations with SAP BW/4HANA and S/4HANA CDS.
Bachelor's degree from an accredited college/university.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

Exploring Flow Jobs in India

Flow jobs in India are in high demand as companies are increasingly looking for professionals who can streamline processes, optimize workflows, and improve efficiency. Whether it's in the tech industry, finance sector, or even healthcare, individuals with expertise in flow management are sought after for their ability to drive results and make a significant impact on organizations.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for flow professionals in India varies based on experience and location. Entry-level positions may start at around ₹3-5 lakhs per annum, while experienced professionals could earn anywhere from ₹10-20 lakhs per annum or more.

Career Path

A typical career path in the field of flow management may progress as follows:

  1. Flow Analyst
  2. Process Improvement Specialist
  3. Flow Manager
  4. Director of Operations

Related Skills

Besides expertise in flow management, professionals in this field may also benefit from having skills such as:

  • Data analysis
  • Project management
  • Six Sigma certification
  • Lean methodologies

Interview Questions

  • What is flow management and why is it important? (basic)
  • Can you give an example of a successful flow optimization project you have worked on? (medium)
  • How do you identify bottlenecks in a process and what strategies do you use to address them? (medium)
  • What tools or software do you typically use to analyze workflows and processes? (basic)
  • How do you ensure continuous improvement in flow management practices within an organization? (advanced)
  • Describe a challenging situation you faced while optimizing a process and how you overcame it. (medium)
  • What metrics do you use to measure the effectiveness of flow management initiatives? (medium)
  • How do you prioritize tasks when working on multiple flow projects simultaneously? (basic)
  • Can you explain the difference between flow efficiency and flow effectiveness? (advanced)
  • What are some common obstacles to achieving flow optimization and how do you mitigate them? (medium)
  • How do you communicate the benefits of flow management to stakeholders who may be resistant to change? (medium)
  • What role does technology play in modern flow management practices? (basic)
  • How do you stay updated on the latest trends and best practices in flow management? (basic)
  • Describe a time when you had to make a difficult decision in optimizing a process. How did you approach it? (medium)
  • Can you walk us through your process for developing a flow management strategy from start to finish? (advanced)
  • How do you ensure that workflow changes are implemented effectively and sustained over time? (medium)
  • What are the key elements of a successful flow management plan? (basic)
  • How do you foster collaboration and communication among team members involved in flow optimization projects? (medium)
  • Can you provide an example of a time when you had to adjust your flow management approach in response to unexpected challenges? (medium)
  • How do you handle resistance to change from employees when implementing new flow processes? (medium)
  • What are the most common mistakes organizations make when trying to optimize their workflows and processes? (medium)
  • How do you balance the need for efficiency with the need for quality in flow management initiatives? (medium)
  • What are your thoughts on the future of flow management and its impact on businesses in India? (advanced)
  • How do you ensure that flow management practices align with the overall strategic goals of an organization? (advanced)

Closing Remark

As you prepare for your next flow job interview, remember to showcase your expertise, experience, and passion for improving processes and driving efficiency. By confidently highlighting your skills and accomplishments in flow management, you can stand out as a top candidate and secure exciting opportunities in the dynamic job market in India. Good luck!
