Job Title: React Developer
Location: Kolkata, On-site
Experience Required: 3-8 Years
Notice Period: Immediate Joiners Preferred

Job Summary
We are looking for a skilled React Developer with 3-8 years of professional experience to join our dynamic team in Kolkata. The ideal candidate should have hands-on expertise in React.js, JavaScript/TypeScript, Redux (or other state management libraries), REST APIs, and modern front-end development practices. Immediate joiners will be given priority.

Key Responsibilities
- Develop and maintain responsive, high-performance web applications using React.js.
- Collaborate with UX/UI designers and backend developers to deliver seamless user experiences.
- Write clean, reusable, and scalable code following best practices.
- Integrate front-end components with REST APIs and third-party services.
- Optimize applications for maximum speed, performance, and cross-browser compatibility.
- Debug, test, and fix issues to ensure high-quality deliverables.
- Stay updated on the latest trends in front-end technologies and contribute to continuous improvement.

Required Skills & Qualifications
- 3-8 years of professional experience in front-end development.
- Strong proficiency in React.js, JavaScript (ES6+), HTML5, and CSS3.
- Experience with Redux, Context API, or other state management libraries.
- Good knowledge of RESTful API integration and asynchronous programming.
- Familiarity with TypeScript, Webpack, Babel, and version control systems such as Git.
- Knowledge of responsive design principles and cross-browser compatibility.
- Experience with unit testing (Jest, React Testing Library, etc.) is a plus.
- Strong problem-solving, debugging, and communication skills.
Role & Responsibilities

Job Summary:
We are seeking a skilled and motivated Data Engineer with hands-on experience in Azure Databricks, PySpark, and cloud-based data solutions. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions that support business decision-making and digital transformation initiatives.

Key Responsibilities:
- Design and develop robust, scalable data pipelines using PySpark on Azure Databricks.
- Implement ETL/ELT processes to ingest, transform, and load data from various sources into data lakes or data warehouses.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Optimize data workflows for performance, reliability, and cost efficiency.
- Ensure data quality, integrity, and governance across all data platforms.
- Monitor and troubleshoot data pipelines and workflows in production environments.
- Work with Azure services such as Data Lake Storage, Azure Synapse, Azure Data Factory, and Azure DevOps.
- Maintain documentation for data architecture, processes, and best practices.

Preferred Candidate Profile

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or big data development.
- Strong proficiency in PySpark and Apache Spark.
- Hands-on experience with Azure Databricks and other Azure data services.
- Proficiency in SQL and experience with relational and non-relational databases.
- Experience with CI/CD pipelines and version control (e.g., Git, Azure DevOps).
- Familiarity with data modelling, data warehousing, and performance tuning.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Experience with Delta Lake, MLflow, or Unity Catalog.
- Knowledge of data governance and security best practices in cloud environments.
- Certifications in Azure (e.g., Azure Data Engineer Associate) are a plus.
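The ETL/ELT work described above follows a common ingest → transform → load pattern. As a minimal, tool-agnostic sketch of those three stages (plain Python and sqlite3 standing in for PySpark and the warehouse; the feed and table names are hypothetical):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed standing in for a source-system extract.
RAW = """order_id,region,amount
1,APAC,120.50
2,EMEA,80.00
3,apac,45.25
"""

def etl(raw_text: str, conn: sqlite3.Connection) -> None:
    # Extract: parse the raw feed.
    rows = csv.DictReader(io.StringIO(raw_text))
    # Transform: cast types and normalize the region code.
    cleaned = [
        (int(r["order_id"]), r["region"].strip().upper(), float(r["amount"]))
        for r in rows
    ]
    # Load: write into the target table (sqlite stands in for the warehouse).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)

conn = sqlite3.connect(":memory:")
etl(RAW, conn)
total_apac = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'APAC'"
).fetchone()[0]
print(total_apac)  # 165.75
```

In a real Databricks pipeline the same stages would read from Data Lake Storage, transform with PySpark DataFrames, and write Delta tables, but the shape of the work is the same.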
About the Role
We are seeking a highly skilled and experienced Power BI Engineer to join our Data & Analytics team. You will play a pivotal role in designing, developing, and optimizing advanced Power BI reports and dashboards that empower business users with actionable insights. The ideal candidate will have deep expertise in Power BI, data modelling, and visualization, combined with a strong understanding of modern cloud data platforms, especially within the Azure ecosystem.

Key Responsibilities
- Design, develop, and maintain scalable, high-performing Power BI reports and dashboards based on the Global Data Warehouse and Lakehouse environment.
- Collaborate closely with data engineers, data scientists, and business stakeholders to translate business requirements into technical BI solutions.
- Optimize Power BI datasets leveraging Azure Synapse Analytics and Databricks-processed data to ensure efficient query performance.
- Develop and maintain robust data models, DAX calculations, and custom visualizations in Power BI to deliver actionable insights.
- Implement best practices for report design, data security (Row-Level Security), and governance to ensure compliance and data integrity.
- Troubleshoot, debug, and resolve performance and data quality issues in Power BI reports and datasets.
- Mentor and provide technical guidance to junior BI developers and analysts.
- Stay up to date with the latest Power BI features and Azure Synapse ecosystem enhancements to continuously improve BI solutions.
- Support end-user training and documentation to promote self-service BI adoption.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience in Power BI report development and data visualization.
- Strong proficiency in Power BI Desktop, Power Query (M), DAX, and Power BI Service.
- Hands-on experience with Azure Synapse Analytics, Azure Data Factory (ADF), and Databricks.
- Deep understanding of data warehousing concepts, dimensional modelling, and ETL/ELT processes.
- Experience optimizing performance of Power BI reports connected to large-scale data warehouses and lakehouses.
- Knowledge of security implementations within Power BI, including Row-Level Security (RLS) and workspace permissions.
- Strong SQL skills for data querying and debugging.
- Excellent problem-solving skills and ability to work effectively with cross-functional teams.
- Strong communication skills to engage with business users and technical teams.

Preferred Qualifications
- Microsoft Power BI Certification (e.g., DA-100 / PL-300).
- Experience with other Azure data services (Azure Data Lake Storage, Azure Synapse Pipelines).
- Familiarity with Python or Spark for data processing in Databricks.
- Exposure to Agile development methodologies and CI/CD pipelines for BI.

Education Qualification
BE / BTech / MCA
ITC Infotech is seeking a seasoned SAP FI Consultant with strong expertise in SAP S/4HANA Finance. The ideal candidate should have led at least 2 end-to-end S/4HANA implementations and have deep domain knowledge in financial accounting and reporting. This is a strategic role in delivering complex finance transformation programs for global clients.

Key Responsibilities

Implementation & Configuration:
- Lead and deliver end-to-end SAP S/4HANA Finance implementations.
- Configure core SAP FI modules: General Ledger, Accounts Payable, Accounts Receivable, Asset Accounting, and Bank Accounting.
- Integrate FI with other modules (CO, MM, SD) for seamless process flows.
- Develop and validate functional specifications for custom developments.

Business Process Alignment:
- Understand and document business requirements; conduct FIT-GAP analysis.
- Facilitate design workshops and create solution blueprints.
- Advise clients on best practices and S/4HANA innovations such as the Universal Journal and Fiori apps.

Testing & Delivery:
- Drive unit testing, system integration testing, and UAT.
- Support data migration, cutover activities, and post-go-live hypercare.
- Provide subject matter expertise during all phases of the project.

Required Skills & Qualifications
- Minimum 10 years of experience in SAP FI, with 3+ years in S/4HANA.
- Mandatory: at least 2 full lifecycle implementations in S/4HANA Finance.
- Deep knowledge of:
  - New GL, Parallel Ledgers, Document Splitting
  - Accounts Payable, Accounts Receivable, Asset Management
  - Bank Communication and Reconciliation
- Familiarity with SAP Activate methodology and Agile delivery frameworks.
- Skilled in writing functional specs and coordinating with ABAP teams.
- Excellent communication and client-facing skills.
We are looking for an experienced SAP PP/QM Consultant with strong exposure to S/4HANA implementation projects. The ideal candidate will have completed at least 2 full-cycle S/4HANA end-to-end implementations.

Key Responsibilities:
- Lead the design, configuration, and deployment of SAP PP and QM processes within S/4HANA.
- Conduct business requirement analysis, prepare functional specs, and design scalable solutions aligned with industry best practices.
- Participate in 2 or more full lifecycle implementations of S/4HANA: blueprinting, realization, testing, go-live, and support.
- Customize and configure SAP PP (Discrete, Process, and Repetitive Manufacturing) and QM (Inspection Planning, Quality Notifications, Quality Certificates, etc.).
- Integrate PP-QM with other modules such as MM, SD, WM, and EWM.
- Work collaboratively with cross-functional teams, developers, and business users.
- Perform unit testing and integration testing, and support UAT.
- Provide cutover planning, data migration support, and hypercare.
- Create end-user training documentation and deliver training sessions.
- Provide post-implementation support and system enhancements.
- Support continuous improvement initiatives in the supply chain and manufacturing domain.

Required Skills & Experience:
- 10+ years of experience in SAP PP and QM modules.
- Minimum of 2 full-cycle S/4HANA end-to-end implementations.
- Strong understanding of S/4HANA PP and QM functionality, Fiori apps, and simplifications.
We are urgently hiring a Power BI Developer for one of our clients (a large IT services company).

Key Responsibilities:

Power BI
- Design and develop interactive Power BI dashboards and reports for business decision-making.
- Build efficient data models and write DAX for advanced calculations and KPIs.
- Optimize report performance and maintain workspaces for scalability and usability.

Fabric Ecosystem
- Connect Power BI to Fabric data sources such as OneLake, Lakehouse, and Data Warehouse.
- Use Direct Lake and DirectQuery to enable high-performance reporting on large datasets.
- Navigate Fabric workspaces to access, organize, and consume enterprise data securely.

SQL
- Write complex SQL queries to prepare and transform data for reporting.
- Create and maintain database views to support analytics use cases.
- Optimize SQL code for performance and reliability in reporting scenarios.

Other Responsibilities
- Collaborate with business stakeholders to gather requirements and deliver actionable insights.
- Ensure data quality, governance, and security across Power BI and Fabric workspaces.
- Stay current with Power BI and Microsoft Fabric capabilities to continuously improve solutions.

Work Shift
This role requires working in the UK shift to align with business stakeholders and project needs.
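The SQL responsibilities above (complex queries plus database views that downstream reports consume) can be sketched with a minimal example. The table, column, and view names here are hypothetical, and sqlite3 stands in for the warehouse engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hypothetical fact table standing in for a warehouse sales table.
CREATE TABLE sales (sale_date TEXT, region TEXT, amount REAL);
INSERT INTO sales VALUES
  ('2024-01-05', 'North', 100.0),
  ('2024-01-20', 'North', 50.0),
  ('2024-02-03', 'South', 75.0);

-- A reporting view: monthly revenue by region, ready for a BI tool to consume.
CREATE VIEW monthly_revenue AS
SELECT substr(sale_date, 1, 7) AS month,
       region,
       SUM(amount)             AS revenue
FROM sales
GROUP BY month, region;
""")

rows = conn.execute(
    "SELECT month, region, revenue FROM monthly_revenue ORDER BY month, region"
).fetchall()
print(rows)  # [('2024-01', 'North', 150.0), ('2024-02', 'South', 75.0)]
```

Keeping the aggregation in a view like this means the Power BI dataset queries a stable, pre-shaped interface rather than raw tables, which is the usual reason views are listed as a responsibility alongside report work.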