
389 Aggregations Jobs - Page 2

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

ahmedabad, gujarat

On-site

Job Opening: MERN Stack Developer
Location: Ahmedabad, Gujarat (On-site)
Company: Mekanism Technologies
Experience: 3+ Years

Job Overview: We are looking for an experienced MERN Stack Developer with 3+ years of hands-on experience in developing scalable and high-performance web applications using MongoDB, Express.js, React.js, and Node.js. The ideal candidate should be passionate about full-stack development, have strong problem-solving skills, and a deep understanding of both front-end and back-end technologies. You will work closely with cross-functional teams to build, enhance, and maintain end-to-end web solutions.

Key Responsibilities:
- Design, develop, and maintain full-stack web applications using the MERN stack.
- Build reusable components and front-end libraries using React.js and JavaScript (ES6+).
- Develop robust back-end services and RESTful APIs using Node.js and Express.js.
- Manage and optimize database operations using MongoDB, including schema design and performance tuning.
- Ensure responsiveness, cross-browser compatibility, and mobile-friendly UI.
- Work collaboratively with designers, developers, and product managers in an Agile environment.
- Integrate third-party APIs and services as required.
- Perform code reviews, write clean and maintainable code, and follow best practices.
- Troubleshoot, debug, and enhance existing applications.
- Contribute to system architecture and technical documentation.
- Mentor junior team members and promote a culture of learning and growth.

Required Skills & Experience:
- Proficient in JavaScript (ES6+), TypeScript, HTML, and CSS.
- Strong hands-on experience with React.js, including hooks, state management, and component design.
- Experience in building RESTful APIs using Node.js and Express.js.
- Proficient in MongoDB (CRUD operations, aggregations, schema design).
- Familiarity with state management tools like Redux or Context API.
- Experience with Git version control and collaboration tools.
- Solid understanding of web performance optimization and security best practices.
- Comfortable using tools like VSCode, Postman, and browser dev tools.
- Familiar with OOP/OOD principles and MVC architecture.
- Knowledge of unit testing and testing libraries (e.g., Jest, Mocha, Chai).

Preferred Skills:
- Experience with Next.js for SSR and static site generation.
- Familiarity with GraphQL, WebSockets, or Socket.IO for real-time data.
- Exposure to CI/CD pipelines, Docker, or cloud services (AWS, Azure).
- Experience with modern build tools like Webpack, Babel, or Vite.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.
- Open to feedback and continuous self-improvement.
- Passionate about learning new technologies and sharing knowledge.
- Ability to work effectively under pressure and meet deadlines.

Fill out this Google form before we proceed with your application: https://forms.gle/irgNMPxFubQqM7Vz7

Job Type: Full-time
Work Location: In person
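The MongoDB aggregation skill this posting calls out boils down to grouping and reducing documents. Below is a minimal, hypothetical sketch (the orders and field names are invented): the pipeline is written as it would be passed to a driver's `aggregate()` call, and the same `$group`/`$sort` logic is then evaluated in plain Python so the example is self-contained.

```python
# Sample documents — hypothetical data for illustration only.
orders = [
    {"customer": "a", "amount": 50},
    {"customer": "b", "amount": 20},
    {"customer": "a", "amount": 30},
]

# Pipeline as it would be written for MongoDB (shown for reference; not
# executed here, since that would require a live database).
pipeline = [
    {"$group": {"_id": "$customer", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]

# Plain-Python equivalent of the $group + $sort stages above.
totals = {}
for doc in orders:
    totals[doc["customer"]] = totals.get(doc["customer"], 0) + doc["amount"]
result = sorted(
    ({"_id": k, "total": v} for k, v in totals.items()),
    key=lambda d: d["total"],
    reverse=True,
)
```

In a real MERN service the pipeline list would be passed to `collection.aggregate()` on the Node.js side; the Python loop only demonstrates what those stages compute.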

Posted 1 week ago

Apply

0 years

0 Lacs

coimbatore, tamil nadu, india

On-site

Job Description
We are looking for a BI Tester with strong expertise in report testing, SQL queries, and data validation. The ideal candidate will be responsible for validating data accuracy, ensuring report/dashboard functionality, and supporting BI projects with thorough testing processes.

Key Responsibilities
- Perform BI Testing to validate data accuracy, transformations, and reporting outputs.
- Conduct report and dashboard testing across BI tools (Power BI, Tableau, SSRS, or equivalent).
- Write and execute complex SQL queries to validate data between source systems, staging, and reporting layers.
- Perform data validation, reconciliation, and regression testing.
- Identify, analyze, and report defects; work closely with developers and business analysts to resolve issues.
- Validate ETL processes, data warehouse loads, and downstream reporting.
- Support functional, regression, performance, and user acceptance testing (UAT).
- Document test plans, test cases, and maintain test evidence as per standards.

Required Skills
- Strong knowledge of BI Testing and Reports Testing.
- Proficiency in SQL queries (joins, aggregations, subqueries, performance tuning).
- Experience in validating reports/dashboards in Power BI, Tableau, QlikView, SSRS, or similar BI tools.
- Hands-on experience with ETL/Data Warehouse testing.
- Strong understanding of data quality, reconciliation, and migration testing.
- Good communication and analytical skills.
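As a concrete illustration of the source-versus-report validation described above, here is a small self-contained sketch using SQLite in place of a real warehouse (table names and data are invented): the source layer is aggregated and joined to the reporting layer, and any mismatched totals surface as rows.

```python
import sqlite3

# Hypothetical reconciliation check between a "source" and a "reporting" layer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_sales (region TEXT, amount REAL)")
cur.execute("CREATE TABLE rpt_sales (region TEXT, total REAL)")
cur.executemany("INSERT INTO src_sales VALUES (?, ?)",
                [("north", 100.0), ("north", 50.0), ("south", 75.0)])
cur.executemany("INSERT INTO rpt_sales VALUES (?, ?)",
                [("north", 150.0), ("south", 75.0)])

# Aggregate the source and compare against the pre-aggregated report.
# An empty result set means the report reconciles with the source.
cur.execute("""
    SELECT s.region, s.total_src, r.total
    FROM (SELECT region, SUM(amount) AS total_src
          FROM src_sales GROUP BY region) s
    JOIN rpt_sales r ON r.region = s.region
    WHERE s.total_src <> r.total
""")
mismatches = cur.fetchall()
```

The same join-and-compare pattern carries over to ETL testing between staging tables and warehouse loads.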

Posted 1 week ago

Apply

3.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Role Description

Functional Support
- Perform detailed business impact analysis for campaign and customer data issues.
- Provide techno-functional support for transactional template management and campaign data segregation.
- Maintain and optimize data aggregations to support customer segmentation and targeting.

Incident Management
- Troubleshoot and resolve platform/application issues related to configuration and integrations.
- Collaborate with internal teams and external partners to ensure timely resolution of incidents.

Service Requests
- Execute configuration changes for new integrations, email management, and template creation.
- Support business teams with platform setup and customization requests.

Problem Management
- Conduct Root Cause Analysis (RCA) for recurring incidents and document findings.
- Recommend and implement preventive measures to reduce future occurrences.

Monitoring
- Monitor APIs, workflows, and data extracts to ensure platform performance and data accuracy.
- Proactively identify and address anomalies or performance issues.

Data Fixes
- Perform data corrections and compliance-related updates, including DSAR (Data Subject Access Requests).

Enhancements
- Support onboarding of new partners and connectors.
- Assist in the development and deployment of new dashboards and reporting tools.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in software support for customer data platforms such as Epsilon PeopleCloud.
- Strong understanding of customer data management, campaign operations, and platform configuration.
- Experience with incident and problem management tools (e.g., ServiceNow, Jira).
- Familiarity with API monitoring, workflow automation, and data extract processes.
- Excellent analytical, communication, and collaboration skills.

Mandatory Skills: Epsilon PeopleCloud

Posted 1 week ago

Apply

5.0 years

8 - 9 Lacs

hyderabad

On-site

JOB DESCRIPTION
Join us as we embark on a journey of collaboration and innovation, where your unique skills and talents will be valued and celebrated. Together we will create a brighter future and make a meaningful difference.

As a Lead Data Engineer at JPMorgan Chase within the CCB (Connected Commerce), you are an integral part of an agile team that works to enhance, build, and deliver data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As a core technical contributor, you are responsible for maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Architect and oversee the design of complex data solutions that meet diverse business needs and customer requirements.
- Guide the evolution of logical and physical data models to support emerging business use cases and technological advancements.
- Build and manage end-to-end cloud-native data pipelines in AWS, leveraging your hands-on expertise with AWS components.
- Build analytical systems from the ground up, providing architectural direction, translating business issues into specific requirements, and identifying appropriate data to support solutions.
- Work across the Service Delivery Lifecycle on engineering major/minor enhancements and ongoing maintenance of existing applications.
- Conduct feasibility studies, capacity planning, and process redesign/re-engineering of complex integration solutions.
- Help others build code to extract raw data, coach the team on techniques to validate its quality, and apply your deep data knowledge to ensure the correct data is ingested across the pipeline.
- Guide the development of data tools used to transform, manage, and access data, and advise the team on writing and validating code to test the storage and availability of data platforms for resilience.
- Oversee the implementation of performance monitoring protocols across data pipelines, coaching the team on building visualizations and aggregations to monitor pipeline health.
- Coach others on implementing solutions and self-healing processes that minimize points of failure across multiple product features.

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Extensive experience in managing the full lifecycle of data, from collection and storage to analysis and reporting.
- Proficiency in one or more large-scale data processing frameworks such as Java Spark, along with knowledge of data pipelines (DPL), data modeling, data warehousing, and data migration.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Proficient in coding in one or more modern programming languages.
- Good hands-on experience with AWS services and their components, along with a good understanding of Kubernetes.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
- Strong understanding of domain-driven design, microservices patterns, and architecture.
- Overall knowledge of the Software Development Life Cycle, along with experience with IBM MQ and Apache Kafka.
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, LLMs).

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies.
- Experience designing and building REST API services using Java.
- Exposure to cloud technologies; knowledge of hybrid cloud architectures is highly desirable.
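The pipeline-health monitoring mentioned in the responsibilities typically reduces to aggregating run outcomes per stage. A minimal sketch with invented run records (in production these would come from pipeline logs or a metrics store):

```python
from collections import defaultdict

# Hypothetical run records: one entry per pipeline stage execution.
runs = [
    {"stage": "ingest", "ok": True},
    {"stage": "ingest", "ok": False},
    {"stage": "transform", "ok": True},
    {"stage": "transform", "ok": True},
]

# Roll up to per-stage failure rates — the kind of aggregation a
# pipeline-health dashboard would visualize.
counts = defaultdict(lambda: [0, 0])  # stage -> [failures, total]
for r in runs:
    counts[r["stage"]][1] += 1
    if not r["ok"]:
        counts[r["stage"]][0] += 1

failure_rate = {stage: fails / total
                for stage, (fails, total) in counts.items()}
```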
ABOUT US
JPMorgan Chase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM
Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We're proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions — all while ranking first in customer satisfaction.

Posted 1 week ago

Apply

0 years

0 Lacs

chennai

On-site

Intern - Data Analyst
BOT VFX, Chennai, India, LBR Towers
http://botvfx.com

POSITION SUMMARY
This is an internship role that requires competence in analyzing and interpreting data to support business decision-making. The role emphasizes strong SQL expertise, critical thinking, and core data analysis skills, with opportunities to gain hands-on exposure to automation and AI/ML concepts. Interns will work closely with the BI and data teams to extract insights, automate workflows, and explore predictive analytics. Position is on-site in Chennai.

Required Skills:
- Basic SQL knowledge: query writing, joins, aggregations, and optimization.
- An analytical mindset with strong problem-solving and critical thinking.
- Basic exposure to Python for data analysis and automation.
- Awareness of AI/ML fundamentals (regression, classification, clustering, etc.).
- Familiarity with BI tools such as Tableau and Looker is a plus.

Project work includes:
- Extracting, cleaning, and analyzing datasets.
- Preparing dashboards and business reports.
- Identifying data trends, anomalies, and actionable insights.
- Supporting automation of reporting workflows.
- Learning and applying AI/ML basics in practical scenarios.

Responsibilities include:
- Work with MySQL to extract, clean, and analyze datasets.
- Support the creation of interactive dashboards and reports for stakeholders.
- Validate data accuracy and assist in identifying inconsistencies.
- Develop automation workflows using Python/Google Sheets/Excel VBA/Apps Script.
- Assist in exploratory data analysis and ML feature preparation.
- Maintain productivity while meeting deadlines and ensuring quality outputs.

Technical Skills:
- SQL (basics).
- Python or R (for analysis & automation).
- Google Sheets, Excel (intermediate).
- BI/visualization tools (Tableau, Looker).
- Basic ML libraries (Pandas, Scikit-learn, etc.) are a plus. These are not mandatory; familiarity or willingness to explore the above tools is valued.

Communication skills:
- Clear English written and verbal communication.
- Ability to present insights in a simple, structured manner.
- Team coordination and collaboration with cross-functional stakeholders.
- Ability to understand business requirements and translate them into data solutions.

Behavioral:
- Great team player, resourceful, and eager to learn.
- Innovative mindset with willingness to experiment.
- Comfortable with challenging tasks and ambiguity.
- Strong sense of ownership and accountability.
- Detail-oriented with commitment to accuracy.
- Ability to listen, adapt to feedback, and propose thoughtful alternatives.

ABOUT US
BOT VFX is a renowned visual effects services company serving clients globally. With nearly 600 team members, operations in Chennai, Coimbatore, Hyderabad, Pune, and Atlanta, and over a dozen years of operating experience, the privately held company has delighted clients with its blend of creative expertise, scale, and distinctive, quirky culture. It's also the winner of four FICCI BAF awards and has a wide list of fans from Los Angeles to London and Montreal to Wellington.
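A toy version of the extract-clean-analyze loop this internship describes, with hypothetical records: de-duplicate on a key, normalise a label column, then aggregate counts.

```python
# Invented raw rows with messy labels and a duplicate record.
raw_rows = [
    {"id": 1, "dept": " VFX "},
    {"id": 2, "dept": "vfx"},
    {"id": 2, "dept": "vfx"},   # duplicate primary key
    {"id": 3, "dept": "Roto"},
]

# Cleaning pass: drop duplicates on "id", trim and lowercase the label.
seen, cleaned = set(), []
for row in raw_rows:
    if row["id"] in seen:
        continue
    seen.add(row["id"])
    cleaned.append({**row, "dept": row["dept"].strip().lower()})

# Analysis pass: count records per department.
dept_counts = {}
for row in cleaned:
    dept_counts[row["dept"]] = dept_counts.get(row["dept"], 0) + 1
```

The same shape scales up: the cleaning pass becomes SQL or Pandas transformations, and the count becomes a `GROUP BY` feeding a dashboard.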

Posted 1 week ago

Apply

5.0 years

0 Lacs

india

Remote

Role: Kafka Developer
Experience Level: 5+ years
Location: Remote

Job Description:
We are seeking a Kafka Developer with a strong focus on real-time data streaming and distributed systems. This role requires expertise in developing and managing robust data pipelines, primarily using Python, and includes significant Kafka administration responsibilities. You will collaborate with cross-functional teams to integrate Kafka-based solutions into existing systems and ensure smooth data streaming and processing. Additionally, you will monitor and optimize Kafka clusters to ensure high availability, performance, and scalability. You will implement data pipelines and streaming processes that support business analytics and operational needs, while troubleshooting and resolving any issues that arise. Ideal candidates will have a strong foundation in Apache Kafka, real-time data streaming, and proficiency in Java, Scala, or Python, as well as a solid understanding of distributed systems and microservices architecture.

Key Responsibilities & Qualifications:
- Develop & Optimize Data Streaming Solutions: Design, develop, and maintain high-performance data pipelines and real-time streaming applications, primarily using Java with Apache Kafka, Kafka Streams, and Apache Flink. Experience with Python is a plus.
- Kafka Administration & Integration: Install, configure, monitor, and optimize Kafka clusters. This includes extensive experience with Kafka Connect and specifically the MongoDB Connector for seamless data integration.
- MongoDB Expertise: Design and optimize MongoDB schemas, develop complex queries, and manage MongoDB replica sets and sharded clusters for scalable data storage.
- Software Engineering Excellence: Apply strong principles of Object-Oriented Programming (OOP), Test-Driven Development (TDD), and proven software design patterns to deliver clean, maintainable, and scalable code.
- DevOps & Reliability: Utilize DevOps practices (CI/CD, Docker, Kubernetes) for automated deployments, and actively monitor and troubleshoot distributed systems to ensure high availability and performance.

Kafka Streaming Developer Role:
- Kafka Streams and event-processing experience.
- Experience using real-time stream processing frameworks like Apache Flink, Spark, or ksqlDB.
- Experience in stateful processing (e.g., windowing, aggregations).
- Experience with schema evolution and Avro/Protobuf.
- Experience with data transformation pipelines: real-time transformations, joins, and aggregations using the Kafka Streams API or Flink/Spark.
- Design, develop, and maintain real-time data streaming applications using Apache Kafka.
- Collaborate with cross-functional teams to integrate Kafka solutions into existing systems.
- Monitor and optimize Kafka clusters to ensure high availability and performance.
- Implement data pipelines and streaming processes to support business analytics and operations.
- Troubleshoot and resolve issues related to data streaming and processing.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience with Apache Kafka and real-time data streaming.
- Proficiency in programming languages such as Java, Scala, or Python.
- Familiarity with distributed systems and microservices architecture.
- Strong problem-solving skills and the ability to work collaboratively in a team environment.
- Understanding of SOA, object-oriented analysis and design, and client/server systems.
- Expert knowledge of REST, JSON, XML, SOAP, WSDL, RAML, YAML.
- Hands-on experience in large-scale SOA design, development, and deployment.
- Experience with API management technology.
- Experience working with continuous integration and continuous delivery (CI/CD) tools and processes.
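Stateful windowed aggregation — the Kafka Streams/Flink skill named above — can be sketched in plain Python with tumbling windows. The event data and the 60-second window size are invented; a real deployment would express this through the Kafka Streams API or Flink rather than a dict.

```python
# Tumbling-window count: events carry an epoch-second timestamp and are
# bucketed into fixed, non-overlapping 60-second windows per key.
WINDOW_SECONDS = 60

events = [
    {"ts": 5,   "key": "clicks"},
    {"ts": 42,  "key": "clicks"},
    {"ts": 61,  "key": "clicks"},
    {"ts": 130, "key": "views"},
]

windows = {}  # (window_start, key) -> count, the "state" in stateful processing
for e in events:
    window_start = (e["ts"] // WINDOW_SECONDS) * WINDOW_SECONDS
    k = (window_start, e["key"])
    windows[k] = windows.get(k, 0) + 1
```

In Kafka Streams the equivalent state lives in a windowed state store and is updated as records arrive, but the bucketing arithmetic is the same.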

Posted 1 week ago

Apply

0 years

0 Lacs

chennai, tamil nadu, india

On-site

BOT VFX, Chennai, India, LBR Towers
www.botvfx.com

Position Summary
This is an internship role that requires competence in analyzing and interpreting data to support business decision-making. The role emphasizes strong SQL expertise, critical thinking, and core data analysis skills, with opportunities to gain hands-on exposure to automation and AI/ML concepts. Interns will work closely with the BI and data teams to extract insights, automate workflows, and explore predictive analytics. Position is on-site in Chennai.

Required Skills
- Basic SQL knowledge: query writing, joins, aggregations, and optimization.
- An analytical mindset with strong problem-solving and critical thinking.
- Basic exposure to Python for data analysis and automation.
- Awareness of AI/ML fundamentals (regression, classification, clustering, etc.).
- Familiarity with BI tools such as Tableau and Looker is a plus.

Project Work Includes
- Extracting, cleaning, and analyzing datasets.
- Preparing dashboards and business reports.
- Identifying data trends, anomalies, and actionable insights.
- Supporting automation of reporting workflows.
- Learning and applying AI/ML basics in practical scenarios.

Responsibilities Include
- Work with MySQL to extract, clean, and analyze datasets.
- Support the creation of interactive dashboards and reports for stakeholders.
- Validate data accuracy and assist in identifying inconsistencies.
- Develop automation workflows using Python/Google Sheets/Excel VBA/Apps Script.
- Assist in exploratory data analysis and ML feature preparation.
- Maintain productivity while meeting deadlines and ensuring quality outputs.

Technical Skills
- SQL (basics).
- Python or R (for analysis & automation).
- Google Sheets, Excel (intermediate).
- BI/visualization tools (Tableau, Looker).
- Basic ML libraries (Pandas, Scikit-learn, etc.) are a plus. These are not mandatory; familiarity or willingness to explore the above tools is valued.

Communication Skills
- Clear English written and verbal communication.
- Ability to present insights in a simple, structured manner.
- Team coordination and collaboration with cross-functional stakeholders.
- Ability to understand business requirements and translate them into data solutions.

Behavioral
- Great team player, resourceful, and eager to learn.
- Innovative mindset with willingness to experiment.
- Comfortable with challenging tasks and ambiguity.
- Strong sense of ownership and accountability.
- Detail-oriented with commitment to accuracy.
- Ability to listen, adapt to feedback, and propose thoughtful alternatives.

About Us
BOT VFX is a renowned visual effects services company serving clients globally. With nearly 600 team members, operations in Chennai, Coimbatore, Hyderabad, Pune, and Atlanta, and over a dozen years of operating experience, the privately held company has delighted clients with its blend of creative expertise, scale, and distinctive, quirky culture. It's also the winner of four FICCI BAF awards and has a wide list of fans from Los Angeles to London and Montreal to Wellington.

Posted 1 week ago

Apply

3.0 years

0 Lacs

andhra pradesh, india

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
SAP Native HANA Developer - Technical Skills
- Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering).
- Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud.
- Demonstrated experience working with various SAP data sources (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP sources (Oracle, Salesforce, AWS).
- Demonstrated expertise in designing and implementing solutions utilizing the SAP BTP platform.
- Solid understanding of BTP HANA Cloud and its service offerings.
- Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (Business Application Studio) and other supporting data artifacts.
- Experience with HANA XS Advanced and HANA 2.0 versions.
- Ability to optimize queries and data models for performance in the SAP HANA development environment, with a sound understanding of indexing, partitioning, and other performance optimization techniques.
- Proven experience in applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA), and Cloud Foundry UPS services.
- Experience with ETL processes and tools (SAP Data Services preferred).
- Ability to debug and optimize existing queries and data models for performance.
- Hands-on experience using Git within Business Application Studio and familiarity with GitHub features and repository management.
- Familiarity with reporting tools and security-based concepts within the HANA development environment.
- Understanding of the HANA Transport Management System, HANA Transport Container, and CI/CD practices for object deployment.
- Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments.
- Familiarity with reporting tools like SAC/Power BI (building dashboards and consuming data models) is a plus.

HANA CDS views (added advantage):
- Understanding of associations, aggregations, and annotations in CDS views.
- Ability to design and implement data models using CDS.
- Certification in SAP HANA or related areas is a plus.
- Functional knowledge of SAP business processes (FI/CO, MM, SD, HR).
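An aggregation node in a calculation view or CDS view ultimately produces the same result as a grouped SQL query. The sketch below mimics that shape with SQLite and invented sales-item data; HANA-specific artifacts (CDS annotations, associations, hierarchies) are omitted, so this shows only the aggregation semantics, not HANA syntax.

```python
import sqlite3

# Hypothetical line-item table standing in for a HANA source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_items (doc_id TEXT, material TEXT, net_value REAL)")
conn.executemany("INSERT INTO sales_items VALUES (?, ?, ?)", [
    ("D1", "M-01", 100.0),
    ("D1", "M-02", 40.0),
    ("D2", "M-01", 60.0),
])

# Equivalent of an aggregation node: total net value per material.
rows = conn.execute("""
    SELECT material, SUM(net_value) AS total_net
    FROM sales_items
    GROUP BY material
    ORDER BY material
""").fetchall()
```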

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

mysore, karnataka, india

Remote

Enkefalos Technologies LLP believes in creating a supportive and inclusive environment where innovation thrives. Working with us means collaborating with industry experts who are passionate about AI and next-generation tech solutions. At Enkefalos, you'll find opportunities for career growth, continuous learning, and working on exciting projects that challenge you to push boundaries. If you're ready to embark on a rewarding career in AI and tech, explore our current job opening and become part of a team that's driving change through advanced GenAI solutions. Together, we can shape the future of industries worldwide.

Lead Data Engineer
Location: Remote / Mysore
Joining: Immediate
Experience: 8-10 years

Responsibilities:
We are seeking a highly skilled Lead Data Engineer to design, build, and optimize scalable data pipelines and platforms. You will implement all cleansing, transformation, and semantic modeling logic on Databricks using PySpark, targeting financial facts and dimensions from SAP manual dumps. This role involves leading a team of data engineers, collaborating with cross-functional teams, and ensuring best practices in data architecture, governance, and performance.

Requirements:
- Databricks, ADF, ADLS, PySpark, Azure Synapse, data engineering on Azure Cloud.
- RDDs, DataFrames, performance tuning.
- Building gold-layer data models for financial reporting.
- Experience with complex joins, aggregations, GL hierarchies.
- Version handling (Actuals vs Budget), currency conversions.
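The "Actuals vs Budget" and currency-conversion requirements amount to conforming amounts to one currency and then pivoting totals by version. A plain-Python sketch with an invented FX table and GL account; a real implementation would express the same logic as PySpark DataFrame transformations on Databricks.

```python
# Hypothetical FX rates and financial records (version ACT = actuals,
# BUD = budget), standing in for rows from an SAP dump.
fx_to_usd = {"EUR": 1.1, "USD": 1.0}

records = [
    {"gl_account": "4000", "version": "ACT", "amount": 100.0, "currency": "EUR"},
    {"gl_account": "4000", "version": "BUD", "amount": 120.0, "currency": "USD"},
]

# Convert to a common currency, then accumulate per GL account and version.
by_account = {}
for r in records:
    usd = r["amount"] * fx_to_usd[r["currency"]]
    acct = by_account.setdefault(r["gl_account"], {"ACT": 0.0, "BUD": 0.0})
    acct[r["version"]] += usd

# Actuals-vs-Budget variance per GL account.
variance = {a: v["ACT"] - v["BUD"] for a, v in by_account.items()}
```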

Posted 1 week ago

Apply

0 years

0 Lacs

chennai, tamil nadu, india

On-site

About The Role
We are looking for a Data Analyst Intern who is passionate about working with data, solving problems, and learning how insights can drive business impact. This role requires strong SQL skills, critical thinking, and core data analysis knowledge, with opportunities to gain hands-on exposure to AI/ML concepts and automation in analytics.

Key Responsibilities

Data Analysis & Reporting
- Work with SQL to extract, clean, and analyze datasets.
- Assist in preparing dashboards and reports for business teams.
- Explore data trends and highlight insights or anomalies.

Problem Solving & Critical Thinking
- Understand business requirements and support data-driven decision-making.
- Validate data accuracy and assist in identifying root causes of inconsistencies.
- Present findings in a clear and structured way.

Automation & Efficiency
- Support the creation of automated workflows for reporting.
- Learn and apply scripting (Python/Google Sheets/Excel VBA/Apps Script) to reduce manual tasks.

AI/ML Exposure
- Learn the basics of machine learning concepts (regression, classification, clustering).
- Assist in exploratory data analysis and feature preparation for ML models.
- Collaborate with Data Scientists/Analysts to integrate predictive insights into dashboards/reports.

What We're Looking For
- Strong knowledge of SQL (query writing, joins, aggregations, optimization).
- Good understanding of data analysis concepts (data cleaning, visualization, statistics basics).
- An analytical mindset with strong critical thinking and problem-solving ability.
- Basic exposure to Python/R for data analysis and automation.
- Awareness of AI/ML fundamentals (even academic-level is fine).
- Eagerness to learn, collaborate, and take ownership of projects.

Preferred Qualifications
- Currently pursuing or recently completed a degree in Data Science, Computer Science, AI/ML, Statistics, Engineering, or related fields.
- Hands-on projects or coursework in data analytics/ML/automation will be an advantage.
- Familiarity with BI tools (Tableau, Looker, etc.) is a plus.

What You'll Gain
- Hands-on experience in real-world data analytics projects.
- Exposure to end-to-end analytics workflows, including automation.
- Opportunity to convert into a full-time role based on performance.
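For the regression fundamental listed under AI/ML exposure, a least-squares line fit can be computed from first principles. The toy data is chosen to lie exactly on y = 2x so the fit is exact; real analysis would use NumPy or Scikit-learn instead of a hand-rolled loop.

```python
# Toy data on the line y = 2x, so the fitted slope should be 2 and the
# intercept 0.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

# Ordinary least squares for a single feature:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
intercept = mean_y - slope * mean_x
```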

Posted 1 week ago

Apply

0 years

0 Lacs

noida, uttar pradesh, india

On-site

About Birlasoft: Birlasoft is a global technology company enabling “next-generation” digital transformation through expertise in Cloud, AI, Data, and enterprise solutions. Combining industry proficiency with advanced digital capabilities, it helps businesses accelerate change with speed, scale, and purpose, delivering “future-ready” solutions that enhance agility, resilience, and customer experience. Part of the CKA Birla Group and led by Chairman Mrs. Amita Birla, Birlasoft’s nearly 12,000 professionals drive innovation while building a diverse, inclusive, and learning-oriented culture. With a strong focus on sustainability and long-term value creation, Birlasoft transforms enterprises and communities, earning its reputation as a trusted partner and one of the best places to work About the Job : We're seeking a Software Engineering Operations Analyst to drive insights and operational excellence using Azure DevOps, Agile practices, and Power BI dashboards. This role demands strong analytical skills, dashboard automation, and a proactive mindset to support engineering leadership. Title: Lead Consultant (Power BI) Job Location: Noida Educational Background: Bachelor’s degree in computer science, Management Information Systems, Mathematics or related field is strongly preferred. Key Responsibilities: Design and automate Power BI dashboards using data from Azure DevOps (Boards, Repos, Pipelines) to track engineering KPIs like velocity, lead time, and DORA metrics. Ensure ADO data quality by maintaining queries, workflows, and running regular audits to improve consistency across teams. Drive Agile operational cadence by managing sprint/release calendars, backlog health reviews, and quarterly planning processes. Deliver executive reporting through weekly/monthly readouts and ad-hoc analysis, translating complex data into actionable insights. Support tooling administration for ADO and Power BI, including license management and light vendor coordination. 
Identify process bottlenecks through data analysis and lead continuous improvement initiatives. Coordinate compliance activities such as audit readiness, change logs, and security reporting tied to software releases. Skills Required: Azure DevOps (ADO): Advanced proficiency in Boards, Queries, Repos, Pipelines, and Analytics. Power BI: Strong skills in data modeling, Power Query, DAX, and dashboard automation. Agile Methodologies: Solid understanding of Agile, Scrum, and SAFe practices; experience with sprint planning and release management. Dashboarding & Reporting: Ability to design executive-level dashboards and scorecards; deliver clear data-driven insights. Data Analysis: Proficiency in Excel (Pivot Tables, Power Query, XLOOKUP), SQL for joins/aggregations, and API integration (ADO REST). Operational Excellence: Experience in managing cadence activities like backlog reviews, planning cycles, and compliance tracking. Communication: Strong storytelling with data; capable of creating concise executive summaries and documentation. Mindset: Ownership-driven, analytical, and solution-oriented with a “figure it out” attitude.
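The “SQL for joins/aggregations” skill this posting screens for usually comes down to patterns like the one below — a minimal, self-contained sketch using Python's bundled sqlite3, with table and column names invented here (loosely modeled on an ADO Boards export) rather than taken from any real system:

```python
import sqlite3

# Illustrative work-item data; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE teams (team_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE work_items (
    item_id INTEGER PRIMARY KEY,
    team_id INTEGER REFERENCES teams(team_id),
    story_points INTEGER
);
INSERT INTO teams VALUES (1, 'Platform'), (2, 'Mobile');
INSERT INTO work_items VALUES
    (10, 1, 5), (11, 1, 3), (12, 2, 8), (13, 2, 2), (14, 2, 3);
""")

# Join the two tables, then aggregate item counts and story points per team.
rows = conn.execute("""
    SELECT t.name, COUNT(*) AS items, SUM(w.story_points) AS points
    FROM work_items AS w
    JOIN teams AS t ON t.team_id = w.team_id
    GROUP BY t.name
    ORDER BY t.name
""").fetchall()
print(rows)  # [('Mobile', 3, 13), ('Platform', 2, 8)]
```

The same join-then-GROUP BY shape is what typically feeds a velocity tile in a Power BI model, regardless of the underlying database.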

Posted 2 weeks ago

Apply

0 years

0 Lacs

noida, uttar pradesh, india

On-site

Role Overview: We are looking for a Sr. Associate/Asst Manager with strong skills in SQL and Dashboard creation to join our Last Mile team. You will be responsible for designing and building dashboards that provide actionable insights to support business decisions. The ideal candidate is analytical, detail-oriented, and passionate about turning raw data into clear visual stories. Key Responsibilities: Develop, maintain, and optimize dashboards using business intelligence tools (e.g., Tableau, Power BI, Looker). Write efficient SQL queries to extract and analyze data from relational databases. Use Python for data cleaning, transformation, and basic automation tasks. Work with cross-functional teams to understand data requirements and deliver meaningful visualizations. Ensure accuracy, consistency, and quality of data presented in reports. Identify trends, anomalies, and opportunities from data and communicate findings effectively. Required Skills: Proficient in SQL (joins, aggregations, window functions). Experience in Python for data processing (Pandas, NumPy). Hands-on experience with dashboarding tools (e.g., Quicksight, Tableau, Power BI, Google Data Studio). Strong data visualization and storytelling skills. Ability to interpret business needs and translate them into technical solutions. Basic understanding of statistics or business metrics is a plus. Bachelor’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field.
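The window-function item on this posting's SQL checklist is often tested with a running total or per-group rank; a small sketch of the running-total case using Python's built-in sqlite3 (requires SQLite 3.25+ for window functions; the delivery data here is invented for illustration):

```python
import sqlite3

# Hypothetical last-mile delivery counts per hub per day.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE deliveries (day TEXT, hub TEXT, orders INTEGER);
INSERT INTO deliveries VALUES
    ('2024-01-01', 'North', 120), ('2024-01-02', 'North', 90),
    ('2024-01-01', 'South', 70),  ('2024-01-02', 'South', 110);
""")

# Window function: running total of orders within each hub, ordered by day.
rows = conn.execute("""
    SELECT hub, day, orders,
           SUM(orders) OVER (PARTITION BY hub ORDER BY day) AS running_orders
    FROM deliveries
    ORDER BY hub, day
""").fetchall()
for row in rows:
    print(row)
```

Unlike a plain GROUP BY, the window aggregate keeps every input row while attaching the cumulative figure — the shape a trend chart on a dashboard usually wants.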

Posted 2 weeks ago

Apply

4.0 years

7 - 9 Lacs

hyderābād

On-site

Job Description: Job Title: Tableau Developer Location: Hyderabad Experience: 4 years Notice Period: Immediate to 15 days Budget: Negotiable based on experience Job Description: We are looking for a talented Tableau Developer to join our team in Hyderabad. As a Tableau Developer, you will be responsible for designing and developing high-quality dashboards and reports to help drive business decisions. Your ability to analyze data and create visualizations that tell a compelling story will be crucial to the success of the team. If you're passionate about data analytics and have hands-on experience with Tableau, we would love to hear from you! Key Responsibilities: Develop and maintain interactive Tableau dashboards and reports. Transform raw data into actionable insights for business users. Work closely with business stakeholders to understand reporting requirements. Integrate data from various sources (e.g., SQL, Excel, web services) into Tableau. Perform data cleaning, data transformations, and aggregations for accurate analysis. Ensure the accuracy of the data and reports by conducting thorough testing. Optimize Tableau workbooks and data models for performance and scalability. Collaborate with IT teams to ensure data quality and access management. Provide Tableau training and support to end users. Stay updated on new Tableau features and best practices to improve development processes. Required Skills and Qualifications: 4 years of experience in Tableau Development. Strong proficiency in designing, developing, and implementing Tableau dashboards. Hands-on experience with data extraction, transformation, and loading (ETL). Familiarity with SQL and relational databases for creating data connections. Good understanding of data visualization principles and best practices. Experience working with large datasets and performance optimization techniques. Knowledge of Tableau Server or Tableau Online. Strong problem-solving skills and attention to detail. 
Excellent communication skills and ability to work with cross-functional teams. Technical Skills: Tableau | SQL | Data Transformation | Data Visualization | ETL | Tableau Server | Data Modeling | Dashboard Design | Data Analysis | Python | Cloud-Based Solutions | Agile Methodologies Job Type: Full-time Pay: ₹750,000.00 - ₹950,000.00 per year Work Location: In person
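The “data cleaning, data transformations, and aggregations” step that feeds a Tableau dashboard can be sketched in plain Python — the CSV layout below is invented for illustration, with messy casing and a missing value to normalize before aggregating:

```python
import csv
import io
from collections import defaultdict

# Invented raw extract: inconsistent casing/whitespace and one blank amount.
raw = """region,amount
 east ,100
EAST,250
west,
West,80
"""

totals = defaultdict(int)
for row in csv.DictReader(io.StringIO(raw)):
    region = row["region"].strip().title()   # normalize casing and whitespace
    amount = row["amount"].strip()
    if not amount:                           # drop rows with missing values
        continue
    totals[region] += int(amount)

print(dict(totals))  # {'East': 350, 'West': 80}
```

In practice the same cleanup would usually live in an ETL layer or a Tableau data-source calculation, but the logic — normalize, filter, aggregate — is identical.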

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

pune, maharashtra, india

On-site

Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer. In this role, you will: Work with a high-performance decisioning data store for aggregations, custom views and orchestration of data on cloud. Be responsible for automating the continuous integration / continuous delivery pipeline within a DevOps Product/Service team, driving a culture of continuous improvement. Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered. Collaborate with central teams (architecture, security, engineering, networks). Implement DevOps / automation on GCP according to system requirements. Resolve technical and process blockers for cloud migration/adoption. Ensure deployments, patch management and repavement of infra are all under control. Maintain SLAs, troubleshoot the workflow and ensure the system is up and running at all times. Requirements To be successful in this role, you should meet the following requirements: Minimum of 10 years of experience. Should have good experience in the production support domain and be flexible enough to work on production issues, even at odd hours. Should be capable of taking on leadership roles such as ITSO. 
Solid cloud knowledge, especially of a distributed tech stack: Java, Unix, Windows, SQL/Oracle. Experience working on large-scale, complex global programs. Coding experience in GCP programming (Python preferred) and BigQuery SQL. Good knowledge of containers, Docker, Kubernetes (preferred). Knowledge of DevOps processes and automation (e.g. Jenkins pipelines). Knowledge of GCP services such as BigQuery, Bigtable, Dataproc, Dataflow, Pub/Sub preferred. Experience with the ITIL framework (incident, problem, change management know-how). Capable of working in a team located across multiple countries/regions. Willingness to adapt and learn new things. Takes ownership of tasks. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 2 weeks ago

Apply

2.0 years

1 - 2 Lacs

punjab, india

On-site

Primary Job Title: MIS Analyst About The Opportunity A growing technology & services player in the cloud communications / SaaS-enabled business-process sector, delivering data-driven operations and customer engagement solutions. We power operational decision-making through timely, accurate management information—helping teams scale performance and reduce cost. This on-site role is based in India and reports to the Operations / Finance leadership team. Role & Responsibilities Produce and maintain daily, weekly and monthly MIS reports and executive dashboards that track operational KPIs, SLA metrics and financial reconciliations. Extract, transform and consolidate data from ERP/CRM/telephony and internal systems using SQL, Excel and reporting tools to ensure single source of truth. Design automated reporting workflows (Excel macros/VBA, Power BI scheduled refreshes or lightweight ETL) to reduce manual effort and improve delivery cadence. Perform variance analysis and root-cause investigations; highlight trends, risks and opportunities with actionable insights for stakeholders. Maintain data quality: run reconciliations, identify discrepancies, correct issues and document data lineage and assumptions. Collaborate cross-functionally with Operations, Finance and Sales to refine KPIs, agree SLAs and deliver ad-hoc analysis for strategic decision-making. Skills & Qualifications Must-Have Bachelor’s degree in Commerce, Statistics, IT, Engineering or related field; 2+ years in MIS/reporting or data-operations roles. Advanced Excel skills (PivotTables, XLOOKUP/VLOOKUP, complex formulas) and experience building macros/VBA to automate tasks. Proficient in SQL for data extraction, joins, aggregations and basic performance tuning. Hands-on experience with dashboarding tools (Power BI or Tableau) or strong Excel dashboard experience. Strong attention to detail, analytical mindset and ability to translate data into clear business recommendations. 
Preferred Exposure to ERP/CRM/telephony platforms (e.g., SAP, Oracle, Zoho, Genesys) and familiarity with operational metrics for contact centers. Basic Python or scripting experience for data processing, and prior experience implementing simple ETL/automation pipelines. Prior experience working on-site in a fast-paced operations environment and comfortable collaborating with multiple stakeholders. Benefits & Culture Highlights On-site role with clear career path into Operations Analytics, Finance or Data Engineering streams. Hands-on ownership of critical dashboards that influence business decisions and measurable impact on SLAs and revenue. Collaborative, outcomes-driven culture with regular upskilling and cross-functional exposure. To apply, bring strong reporting discipline, proven automation skills and a bias for actionable insights. This role is ideal for a detail-oriented MIS professional who enjoys converting raw data into decision-ready intelligence in an on-site Indian operations setting. Skills: data,operations,reporting,excel,finance,automation

Posted 2 weeks ago

Apply

2.0 years

1 - 2 Lacs

jalandhar, punjab, india

On-site

Primary Job Title: MIS Analyst About The Opportunity A growing technology & services player in the cloud communications / SaaS-enabled business-process sector, delivering data-driven operations and customer engagement solutions. We power operational decision-making through timely, accurate management information—helping teams scale performance and reduce cost. This on-site role is based in India and reports to the Operations / Finance leadership team. Role & Responsibilities Produce and maintain daily, weekly and monthly MIS reports and executive dashboards that track operational KPIs, SLA metrics and financial reconciliations. Extract, transform and consolidate data from ERP/CRM/telephony and internal systems using SQL, Excel and reporting tools to ensure single source of truth. Design automated reporting workflows (Excel macros/VBA, Power BI scheduled refreshes or lightweight ETL) to reduce manual effort and improve delivery cadence. Perform variance analysis and root-cause investigations; highlight trends, risks and opportunities with actionable insights for stakeholders. Maintain data quality: run reconciliations, identify discrepancies, correct issues and document data lineage and assumptions. Collaborate cross-functionally with Operations, Finance and Sales to refine KPIs, agree SLAs and deliver ad-hoc analysis for strategic decision-making. Skills & Qualifications Must-Have Bachelor’s degree in Commerce, Statistics, IT, Engineering or related field; 2+ years in MIS/reporting or data-operations roles. Advanced Excel skills (PivotTables, XLOOKUP/VLOOKUP, complex formulas) and experience building macros/VBA to automate tasks. Proficient in SQL for data extraction, joins, aggregations and basic performance tuning. Hands-on experience with dashboarding tools (Power BI or Tableau) or strong Excel dashboard experience. Strong attention to detail, analytical mindset and ability to translate data into clear business recommendations. 
Preferred Exposure to ERP/CRM/telephony platforms (e.g., SAP, Oracle, Zoho, Genesys) and familiarity with operational metrics for contact centers. Basic Python or scripting experience for data processing, and prior experience implementing simple ETL/automation pipelines. Prior experience working on-site in a fast-paced operations environment and comfortable collaborating with multiple stakeholders. Benefits & Culture Highlights On-site role with clear career path into Operations Analytics, Finance or Data Engineering streams. Hands-on ownership of critical dashboards that influence business decisions and measurable impact on SLAs and revenue. Collaborative, outcomes-driven culture with regular upskilling and cross-functional exposure. To apply, bring strong reporting discipline, proven automation skills and a bias for actionable insights. This role is ideal for a detail-oriented MIS professional who enjoys converting raw data into decision-ready intelligence in an on-site Indian operations setting. Skills: data,operations,reporting,excel,finance,automation

Posted 2 weeks ago

Apply

14.0 - 16.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Position : Project Manager (Scrum Master) Experience range: 14-16 years Location: Noida/Bengaluru/Pune What you bring Experience in engineering operations, PMO/portfolio analytics, BI, or similar. Azure DevOps: Advanced Boards/Queries/Analytics; comfortable with repos/pipelines concepts. Power BI: Strong data modeling, Power Query, and DAX; ability to automate refresh and publish. Excel power user: Pivot tables, Power Query, functions (e.g., XLOOKUP), data cleansing. Data skills: SQL for joins/aggregations; ability to connect APIs (ADO REST) is a plus. Delivery savvy: Understanding of Agile/Scrum/SAFe practices and CI/CD fundamentals. Communication: Clear storytelling with data; concise executive summaries. Mindset: Ownership, speed, and a practical “figure it out” attitude. Nice to have Scripting (Python/R) for data wrangling or automation. Experience with Confluence/SharePoint, ServiceNow/Smartsheet, or OKR tooling. Background in enterprise software or product engineering environments.

Posted 2 weeks ago

Apply

0 years

0 Lacs

india

Remote

📊 Data Analyst Internship | Remote | Real-World Projects Do you love working with data , writing SQL queries , and uncovering insights that drive smart decisions? Skillfied Mentor is offering a remote Data Analyst Internship designed to give you hands-on, industry-level experience while building a standout portfolio . 📌 About the Role Position: Data Analyst Intern Location: 🌍 Remote (Work from Anywhere) Duration: ⏳ Flexible (minimum commitment required) Stipend: 💼 Performance-based (top performers rewarded) Deadline to Apply: 📅 1st September 2025 🔑 Your Responsibilities Work on real-world datasets across industries Write & optimize SQL queries for data extraction & transformation Perform data analysis to identify trends & insights Build dashboards/reports to communicate findings Collaborate with mentors & peers in a remote team ✅ Who We’re Looking For Knowledge of SQL (joins, aggregations, subqueries) Understanding of relational databases & data structures Strong interest in analytics, BI & data storytelling Bonus: Familiarity with Excel / Power BI / Tableau Self-motivated , detail-oriented , eager to learn 🎯 What You’ll Gain Practical experience with live datasets Mentorship & guidance from professionals Flexible working hours to match your pace Performance rewards + stipend opportunities Internship certificate + LOR for achievers A strong project portfolio to impress employers 📬 Apply Now Don’t miss this opportunity to boost your career in data analytics . 📅 Last date: 1st September 2025 – Apply today and start your journey!
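For applicants brushing up, the internship's SQL checklist (joins, aggregations, subqueries) can be rehearsed in a few lines with Python's bundled sqlite3 — the sales dataset below is invented purely for practice:

```python
import sqlite3

# Invented practice data: per-region sale amounts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount INTEGER);
INSERT INTO sales VALUES
    ('East', 100), ('East', 300), ('West', 50), ('West', 70), ('North', 400);
""")

# Aggregation + subquery: regions whose total sales exceed the average
# of the per-region totals.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > (
        SELECT AVG(region_total) FROM (
            SELECT SUM(amount) AS region_total FROM sales GROUP BY region
        )
    )
    ORDER BY total DESC, region
""").fetchall()
print(rows)  # [('East', 400), ('North', 400)]
```

Here the per-region totals are 400, 120 and 400, so only East and North beat the ~306.7 average — the kind of reasoning a screening exercise checks alongside the syntax.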

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

india

Remote

Senior RPA Developer – UiPath 📍 Location: Bangalore - Remote About the Role We are seeking an experienced Senior RPA Developer to design and implement enterprise-grade automation solutions using UiPath. The ideal candidate will have strong expertise in RPA frameworks, intelligent document processing, and database integrations, with the ability to deliver scalable and robust solutions. Key Responsibilities Design, develop, test, and deploy RPA solutions using UiPath Studio, Orchestrator, and REFramework . Implement UiPath Document Understanding with OCR-based extraction for structured, semi-structured, and unstructured documents. Build and integrate machine learning models via UiPath AI Center into automation workflows. Develop and optimize SQL queries, stored procedures, and database-driven automations . Configure and manage Orchestrator assets, queues, triggers, and robots. Create reusable components and templates for scalable automation. Implement error handling, logging, and exception management strategies. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical designs. Provide post-deployment support, performance monitoring, and workflow enhancements. Required Skills & Qualifications Bachelor’s degree in Computer Science, IT, Engineering, or related field ( Master’s degree preferred ). 4+ years of software development experience with focus on RPA, automation, or scripting. Strong expertise in UiPath Studio, REFramework, and Orchestrator . Proficient in UiPath Document Understanding (Document Manager, Taxonomy Manager, Data Extraction, Validation Station). Experience with OCR tools (OmniPage, Tesseract, Google Vision OCR, etc.). Solid knowledge of UiPath AI Center (data labeling, model training, ML skill deployment). Proficiency in SQL : complex queries, joins, aggregations, schema design. Strong understanding of API integrations (REST/SOAP, OAuth, API keys). 
Familiarity with Git and best practices in scalable RPA design. Ability to prepare and maintain technical documentation (SDDs, PDDs, user manuals). Strong problem-solving, debugging, and analytical skills. Excellent communication and collaboration skills, with Agile/Scrum experience. Why Join Us? Opportunity to work on cutting-edge automation projects . Collaborative and growth-oriented work culture. Exposure to enterprise clients and advanced RPA solutions. 👉 If you are passionate about automation and want to drive digital transformation, apply now!

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

mumbai, maharashtra, india

On-site

About Citi Research: Citi is a respected global leader in Investment Research and Analysis, with more than 300 analysts who follow more than 3,200 companies in dozens of industry groups and with securities trading in over 50 countries. We publish thousands of research reports annually on subjects ranging from promising new biotechnology discoveries to the dynamics of paper pricing in the global marketplace. Citi’s equities business is one of the largest and most established, both regionally and globally. Our clients include the world’s leading institutional investors and high net worth individuals, and we provide these clients with value-added, independent, insightful and actionable investment advice. The Global Research Center (GRC) of Citi Research in Mumbai is a full research office working with Equities Research, Commodities Research, Rates Strategy and Equity Quant Research. It also handles various critical Research Operations functions such as Supervisory Analyst, Research Distribution, Modeling and Risk Management. Job Summary: Citi’s US Equity Strategy team is seeking a Mumbai-based Senior Associate. The primary role will be to assist the team in data aggregation and analysis in support of its broad US equity focus, with a related emphasis on its ETF research offering. This is a unique opportunity to work with one of Wall Street’s leading US Equity Strategy teams and develop significant expertise in the equity strategy discipline while making an important contribution to our product. About The Role: The Senior Associate is expected to collect, outline and explain information, giving insights that drive creative solutions in the macro environment in which our clients’ businesses operate. Their insights provide the clarity the client needs to make investment decisions. 
Interact with internal and external clients as requested by Sales & Trading; address client requests. Stay informed on developments that have a bearing on the macro environment. Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Responsibilities- Maintain an ETF database, assist in updating a proprietary classification scheme, assist in analysis of ETF holdings, run a monthly ETF flows report, and support ongoing and related topical ETF research publications. Manage periodic reports such as the weekly PULSE, quarterly Sector and Industry Group Navigator (SIGN) and Size/Style Chartbook publications. Assist in building unique data sets which will serve as a cornerstone for ongoing US Equity Strategy reports. Here, our equity strategy publications are frequently in response to ongoing developments within the global macro environment. The associate will assist in building bottom-up aggregations of S&P 500 and SMID companies, research and assess relationships with a variety of macro inputs and contribute to other ad hoc requests. Skills: Strong understanding of financial concepts and financial statements. Strong financial markets orientation required. Requires good analytical skills to filter, prioritize and validate potentially complex and dynamic material from multiple sources. Good data management and analysis skills and a high degree of comfort with Excel; should be comfortable in maintaining and updating existing models and databases. Good knowledge of external databases such as Bloomberg, Factset, etc. The role requires familiarity and/or expertise in programming languages (e.g. 
Python, VBA) with an ability to adapt to different data resources, such as Bloomberg, Factset, and Haver. The presumption is that the associate can quickly learn and work with Bloomberg Query Language. Relatedly, an ability to analyze, assess, and manipulate data to identify ultimate investment implications is an ongoing expectation, and feature, of this role. Clear and effective communications, both written and spoken. The associate will also assist in writing research drafts, responding to client inquiries as well as working with various internal constituencies. Strong interpersonal skills for collaborating with overseas colleagues (significant interactions with US based team members). Ability to work under pressure, with minimal supervision, and in a team environment. The ideal candidate will be highly detail-oriented and will possess strong organizational skills. The candidate should be a pragmatic "self-starter" who can work both independently and as part of a team. Experience: 5 years of Equity Research experience, including at least 2 years in an equity strategy or equity quant role. Prior experience of US Markets; exposure to ETF research. Education: Required- Bachelor’s degree in Math/Statistics/Engineering/Economics Preferred - MBA Finance from a premier academic institution Recommended - CFA certification ------------------------------------------------------ Job Family Group: Research ------------------------------------------------------ Job Family: Research Analysis ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. 
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

2.0 - 3.0 years

0 - 0 Lacs

lucknow

On-site

Business Analytics Trainer (Excel, SQL, Power BI, Tableau) – Freelance/Part-Time Location: Lucknow (Onsite, Classroom Training) Engagement: 15 Weeks | 3 Hours/Week (12 PM – 1 PM) | Thu, Fri, Sat Compensation: ₹800 – ₹1000 per hour (based on interview & experience) About the Role We are hiring a Freelance Business Analytics Trainer to deliver a structured MBA-level Business Analytics program. The course runs for 15 weeks, with 1-hour onsite sessions three days a week (Thu–Sat) in Lucknow. The trainer will cover a wide range of topics including Excel, SQL, Power BI, Tableau, and Predictive Analytics with AI & Cloud trends. This role is ideal for professionals with industry or teaching experience who want to mentor management students in both academic frameworks and real-world applications. Key Responsibilities Conduct structured and engaging classroom sessions on: Excel for Business Analytics – formulas, dashboards, problem-solving SQL for Analytics – queries, joins, aggregations, advanced SQL Data Visualization (Power BI & Tableau) – dashboards, interactivity, storytelling Predictive Analytics & AI/Cloud – intro to predictive models, no-code AI, cloud trends Guide students through hands-on projects, dashboards, and case studies. Mentor students on applying analytics to solve real-world business problems. Support in capstone project execution and presentation. Requirements 2–3 years of relevant teaching/training or industry experience in Business/Data Analytics. Proficiency in Excel, SQL, Power BI, Tableau and familiarity with predictive analytics/AI concepts. Strong communication and mentoring skills to engage MBA students. Must be available onsite in Lucknow three days a week (12–1 PM). What We Offer Competitive pay: ₹800 – ₹1000 per hour. Short-term yet impactful engagement (15 weeks). Opportunity to mentor MBA students and shape future business analysts. Blend of academic teaching and industry application. 
Skills & Keywords (for matching) Business Analytics Trainer, Business Analytics Faculty, Business Analytics Instructor, Data Analytics Trainer, Part-time Faculty – Analytics, Freelance Trainer – Business Analytics, Microsoft Excel, Advanced Excel, PivotTables, SQL (PostgreSQL, MySQL, SQLite), Power BI, Tableau, Predictive Analytics, AutoML, No-Code AI, Cloud Analytics, Azure, Google Cloud, IBM Watson, MBA Analytics Trainer, Case Studies, Capstone Project Mentorship, Business Intelligence (BI), Storytelling with Data, Communication Skills, Corporate Trainer, Curriculum Delivery. Job Types: Part-time, Contractual / Temporary, Freelance Contract length: 15 weeks Pay: ₹800.00 - ₹1,000.00 per hour Expected hours: 3 per week Benefits: Food provided Experience: Excel/Power BI: 2 years (Preferred) Language: English (Required) Work Location: In person

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Description You are a strategic thinker passionate about driving solutions in Analytics. You have found the right team. As an Analytics Solutions Associate in our Finance team, you will spend each day defining, refining and delivering set goals for our firm. Job Responsibilities Collaborate with finance stakeholders to understand business needs and translate them into data requirements and analytical solutions. Design, develop, and maintain interactive dashboards and reports using Tableau / ThoughtSpot. Utilize Python, PySpark & SQL to query and analyze data, and build data transformation pipelines on Databricks. Work with the business to develop required data models. Present complex data in visually appealing ways to aid communication. Collaborate with product managers and end users to ideate and iterate on solutions. Ensure adherence to analytical solution governance processes. Required Qualifications, Skills And Capabilities Bachelor’s degree and 6+ years of industry experience in a data-related discipline. Hands-on practical experience with analytical solution design, development, testing, and operational stability, particularly data pipelines for moving/transforming data on Databricks. Demonstrable experience with Python and associated data manipulation libraries to describe data. Advanced SQL (e.g., joins, aggregations, tuning). Significant experience with statistical data analysis and the ability to determine appropriate tools and data patterns to perform analysis. Strong interpersonal and communication skills and the ability to articulate complex technical concepts to senior audiences in presentations and written communications. Understanding of Agile technological development methodology. Preferred Qualifications, Skills And Capabilities Hands-on experience with Databricks, Tableau & Alteryx. Excellent problem-solving skills and ability to work with tight deadlines. Organizational skills; able to prioritize tasks based on priority level. 
ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction. The CCB Data & Analytics team responsibly leverages data across Chase to build competitive advantages for the businesses while providing value and protection for customers. The team encompasses a variety of disciplines from data governance and strategy to reporting, data science and machine learning. 
We have a strong partnership with Technology, which provides cutting edge data and analytics infrastructure. The team powers Chase with insights to create the best customer and business outcomes.
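The "advanced SQL (e.g., joins, aggregations, tuning)" requirement in the listing above is easy to practice locally. A minimal sketch using Python's built-in sqlite3 as a stand-in for a warehouse engine — the table names, regions, and amounts are invented for illustration:

```python
import sqlite3

# In-memory database; schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE transactions (account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO transactions VALUES (1, 100.0), (1, 250.0), (2, 75.0);
""")

# Join the two tables, then aggregate per region.
rows = conn.execute("""
    SELECT a.region, COUNT(*) AS txn_count, SUM(t.amount) AS total
    FROM transactions t
    JOIN accounts a ON a.id = t.account_id
    GROUP BY a.region
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('EMEA', 2, 350.0), ('APAC', 1, 75.0)]
```

The same join-then-group shape carries over directly to Spark SQL on Databricks; only the connection layer differs.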

Posted 2 weeks ago

Apply

0 years

0 Lacs

india

Remote

📊 Data Analyst Intern | Remote | Performance-Based Rewards

Are you passionate about working with data, writing SQL queries, and uncovering insights that drive smarter decisions? This internship is your chance to gain hands-on industry exposure by working on real-time projects that will sharpen your skills and enhance your portfolio.

🧠 Role: Data Analyst Intern
📍 Location: Remote / Virtual
🕒 Duration: Flexible (minimum commitment required)
💼 Stipend: Performance-Based (top performers rewarded)
📅 Last Date to Apply: 1st September 2025

🔧 Key Responsibilities:
- Work on real-world datasets across diverse industries
- Write, refine, and optimize SQL queries for data extraction & transformation
- Perform analysis to identify trends and provide actionable insights
- Build simple dashboards/reports to communicate findings effectively
- Collaborate with mentors and peers in a remote team environment

✅ Required Skills & Qualifications:
- Basic to intermediate knowledge of SQL (joins, aggregations, subqueries, etc.)
- Understanding of relational databases and data structures
- Strong interest in analytics, business intelligence, and storytelling with data
- Familiarity with Excel, Power BI, or Tableau is a plus
- Self-motivated, detail-oriented, and eager to learn

🎁 What You’ll Gain:
- Practical experience through real-time industry projects
- Mentorship and guidance from experienced professionals
- Flexible working hours for better learning and productivity
- Rewards & performance-based stipend opportunities
- Internship certificate + Letter of Recommendation for top achievers
- A strong project portfolio to showcase to future employers

📩 How to Apply: Submit your application before 1st September 2025 to secure your place.
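The SQL topics the listing above asks for (joins, aggregations, subqueries) can all be exercised with nothing but Python's built-in sqlite3. A small practice sketch, with a made-up `orders` table, that combines a grouped aggregate with a subquery:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 120.0), ('alice', 80.0),
        ('bob', 40.0), ('carol', 300.0);
""")

# Aggregation + subquery: customers whose total spend exceeds the
# average total spend across all customers.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > (
        SELECT AVG(customer_total) FROM (
            SELECT SUM(amount) AS customer_total
            FROM orders GROUP BY customer
        )
    )
    ORDER BY customer
""").fetchall()

print(rows)  # [('alice', 200.0), ('carol', 300.0)]
```

Totals are alice 200, bob 40, carol 300; the average of the per-customer totals is 180, so only alice and carol pass the HAVING filter.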

Posted 2 weeks ago

Apply

0 years

0 Lacs

mumbai, maharashtra, india

On-site

About The Role

We are looking for a skilled MERN Stack Developer to design and build scalable web applications. You will work across the stack, from database to front-end, ensuring high performance, security, and seamless user experiences.

Key Responsibilities:
- Design, develop, and maintain MERN (MongoDB, Express.js, React.js, Node.js) applications.
- Build reusable UI components using React.js (Hooks, Redux/Context API).
- Develop secure and optimized backend services with Node.js & Express.js.
- Design and manage database schemas, queries, and aggregations in MongoDB.
- Integrate and consume RESTful & GraphQL APIs.
- Implement authentication/authorization (JWT, OAuth2, SSO).
- Deploy applications on cloud platforms (AWS, Azure, GCP) with CI/CD pipelines.
- Ensure web security, scalability, and performance optimization.
- Collaborate with designers, QA, and product teams to deliver high-quality solutions.
- Write clean, maintainable, and testable code following industry best practices.

Required Skills:
- Strong proficiency in JavaScript (ES6+) and TypeScript.
- Expertise in React.js, Node.js, Express.js, MongoDB.
- Familiarity with Redux/Context API, React Router, Hooks.
- Hands-on experience with the Mongoose ODM and database optimization.
- Strong understanding of REST & GraphQL.
- Knowledge of Docker, Kubernetes, Git, GitHub Actions/Jenkins (CI/CD).
- Experience deploying on AWS (EC2, S3, Lambda), Firebase, or Azure.
- Familiarity with Redis, Kafka, RabbitMQ for caching/messaging.
- Understanding of web security standards (XSS, CSRF, CORS).

Preferred Skills:
- Experience with Next.js or micro-frontends.
- Knowledge of Prisma/TypeORM.
- Familiarity with serverless architecture.
- Exposure to unit testing & automation frameworks (Jest, Mocha, Cypress).

(ref:hirist.tech)
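The "aggregations in MongoDB" skill above refers to MongoDB's aggregation pipeline, which is just an ordered list of stage documents. A hedged sketch — the `orders` collection, fields, and documents below are hypothetical, and a pure-Python loop stands in for the server so the semantics of `$match`/`$group`/`$sort` are visible without a running database:

```python
from collections import defaultdict

# With pymongo you would run: db.orders.aggregate(pipeline)
# (collection name and field names are illustrative only).
pipeline = [
    {"$match": {"status": "paid"}},
    {"$group": {"_id": "$customer", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]

# In-memory stand-in for the collection.
docs = [
    {"customer": "alice", "status": "paid", "amount": 120},
    {"customer": "bob", "status": "pending", "amount": 50},
    {"customer": "alice", "status": "paid", "amount": 80},
    {"customer": "bob", "status": "paid", "amount": 40},
]

# Pure-Python equivalent of the three stages above.
totals = defaultdict(int)
for doc in docs:
    if doc["status"] == "paid":                   # $match
        totals[doc["customer"]] += doc["amount"]  # $group with $sum

result = sorted(
    ({"_id": c, "total": t} for c, t in totals.items()),
    key=lambda d: -d["total"],                    # $sort, descending
)
print(result)  # [{'_id': 'alice', 'total': 200}, {'_id': 'bob', 'total': 40}]
```

In an interview setting, being able to explain each stage this way (filter, group-accumulate, sort) is usually what "knows aggregations" means.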

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Get to Know Us:

Headquartered in Texas, CG Infinity is one of the fastest-growing software services companies in the region, with 300+ team members in Dallas, Houston, Albuquerque, Little Rock and New Delhi, India. The company offers solutions tailored to the needs of individual clients, drawing on expertise in customer experience & CRM, application development & integration, production support & quality assurance, and data analytics & AI. CG Infinity’s mission is to grow talent and develop life-long relationships with its customers. The company has been featured on the INC 5000 and in The Best Places to Work in recent years.

Website: http://www.cginfinity.com | https://www.linkedin.com/company/cginfinityinc/
Company size: 201-500 employees
Headquarters: Dallas, Texas
Founded: 1998

Our Culture:

Our people-first approach to technology offers best-in-class service and success rates. Here are some of the main services that we offer at CG Infinity: Salesforce Implementations, Customer Experience & CRM, Application Development & Integration, Production Support & QA, and Data Analytics & AI.

About the Role:

We’re looking for a Business Analyst ready to grow into a Functional Consultant. This individual enjoys solving business problems, becoming a “power user” of new systems, and making a real impact by collaborating with others.

Key Responsibilities:
- Learn and operate new applications, becoming a super-user and go-to person for the team
- Engage regularly with business users, technical teams, and stakeholders to gather requirements and refine system solutions
- Translate needs into functional specs and document processes clearly
- Write and optimize intermediate SQL queries, including joins and aggregates, for real-world reporting and troubleshooting needs
- Support testing, change management, and user training
- Actively participate in team planning, discussions, stand-ups, and retrospectives to keep everyone aligned and moving forward
- Help drive the adoption of new tools and workflows

Qualifications:
- 1–2 years of Business Analyst experience (internships count)
- Proven communication skills: can clearly explain ideas, listen actively, document requirements, and adapt messaging to the audience (technical or non-technical)
- Track record of working effectively in teams, supporting group goals, and building positive working relationships
- Comfortable with SQL: able to write queries involving joins, aggregations, filtering, and basic data manipulation
- Fast learner with new digital tools and systems

Nice-to-Have:
- Experience with ERP/CRM/SaaS platforms
- Exposure to Agile methodologies
- BA, SQL, or business application certifications

Powered by JazzHR
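A typical instance of the "intermediate SQL for reporting" skill this listing asks for is a LEFT JOIN plus aggregates that keeps groups with no matching rows. A self-contained sketch using Python's sqlite3 — the departments/tickets schema and counts are invented for illustration:

```python
import sqlite3

# Hypothetical reporting question: tickets per department,
# including departments that have no tickets at all.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE tickets (dept_id INTEGER, status TEXT);
    INSERT INTO departments VALUES (1, 'Sales'), (2, 'Support'), (3, 'HR');
    INSERT INTO tickets VALUES (1, 'open'), (1, 'closed'), (2, 'open');
""")

# LEFT JOIN keeps HR even though it has no tickets; COUNT(t.dept_id)
# ignores the NULLs from unmatched rows, and the CASE expression
# counts only open tickets.
rows = conn.execute("""
    SELECT d.name,
           COUNT(t.dept_id) AS ticket_count,
           SUM(CASE WHEN t.status = 'open' THEN 1 ELSE 0 END) AS open_count
    FROM departments d
    LEFT JOIN tickets t ON t.dept_id = d.id
    GROUP BY d.id
    ORDER BY d.id
""").fetchall()

print(rows)  # [('Sales', 2, 1), ('Support', 1, 1), ('HR', 0, 0)]
```

An INNER JOIN would silently drop HR from the report, which is exactly the kind of subtlety this role would be expected to catch.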

Posted 2 weeks ago

Apply