
3627 Querying Jobs - Page 8

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of. Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Your New Role As Director – Analytics and Insights, you will architect and lead the future of data-driven decision-making at Warner Bros. Discovery. In this high-impact, global leadership role, you will own the strategy, delivery, and enterprise scaling of advanced analytics, business insights, and next-generation business intelligence (BI) platforms. You will be responsible for establishing a modern, intelligent insights ecosystem that empowers thousands of business users to make faster, smarter, and more strategic decisions—at scale. Your remit spans traditional analytics, enterprise BI (Tableau, Power BI, MicroStrategy, SAP BO), self-serve insights, conversational BI, and AI-augmented analytics, all under a single, unified vision. This role demands a visionary leader who can blend data architecture with business intuition, scale operational excellence, and drive measurable business value through insights. You will directly influence how WBD leverages data to optimize global operations, delight audiences, and accelerate innovation across its vast content and media ecosystem. Enterprise Analytics & Insights Strategy Develop and execute a unified enterprise strategy for analytics and insights—anchored in business impact, speed-to-decision, and user empowerment. Define and lead strategic programs that embed analytics into content optimization, audience engagement, operational efficiency, marketing attribution, and revenue management. Transform the organization’s decision-making from reactive reporting to proactive, predictive, and prescriptive insights. Business Intelligence Platform Leadership Own the global BI & reporting function, overseeing development, governance, performance, and innovation across platforms like Power BI, Tableau, MicroStrategy, and SAP BusinessObjects. Lead the modernization and consolidation of BI tools and workflows to drive consistency, scalability, and cost-efficiency. Establish enterprise-wide KPI frameworks, reporting templates, and executive dashboards to align operational and strategic decision-making. AI-Enabled, Self-Service & Conversational BI Spearhead the implementation of AI-powered BI, including self-service analytics, conversational interfaces, and natural language querying. Drive adoption of augmented analytics capabilities—automated insights, anomaly detection, forecast narratives, and proactive alerts. Enable a federated, insight-driven culture by designing intuitive, role-based BI experiences for business users, creatives, and executives alike. Cross-Functional Leadership & Stakeholder Engagement Act as a strategic advisor to senior leadership, translating complex analytics into actionable business insights with measurable outcomes. 
Partner closely with data engineering, data governance, data science, product, content, ad sales, marketing, and finance teams to ensure alignment and impact. Embed a data-first mindset across the enterprise through training, advocacy, and thought leadership. People Leadership & Operational Excellence Lead, inspire, and grow a high-performing global team of analysts, BI developers, product owners, and insight consultants. Institutionalize delivery excellence by building reusable assets, scalable reporting templates, and high-impact insight playbooks. Monitor performance and adoption of analytics solutions and drive continuous improvement. Innovation, Governance & Future Readiness Champion innovation in decision intelligence by exploring cutting-edge techniques such as causal inference, simulation modeling, and AI-generated narratives. Define and enforce BI governance, access controls, and data quality frameworks to maintain integrity, trust, and compliance. Stay at the forefront of analytics trends and tools, ensuring WBD’s analytics stack evolves ahead of the curve. Qualifications & Experiences Master’s in Business Analytics, Data Science, Computer Science, Statistics, Economics, or related field. 10+ years of progressive experience in analytics, insights, and BI—ideally within media, entertainment, or direct-to-consumer businesses. At least 5+ years of proven leadership experience in managing large, global teams across analytics and BI disciplines. Deep expertise in enterprise BI platforms such as Power BI, Tableau, MicroStrategy, SAP BusinessObjects, and experience leading BI modernization programs. Proven success in deploying AI-enabled analytics, augmented BI, conversational interfaces, and self-serve data platforms at scale. Strong understanding of modern cloud data stacks (Snowflake, BigQuery), data modeling, SQL, and analytical architecture. Exceptional executive communication, data storytelling, and stakeholder management skills. A strategic thinker with a hands-on ability to connect analytics execution with tangible business results. How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Quality Engineer (Data) Job Summary We are seeking a highly skilled Quality Engineer with 5-10 years of professional experience to ensure the integrity, reliability, and performance of our data pipelines and AI/ML solutions within the SmartFM platform. The ideal candidate will be responsible for defining and implementing comprehensive quality assurance strategies for data ingestion, transformation, storage, and the machine learning models that generate insights from alarms and notifications received from various building devices. This role is crucial in delivering high-quality, trustworthy data and intelligent recommendations to optimize facility operations. Roles And Responsibilities Develop and implement end-to-end quality assurance strategies and test plans for data pipelines, data transformations, and machine learning models within the SmartFM platform. Design, develop, and execute test cases for data ingestion processes, ensuring data completeness, consistency, and accuracy from various sources, especially those flowing through IBM StreamSets and Kafka. Perform rigorous data validation and quality checks on data stored in MongoDB, including schema validation, data integrity checks, and performance testing of data retrieval. Collaborate closely with Data Engineers to ensure the robustness and scalability of data pipelines and to identify and resolve data quality issues at their source. Work with Data Scientists to validate the performance, accuracy, fairness, and robustness of Machine Learning, Deep Learning, Agentic Workflows, and LLM-based models. This includes testing model predictions, evaluating metrics, and identifying potential biases. Implement automated testing frameworks for data quality, pipeline validation, and model performance monitoring. Monitor production data pipelines and deployed models for data drift, concept drift, and performance degradation, setting up appropriate alerts and reporting mechanisms. Participate in code reviews for data engineering and data science components, ensuring adherence to quality standards and best practices. Document testing procedures, test results, and data quality metrics, providing clear and actionable insights to cross-functional teams. Stay updated with the latest trends and tools in data quality assurance, big data testing, and MLOps, advocating for continuous improvement in our quality processes. Required Technical Skills And Experience 5-10 years of professional experience in Quality Assurance, with a significant focus on data quality, big data testing, or ML model testing. Strong proficiency in SQL for complex data validation, querying, and analysis across large datasets. Hands-on experience with data pipeline technologies like IBM StreamSets and Apache Kafka. Proven experience in testing and validating data stored in MongoDB or similar NoSQL databases. Proficiency in Python for scripting, test automation, and data validation. Familiarity with Machine Learning and Deep Learning concepts, including model evaluation metrics, bias detection, and performance testing. Understanding of Agentic Workflows and LLMs from a testing perspective, including prompt validation and output quality assessment. Experience with cloud platforms (Azure, AWS, or GCP) and their data/ML services. Knowledge of automated testing frameworks and tools relevant to data and ML (e.g., Pytest, Great Expectations, Deepchecks). Familiarity with Node.js and React environments to understand system integration points. 
Additional Qualifications Demonstrated expertise in written and verbal communication, adept at simplifying complex technical concepts related to data quality and model performance for diverse audiences. Exceptional problem-solving and analytical skills with a keen eye for detail in data. Experienced in collaborating seamlessly with Data Engineers, Data Scientists, Software Engineers, and Product Managers. Highly motivated to acquire new skills, explore emerging technologies in data quality and AI/ML testing, and stay updated on the latest industry best practices. Domain knowledge in facility management, IoT, or building automation is a plus. Education Requirements / Experience Bachelor’s (BE / BTech) / Master’s degree (MS/MTech) in Computer Science, Information Systems, Engineering, Statistics, or a related field.
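The Quality Engineer posting above centres on automated data-quality checks for MongoDB-backed pipelines using tools such as Pytest. Purely as an illustration of that kind of test (not part of the posting), here is a minimal pytest-style sketch; the connection string, database, collection, and field names are all assumptions.

```python
# Minimal pytest-style data-quality checks against MongoDB (illustrative only).
# "smartfm.alarms" and its fields are hypothetical names, not from the posting.
from datetime import datetime, timedelta, timezone

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local test instance
alarms = client["smartfm"]["alarms"]               # hypothetical database/collection


def test_required_fields_present():
    # No alarm document should be missing its device identifier or severity.
    missing = alarms.count_documents(
        {"$or": [{"device_id": {"$exists": False}}, {"severity": {"$exists": False}}]}
    )
    assert missing == 0, f"{missing} alarm documents are missing required fields"


def test_severity_values_are_valid():
    # Severity must come from a fixed vocabulary.
    invalid = alarms.count_documents(
        {"severity": {"$nin": ["low", "medium", "high", "critical"]}}
    )
    assert invalid == 0


def test_data_freshness():
    # The ingestion pipeline should have written at least one alarm in the last hour.
    cutoff = datetime.now(timezone.utc) - timedelta(hours=1)
    assert alarms.count_documents({"received_at": {"$gte": cutoff}}) > 0
```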

Posted 4 days ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Primary Job Function This role will focus on implementing tools and strategies to analyze large amounts of data, identify trends, and convert information into business insights. The role will set up information formats and customized views for stakeholders across the company in various leadership, marketing and sales roles. Core Job Responsibilities Lead as a data-product owner, translating business needs into data projects and data projects into business implications Partner with internal stakeholders (SFE/CRM/Marketing/Ethical & Trade Sales/MI/Finance) to identify opportunities to implement data solutions to business problems Actively contribute to the business intelligence plan, BI environment and tools Build a strategic roadmap for Data & Analytics, including Data Science, as part of ANI India’s overall Customer/Channel/Sales Force engagement and upstream-downstream strategy Build reports/models for forecasting, trending and predictive analytics; manage and execute ad-hoc reporting, dashboarding and analytics requirements Drive the required data mining and present key strategic solutions/interpretations to the business for real-time decision-making, leveraging both traditional (e.g. data lake) and advanced (data science and AI) technologies and methodologies Promote data-based storytelling by summarizing and highlighting the key points of analysis with effective visualization techniques on BI delivery platforms Work Experience 6+ years of experience Prior experience in Pharma / FMCG / FMHG will be an added advantage Strong knowledge of tools - querying languages (SQL, SAS, etc.), visualization (Tableau, Raw, etc.), and analytics (MS Excel, Power BI, Adobe Analytics, etc.)
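The role above mentions building reports and models for forecasting and trending. As a rough, tool-neutral illustration of that kind of trend report (not the employer's actual workflow), here is a small pandas sketch; the file, column, and channel names are assumptions.

```python
# Illustrative monthly revenue trend with a rolling average (hypothetical data).
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["order_date"])  # assumed extract

monthly = (
    sales.assign(month=sales["order_date"].dt.to_period("M"))
    .groupby(["month", "channel"], as_index=False)["revenue"].sum()
)

# A three-month rolling average per channel highlights the underlying trend.
monthly["rolling_3m"] = (
    monthly.sort_values("month")
    .groupby("channel")["revenue"]
    .transform(lambda s: s.rolling(3, min_periods=1).mean())
)
print(monthly.tail())
```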

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Description Are you interested in innovating to deliver a world-class level of service to Amazon’s Selling Partners? At Amazon International Seller Services, our mission is to make sellers successful on Amazon. We are the team in Amazon which has a charter to work with sellers from every country. The Shared Services team helps new sellers onboard and improves existing seller performance through programmatic interventions. We provide the necessary support to new Sellers, starting from their launch on Amazon, removing blockers and setting them up for success by leveraging various programs and tools. Key job responsibilities Design, develop, and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. to support analytical and business needs. Analyze key metrics to uncover trends and root causes of issues to influence stakeholders and measure success. Incorporate business intelligence best practices, data management fundamentals, and analysis principles. Recommend and make improvements to team processes, metrics, internal tools, SOPs, and other workflows, and build automations to reduce dependencies on manual tasks. Knowledgeable in a variety of strategies for querying, processing, persisting, analyzing and presenting data. Proficient in SQL. Maintain and refine straightforward ETL. Write secure, stable, testable, maintainable code with minimal defects. Basic Qualifications 3+ years of experience in a Sales/Business Analyst role handling business reporting Bachelor's degree or above in Data Analysis or a similar related field Strong analytical and quantitative skills with professional experience in SQL and at least one data visualization tool (e.g., Power BI, Tableau, Amazon QuickSight) in a business environment Ability to handle data and metrics to back up assumptions and evaluate outcomes Professional written and verbal communication skills in English in a fast-paced and ambiguous environment Preferred Qualifications Skilled in a collaborative environment, succeeding through regular meetings and clear formal and informal communication with members of the remote and local management teams. Impeccable attention to detail and a passion for end-to-end process optimization Strong ownership and an analytical mindset Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A3027187
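The responsibilities above include maintaining "straightforward ETL" and working proficiently in SQL. As a minimal sketch of what such a step could look like (purely illustrative; the database, table, and column names are invented), here is an extract-transform-load pass using SQLite and pandas.

```python
# Illustrative ETL: extract with SQL, transform in pandas, load a reporting table.
import sqlite3

import pandas as pd

conn = sqlite3.connect("sellers.db")  # hypothetical local database

# Extract: raw seller performance events.
raw = pd.read_sql_query(
    "SELECT seller_id, event_date, orders, defects FROM seller_events", conn
)

# Transform: weekly defect rate per seller.
raw["event_date"] = pd.to_datetime(raw["event_date"])
weekly = (
    raw.set_index("event_date")
    .groupby("seller_id")
    .resample("W")[["orders", "defects"]].sum()
    .reset_index()
)
weekly["defect_rate"] = weekly["defects"] / weekly["orders"].replace(0, float("nan"))

# Load: overwrite a small reporting table consumed by dashboards.
weekly.to_sql("seller_weekly_quality", conn, if_exists="replace", index=False)
conn.close()
```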

Posted 4 days ago

Apply

0 years

2 - 4 Lacs

Calicut

On-site

A solid understanding of application design using Laravel. Knowledge of database design and querying using SQL. Proficiency in HTML and JavaScript. Discussing project aims with the client and development team. Designing and building web applications using Laravel. Troubleshooting issues in implementation and debugging builds. Working with front-end and back-end developers on projects. Testing functionality for users and the backend. Ensuring that integrations run smoothly. Scaling projects based on client feedback. Recording and reporting on work done in Laravel. Maintaining web-based applications. Minimum 1 year (or more) of experience with the PHP Laravel framework. Presenting work in meetings with clients and management. Job Types: Full-time, Fresher Pay: ₹20,000.00 - ₹35,000.00 per month Schedule: Day shift, Monday to Friday

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III focused in Quality Assurance Engineering at JPMorgan Chase within the Wholesale Credit Risk Technology Data Team, you serve as a seasoned member of an agile team assisting in design, delivery and testing of trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, opportunity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 3+ years applied experience Experience in automated testing solutions Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages (SQL) BS/BA degree or equivalent experience Proven ability to write automated tests across front and backend Detailed understanding of common defect and data management tools Advanced knowledge of software lifecycles, including Waterfall and Agile, and test automation strategies Experience working effectively with teams and stakeholders to develop relationships and achieve common goals Proficiency in a business function and some understanding of the broader business context Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills , Exposure to cloud technologies ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. 
We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. ABOUT THE TEAM Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and within the Corporate Administrative Office. You’ll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.
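The QA engineering role above asks for automated tests across the front end and back end plus SQL querying. As an illustration only (not JPMorgan's actual stack), here is a hedged sketch of a backend regression test that calls a REST endpoint and cross-checks the result against a database; the URL, schema, and IDs are invented.

```python
# Hypothetical API-vs-database consistency test (pytest + requests + sqlite3).
import sqlite3

import pytest
import requests

BASE_URL = "http://localhost:8080"   # assumed test environment
DB_PATH = "credit_risk_test.db"      # assumed test database


@pytest.fixture
def db():
    conn = sqlite3.connect(DB_PATH)
    yield conn
    conn.close()


def test_exposure_endpoint_matches_database(db):
    # The API's reported exposure should equal the SQL aggregate for the same counterparty.
    resp = requests.get(f"{BASE_URL}/api/counterparties/42/exposure", timeout=5)
    assert resp.status_code == 200
    api_total = resp.json()["total_exposure"]

    row = db.execute(
        "SELECT SUM(exposure_amount) FROM exposures WHERE counterparty_id = ?", (42,)
    ).fetchone()
    assert row[0] == pytest.approx(api_total)
```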

Posted 4 days ago

Apply

2.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorganChase within the Consumer and Community Banking, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job responsibilities Executes standard software solutions, design, development, and technical troubleshooting Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems Adds to team culture of diversity, opportunity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 2+ years applied experience Hands-on practical experience in system design, application development, testing, and operational stability Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Demonstrable ability to code in one or more languages Experience across the whole Software Development Life Cycle Strong experience in UI, React JS, Backend. Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills Familiarity with modern front-end technologies Exposure to cloud technologies ABOUT US

Posted 4 days ago

Apply

3.0 years

1 - 10 Lacs

Hyderābād

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III focused in Quality Assurance Engineering at JPMorgan Chase within the Wholesale Credit Risk Technology Data Team, you serve as a seasoned member of an agile team assisting in design, delivery and testing of trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, opportunity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 3+ years applied experience Experience in automated testing solutions Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages (SQL) BS/BA degree or equivalent experience Proven ability to write automated tests across front and backend Detailed understanding of common defect and data management tools Advanced knowledge of software lifecycles, including Waterfall and Agile, and test automation strategies Experience working effectively with teams and stakeholders to develop relationships and achieve common goals Proficiency in a business function and some understanding of the broader business context Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills , Exposure to cloud technologies

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within the Consumer and Community Banking, you will be part of an agile team dedicated to enhancing, designing, and delivering software components for the firm's cutting-edge technology products in a secure, stable, and scalable manner. As a developing member of the software engineering team, you will implement software solutions through the design, development, and technical troubleshooting of various components within a technical product, application, or system, while acquiring the skills and experience necessary to advance in your role. Job responsibilities Executes standard software solutions, design, development, and technical troubleshooting Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems Adds to team culture of diversity, opportunity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 3+ years applied experience Hands-on practical experience in system design, application development, testing, and operational stability Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Demonstrable ability to code in one or more languages Experience across the whole Software Development Life Cycle Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security Proven experience in AWS Connect development and configuration. Strong expertise in Terraform for cloud infrastructure management. Proficiency in Java programming and application development. Design, develop, and maintain AWS Connect solutions to enhance customer engagement and communication workflows. Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills Experience with AWS services such as Lambda, S3, EC2, IAM, and others. Integrate AWS Connect with other AWS services and third-party applications to optimize functionality. ABOUT US Chase is a leading financial services firm, helping nearly half of America’s households and small businesses achieve their financial goals through a broad range of financial products. Our mission is to create engaged, lifelong relationships and put our customers at the heart of everything we do. We also help small businesses, nonprofits and cities grow, delivering solutions to solve all their financial needs. 
We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location. Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more. Additional details about total compensation and benefits will be provided during the hiring process. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. Equal Opportunity Employer/Disability/Veterans ABOUT THE TEAM Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.

Posted 4 days ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

India

On-site

Experience: 2 - 7 Years. Night shift bonus offered. Role Description This is a full-time, hybrid Production Support Engineer - Overnight Support role (night shift) located in Hyderabad. The role will involve day-to-day tasks related to production support, application of analytical skills, troubleshooting, and providing technical support. Qualifications: Application support and production support skills Proven experience in production support roles, focusing on batch processing and overnight support. Strong proficiency in Linux and Windows operating systems. Experience in SQL querying and database management. Experience with batch scheduling tools (e.g., Cron, Control-M, Autosys) is preferred. Experience with monitoring tools (e.g., Nagios, Grafana) is preferred. Working knowledge of scripting languages such as Shell, Perl, or Python. Strong communication skills. Responsibilities: Monitor and manage overnight batch processes to ensure timely completion and accuracy. Investigate and resolve batch failures, escalating issues as necessary to ensure prompt resolution. Develop and implement scripts (Shell, Python) to automate monitoring tasks and data collection. Perform routine system checks and maintenance tasks during non-business hours. Proactively monitor applications and batch jobs using dedicated monitoring tools and dashboards. Provide support for Linux and Windows environments, including troubleshooting and system administration tasks. Collaborate with development and infrastructure teams to implement solutions and enhancements to batch processes. Document and maintain standard operating procedures (SOPs) for batch support activities. Participate in on-call rotation schedules and respond to production incidents as needed. Work within a rotational night shift. Please make sure to document all monitoring activities, including identified issues, resolutions, and root cause analysis findings. Job Types: Full-time, Permanent Pay: ₹400,000.00 - ₹900,000.00 per year Benefits: Health insurance Provident Fund Application Question(s): Please attach your LinkedIn profile. Shift availability: Night Shift (Required) Overnight Shift (Required) Work Location: In person
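The posting above lists scripted monitoring of overnight batch jobs among its duties. As a small, hedged illustration of that kind of script (the log path, keywords, and alerting mechanism are assumptions, not the employer's tooling), here is a Python sketch that scans a batch log for failures.

```python
# Illustrative batch-log monitor: flag failure lines from an overnight run.
import re
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("/var/log/batch/nightly.log")  # hypothetical location
FAILURE_PATTERN = re.compile(r"\b(FAILED|ERROR|ABENDED)\b")


def check_batch_log(log_file: Path) -> list[str]:
    """Return the log lines that look like batch failures."""
    if not log_file.exists():
        return [f"{log_file} not found - batch may not have started"]
    with log_file.open() as fh:
        return [line.rstrip() for line in fh if FAILURE_PATTERN.search(line)]


if __name__ == "__main__":
    failures = check_batch_log(LOG_FILE)
    if failures:
        # In practice this would page the on-call engineer (email, chat alert, etc.).
        print(f"[{datetime.now():%Y-%m-%d %H:%M}] {len(failures)} failure line(s) found:")
        for line in failures:
            print("  ", line)
    else:
        print("Overnight batch looks clean.")
```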

Posted 4 days ago

Apply

0 years

1 - 1 Lacs

Kanpur Nagar

Remote

Only Kanpur candidates. Freshers preferred. JOB DESCRIPTION – CUSTOMER SUPPORT EXECUTIVE – COMPLIANCE Duties and responsibilities include but are not limited to the following: Pre-Transaction Monitoring: Querying clients for the necessary documents as proof of the actual trade for which payments are uploaded. Execution of inward/outward payments with proper web due diligence on the remitter and beneficiary (and consignee, if involved). Coverage of the full range of AML activities, from transaction monitoring to investigations. Performing background screening checks on associated parties for outward/inward payment processing using Lexis Nexis, the OFAC sanctions list, etc. Maintain a current understanding of money laundering and terrorist financing issues, including policies, procedures, regulations, industry best practice, criminal typologies and developing trends. PEP and sanctions list screening - Screen individuals and entities against global sanctions lists issued by OFAC, the UN and other international and domestic government agencies. Ensure efficient identification and monitoring of suspicious activities and transactions. Client On-Boarding: Coordinate the end-to-end onboarding of new clients, ensuring a smooth transition from sales to active account status. Collect, review, and verify all required Know Your Customer (KYC) and Anti-Money Laundering (AML) documentation in accordance with regulatory requirements. Ensure accurate data entry and maintenance of client records in our system. Liaise with internal teams (AML/CFT Compliance, Risk) to resolve documentation or approval delays and ensure timely activation of client accounts. Act as the main point of contact for new clients during the onboarding phase, providing timely updates and managing expectations. Prepare onboarding packs and ensure all agreements and service-level documents are properly signed and stored. Continuously assess the onboarding process for inefficiencies or risk exposures and suggest improvements. Maintain strong knowledge of applicable financial regulations, including local regulatory compliance. Background Screening: Performing background screening checks on related parties for outward/inward payment processing using World-Check, Lexis Nexis, etc. Filing STRs: Reporting and filing of STRs to the MLRO and other regulatory institutions based on the internal assessment of rejected transactions and the risk assessment of approvals. Additional Functions: Perform routine duties with minimal supervision using standard compliance practices & procedures. Check and track daily whether supporting documents have been received for transaction monitoring, onboarding, and other compliance processes. Review invoice details, bills of lading and other supporting documents submitted by clients and, in case of any query, cross-verify the payment details with the customer over email. Comply with all safety policies, practices, and procedures. Participate in proactive team efforts to achieve the goals of the financial institution.
Perform other duties as assigned. Experience in managing priorities. Experience communicating with different levels in an organization. Experience working in a team environment with a track record of building relationships and working collaboratively. Assisting the Lead in system enhancement and UAT testing. Assisting the Lead in adding/blocking bank SWIFT codes in the compliance system. Adding high-risk names in the compliance system. Assisting the Compliance Team in clearing RED FLAG payments from the RED FLAG Opinion. Desired Skills & Experience Trainee/Freshers - Graduate/Postgraduate in business, finance, accounting, or a related field. Strong communication skills and capable of working in a multilingual environment. Fluency in spoken and written English. Basic knowledge of computers and the Internet, including the Windows operating system, desktop use, MS Word, MS Excel, email account creation, Google Search, Google Meet, etc. Strong skills in MS Office. Job Types: Full-time, Permanent, Fresher Pay: ₹14,000.00 - ₹15,000.00 per month Benefits: Health insurance Provident Fund Work from home Schedule: Rotational shift Language: English (Preferred) Hindi (Preferred) Work Location: In person Expected Start Date: 05/08/2025

Posted 4 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description We are seeking a skilled and experienced Java, Spring Boot and Elasticsearch Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-performance Java applications with a focus on Elasticsearch integration. The candidate should have a strong background in Java development, along with expertise in implementing and optimizing Elasticsearch. Responsibilities: Java and Spring Boot Development: Design, develop, and maintain robust and scalable Java applications. Collaborate with cross-functional teams to define, design, and ship new features. Ensure the performance, quality, and responsiveness of applications. Elasticsearch Integration: Implement Elasticsearch solutions for efficient data indexing, searching, and retrieval. Develop and optimize Elasticsearch queries to meet performance and scalability requirements. Troubleshoot and resolve issues related to Elasticsearch. Code Review and Optimization: Conduct code reviews to ensure code quality and adherence to best practices. Identify and address performance bottlenecks and optimize code for maximum efficiency. Collaboration and Communication: Work closely with other developers, product managers, and stakeholders to deliver high-quality solutions. Communicate effectively with team members and provide technical guidance as needed. Qualifications: Education: Bachelor's degree in Computer Science, Engineering, or a related field. Experience: Proven experience in Java development with a minimum of 4 years of hands-on experience, including 2 years (or 2 recent projects) of strong hands-on knowledge with full implementation of Elasticsearch and Spring Boot. Strong knowledge of Spring Boot and its ecosystem. Significant experience in designing and implementing Elasticsearch solutions. Strong expertise in Elasticsearch, including indexing, querying, and performance optimization. Experience with microservices architecture and RESTful API design. Experience with Spring Boot and RabbitMQ. Strong skills in in-memory applications, database design, and data integration. Excellent relationship-building and communication skills; ability to interact and work effectively with all stakeholders. Technical Skills: Proficiency in the Java programming language. Proficiency in Spring Boot. Experience with RESTful APIs and web services. Familiarity with relevant tools and frameworks. Strong in any SQL database. Strong knowledge of Elasticsearch, including indexing, querying, and performance tuning. Familiarity with Git and version control. Preferred Skills: Experience with containerization technologies (e.g., Docker, Kubernetes). Knowledge of microservices and any API gateway. Knowledge of cloud platforms (e.g., AWS, Azure, or GCP). Familiarity with S3 buckets. Familiarity with message brokers (e.g., RabbitMQ). (ref:hirist.tech)
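The role above is Java and Spring Boot centric, but the Elasticsearch query DSL it emphasizes is language-agnostic. Purely as an illustration (shown via the Python client for brevity, assuming an 8.x client, a local cluster, and hypothetical index and field names), here is the shape of a filtered full-text query of the kind such a role optimizes.

```python
# Illustrative Elasticsearch query: full-text match with structured filters.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

response = es.search(
    index="products",  # hypothetical index
    query={
        "bool": {
            "must": {"match": {"title": "wireless headphones"}},
            "filter": [
                {"term": {"in_stock": True}},
                {"range": {"price": {"lte": 5000}}},
            ],
        }
    },
    size=5,
)

for hit in response["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```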

Posted 4 days ago

Apply

15.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About The Company e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys and Naturium, high-performance, biocompatible, clinically-effective and accessible skincare. In our Fiscal year 24, we had net sales of $1 Billion and our business performance has been nothing short of extraordinary with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and are the fastest growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, and a hybrid 3 day in office, 2 day at home work environment. We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us Position Summary We are seeking a skilled Lead Data Engineer to join our dynamic team. The Sr. Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible for all users. Responsibilities Design and build scalable data pipeline architecture that can handle large volumes of data Develop ELT/ETL pipelines to extract, load and transform data from various sources into our data warehouse Optimize and maintain the data infrastructure to ensure high availability and performance Collaborate with data scientists and analysts to identify and implement improvements to our data pipeline and models Develop and maintain data models to support business needs Ensure data security and compliance with data governance policies Identify and troubleshoot data quality issues Automate and streamline processes related to data management Stay up-to-date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture Analyze the data products and requirements to align with data strategy Assist in extracting or researching data for cross-functional business partners for consumer insights, supply chain, and finance teams Enhance the efficiency, automation, and accuracy of existing reports Follow best practices in data querying and manipulation to ensure data integrity Requirements Bachelor's or Master's degree in Computer Science, Data Science, or a related field Must have 15+ years of experience as a Data Engineer or related role Must have experience with Snowflake Strong Snowflake experience building, maintaining and documenting data pipelines Expertise in Snowflake concepts like RBAC management, virtual warehouse, file format, streams, zero copy clone, time travel and understand how to use these features Strong SQL development experience including SQL queries and stored procedures Strong knowledge of ELT/ETL no-code/low-code tools like Informatica / SnapLogic Well versed in data standardization, cleansing, 
enrichment, and modeling Proficiency in one or more programming languages such as Python, Java, or C# Experience with cloud computing platforms such as AWS, Azure, or GCP Knowledge of ELT/ETL processes, data warehousing, and data modeling Familiarity with data security and governance best practices Excellent hands-on experience in problem-solving and analytical skills and improving the performance of processes Strong communication and collaboration skills Minimum Work Experience 15 Maximum Work Experience 20 This job description is intended to describe the general nature and level of work being performed in this position. It also reflects the general details considered necessary to describe the principal functions of the job identified, and shall not be considered, as detailed description of all the work required inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisors’ discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
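The Lead Data Engineer posting above is built around Snowflake ELT pipelines. As a hedged, minimal sketch of that pattern (not e.l.f.'s actual pipeline; the account, credentials, and table names are placeholders), here is a transformation executed inside Snowflake via the Python connector.

```python
# Illustrative ELT step with the Snowflake Python connector: run a SQL
# transformation inside the warehouse and verify the result.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

transform_sql = """
CREATE OR REPLACE TABLE daily_brand_sales AS
SELECT brand, order_date::date AS order_day, SUM(net_sales) AS net_sales
FROM RAW.ORDERS
GROUP BY brand, order_day
"""

try:
    cur = conn.cursor()
    cur.execute(transform_sql)  # the transformation runs inside Snowflake
    cur.execute("SELECT COUNT(*) FROM daily_brand_sales")
    print("rows in reporting table:", cur.fetchone()[0])
finally:
    conn.close()
```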

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

💻 Data Analyst Intern – Remote | Real Projects, Real Skills 📍 Location: Remote (100% Virtual) 💼 Type: Internship (Unpaid) 🎁 Perks: Certificate after completion || Letter of Recommendation (6 months) 🕒 Schedule: 5–7 hours/week – Flexible Timing Skillfied Mentor is an IT services and consulting startup offering immersive, project-based virtual internships. Our mission is to help students and fresh graduates gain real-world experience in high-demand tech domains. This Data Analyst Internship is your opportunity to go beyond tutorials and apply what you learn to live datasets using tools like Excel, SQL, Python, Power BI, and Tableau. 🔧 What You’ll Do Collect, clean, and organize raw data for meaningful analysis Develop interactive dashboards with Power BI and Tableau Use SQL for querying and manipulating large datasets Apply basic statistical methods to drive data-based decisions Present your findings in a simple, visual, and effective manner 🎓 What You’ll Gain ✅ Full Python course access included during the internship ✅ Work on resume-worthy projects you can showcase ✅ Certificate after completion || Letter of Recommendation (6 months) ✅ Work with the tools companies actually use (SQL, Python, Power BI, etc.) ✅ Work from anywhere, on your own schedule (5–7 hrs/week) 🗓️ Application Deadline: 1st August 2025 👉 Apply now and start your journey as a Data Analyst Intern with Skillfied Mentor!

Posted 4 days ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY- Assurance – Staff –Data Analytics As part of our EY-Assurance Team, plays an integral role in contributing individually and adding value to the complex reporting tasks that help various internal and/or external constituencies develop deeper understanding of their respective markets, functional practices, and other internal clients. The opportunity We’re looking for an incumbent who will be responsible for the review of deliverables and ensuring that quality and productivity targets are met. Your Key Responsibilities Data Transformation: Use Alteryx and ETL techniques to extract data from various sources, transform it into a structured format, and load it into databases or data warehouses. Data Analysis: Perform in-depth data analysis to identify trends, patterns, and anomalies, and present findings in a clear and concise manner. SQL Queries: Write and optimize SQL queries to retrieve, manipulate, and analyse data from relational databases. Data Cleansing: Cleanse and pre-process data to ensure accuracy, consistency, and completeness. Data Visualization: Create visually appealing and insightful dashboards and reports using Power BI, presenting data in a meaningful way to stakeholders. Data Quality Assurance: Conduct data quality checks to ensure the accuracy and integrity of data and resolve any discrepancies or issues that may arise. Collaborate with Teams: Work closely with cross-functional teams to understand data requirements, provide data-driven insights, and support decision-making processes. Continuous Improvement: Stay up to date with industry trends, best practices, and new technologies to enhance data analysis capabilities. Soft Skills And Attributes For Success Excellent communication, project management and people skills Problem solving skills with quick learning ability and adaptability to change. Should be open to working in different time zones and travel as required. Should have high standard of integrity and confidentiality. Should be willing to work under tight timelines delivering good quality of work. Strong analytical skills with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy. Ability to work within a matrix organization. Technical Skills With Hands On Experience Alteryx: Hands-on experience with Alteryx Designer, utilizing workflows for data blending, data preparation, and advanced analytics. ETL (Extract, Transform, Load): Proficiency in ETL processes and tools, extracting data from diverse sources, transforming it, and loading it into target systems. SQL: Strong command of SQL for querying, aggregating, and manipulating data from relational databases. Excel: Excellent knowledge of Excel, including advanced formulas, functions, pivot tables, and data visualization techniques. Data Visualization Tools: Hands on with data visualization tools like Power BI, Familiarity with Tableau is a plus. To qualify for the role, you must have. B.E / B. Tech. / M. Tech. / MCA in Computer Science or Information Technology with a techno functional background or accounting graduates / postgraduates having worked in business analytics domain. 
Self-driven and highly motivated individual with 2+ years of experience Experience in managing multiple concurrent initiatives from multiple regions or clients. A strong track record of successful delivery and benefits realization Ideally, you’ll also have Interest in business and commerciality. Flexibility to work in different time zones and travel as required. What We Look For A Team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team in the only integrated global assurance business worldwide. Opportunity to work with EY GDS Assurance practices globally with leading businesses across a range of industries. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland, and the UK – and with teams from all EY service lines, geographies, and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills, and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
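The EY role above centres on Alteryx workflows, ETL, SQL, and data cleansing. As a tool-neutral illustration of the same cleansing and quality-check steps (not EY's actual workflow; the file, sheet, and column names are assumptions), here is a small pandas sketch.

```python
# Illustrative data-cleansing pass: standardize columns, coerce types, drop bad rows.
import pandas as pd

raw = pd.read_excel("trial_balance.xlsx", sheet_name="export")  # hypothetical source

clean = (
    raw.rename(columns=lambda c: c.strip().lower().replace(" ", "_"))
    .drop_duplicates()
    .assign(
        posting_date=lambda d: pd.to_datetime(d["posting_date"], errors="coerce"),
        amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"),
    )
    .dropna(subset=["posting_date", "amount"])
)

# Simple completeness check before loading downstream.
dropped = len(raw) - len(clean)
print(f"Cleansed {len(clean)} rows ({dropped} removed for duplicates or invalid values)")
clean.to_csv("trial_balance_clean.csv", index=False)
```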

Posted 4 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role : Quality Engineer (Tester) Project Role Description : Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suite. Creates automation strategy, automated scripts and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 2 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Quality Engineer, you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating automation strategies, and supporting data and environment configurations. You will also participate in code reviews and monitor defects to support continuous improvement activities for the end-to-end testing process, ensuring that the highest quality standards are met throughout the project lifecycle. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Develop and implement automated testing scripts to enhance testing efficiency. - Collaborate with cross-functional teams to ensure seamless integration of testing processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse. - Good To Have Skills: Experience with data integration tools and ETL processes. - Strong understanding of database management and SQL querying. - Familiarity with continuous integration and continuous deployment (CI/CD) practices. - Experience in performance testing and monitoring tools. Additional Information: - The candidate should have minimum 2 years of experience in Snowflake Data Warehouse. - This position is based at our Pune office. - A 15 years full time education is required.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Responsive
Responsive (formerly RFPIO) is the global leader in strategic response management software, transforming how organizations share and exchange critical information. The AI-powered Responsive Platform is purpose-built to manage responses at scale, empowering companies across the world to accelerate growth, mitigate risk and improve employee experiences. Nearly 2,000 customers have standardized on Responsive to respond to RFPs, RFIs, DDQs, ESGs, security questionnaires, ad hoc information requests and more. Responsive is headquartered in Portland, OR, with additional offices in Kansas City, MO and Coimbatore, India. Learn more at responsive.io.

About the Role
The Senior Data Analyst - Operations for a SaaS company plays a pivotal role in leveraging data to optimize operational processes, drive efficiency, and improve overall business performance. This role involves gathering, analyzing, and interpreting operational data to identify trends, reduce inefficiencies, and support strategic decision-making. The Senior Data Analyst will work closely with cross-functional teams such as product, marketing, sales, customer success, finance, and engineering to ensure the company’s operational goals are data-driven and aligned with business objectives. The ideal candidate will have experience in SaaS operations and a strong background in data analysis, using advanced analytics tools to deliver actionable insights.

Essential Functions
Operational Data Collection & Management: Gather, clean, and organize data from various sources including CRM systems, customer usage data, billing platforms, and internal operational tools. Ensure the integrity, accuracy, and consistency of operational data across departments. Collaborate with IT and data engineering teams to optimize data pipelines, ETL processes, and data storage systems.
Data Analysis & Insights: Analyze operational data to identify inefficiencies, trends, and opportunities for improvement, focusing on areas like customer onboarding, product usage, renewals, and churn. Provide data-driven recommendations to streamline SaaS operations, reduce costs, and improve product delivery efficiency. Conduct root cause analysis to understand operational bottlenecks and provide solutions to improve performance.
Performance Metrics & KPI Tracking: Define and track key performance indicators (KPIs) related to SaaS operations, including customer satisfaction (NPS), support resolution times, system uptime, and subscription renewal rates. Regularly report on operational KPIs to senior leadership, translating data into actionable insights. Develop and maintain dashboards and visualizations using tools such as Tableau, Power BI, or Looker to give teams real-time visibility into operational performance.
Process Improvement & Operational Efficiency: Collaborate with the operations, finance, and product teams to identify process inefficiencies and recommend solutions for process improvements. Use data to support initiatives aimed at improving the SaaS customer lifecycle, from acquisition through retention and churn reduction. Analyze customer behavior patterns to optimize customer success efforts and improve engagement with the platform.
Forecasting & Predictive Analytics: Use statistical models and machine learning techniques to forecast operational trends, such as customer churn, revenue growth, and support demand. Provide forward-looking insights to support resource planning, including customer support staffing, infrastructure scaling, and demand forecasting. Develop and improve predictive models to optimize decision-making in operations.
Automation & Technology Integration: Identify and implement automation opportunities in operational workflows to reduce manual effort and increase scalability. Work with engineering and IT teams to integrate operational data from various SaaS tools (e.g., Salesforce, Gainsight, Zendesk) into centralized analytics platforms. Stay updated on new technologies and tools that can enhance operational efficiency and data analysis capabilities.
Collaboration & Stakeholder Engagement: Partner with teams across the organization, including customer success, product management, finance, and engineering, to align data initiatives with business objectives. Present findings, reports, and recommendations to senior leadership in a clear and concise manner. Support leadership in data-driven decision-making and strategic planning for operational improvements.

Education
Bachelor’s degree in Data Science, Statistics, Computer Science, Operations Research, or a related field; Master’s degree preferred.

Experience
10+ years of experience in data analysis, with at least 4+ years in a SaaS or technology environment. Experience working in SaaS or technology-focused companies is highly desirable. Proficiency in SQL for querying databases and working with large datasets. Experience with data analysis tools such as Python, R, or Excel. Expertise in data visualization platforms such as Tableau, Looker, or Power BI. Familiarity with SaaS business models and metrics (MRR, ARR, CAC, LTV, churn). Experience working with SaaS tools such as Salesforce, HubSpot, Zendesk, or similar.

Knowledge, Skills & Abilities
Certifications in data analysis, such as Google Data Analytics Professional, or experience with Lean/Six Sigma methodologies are a plus. Knowledge of statistical analysis and predictive modeling techniques. Strong analytical and problem-solving skills with the ability to work with complex datasets. Excellent communication skills, with the ability to translate data insights into business recommendations. Detail-oriented with strong organizational skills. Ability to collaborate effectively across departments and work in a fast-paced environment. Self-motivated and proactive in driving data-driven improvements.

Note: Candidates should be willing to work in US shifts.
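The churn and renewal-rate tracking described above reduces to fairly simple aggregations over subscription data. As a minimal sketch only (the subscriptions.csv layout below is invented, not taken from the posting), a monthly churn-rate calculation in pandas could look like:

```python
# Illustrative monthly churn-rate calculation (the subscriptions.csv schema is invented).
import pandas as pd

# Expected columns: customer_id, month (YYYY-MM), is_active (True/False for that month)
subs = pd.read_csv("subscriptions.csv", dtype={"month": str})

# Set of customers active in each month, in chronological order.
active = subs[subs["is_active"]].groupby("month")["customer_id"].apply(set).sort_index()

churn_rates = {}
months = list(active.index)
for prev_month, month in zip(months, months[1:]):
    prev_customers = active[prev_month]
    churned = prev_customers - active[month]          # active last month, gone this month
    churn_rates[month] = len(churned) / len(prev_customers) if prev_customers else 0.0

print(pd.Series(churn_rates, name="monthly_churn_rate"))
```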

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Who We Are
We're a DevOps and Automation company based in Bengaluru, India. We have successfully delivered over 170 automation projects for 65+ global businesses, including Fortune 500 companies that entrust us with their most critical infrastructure and operations. We're bootstrapped, profitable, and scaling rapidly by consistently solving real, impactful problems.

What We Value
Ownership: As part of our team, you're responsible for strategy and outcomes, not just completing assigned tasks.
High Velocity: We move fast, iterate faster, and amplify our impact, always prioritizing quality over speed.

Who We Seek
We are seeking a NextJS Developer with 6 months of experience to join our Engineering team and help develop and maintain our internal products. Your ultimate goal will be to build highly responsive and innovative AI-based software solutions that meet our business needs. We're looking for individuals who genuinely care, ship fast, and are driven to make a significant impact.

Job Location: Bengaluru (Work From Office)

What You Will Be Doing
Design, develop, and maintain robust and scalable web applications using NextJS, ReactJS, and TypeScript. Design and implement efficient database schemas using PostgreSQL. Collaborate closely with front-end developers, Product Managers, Test Engineers, and other stakeholders to understand requirements and transform them into effective technical solutions. Identify and resolve application performance bottlenecks, implementing solutions that enhance responsiveness and scalability. Participate in code reviews to maintain code quality and ensure adherence to architectural guidelines and maintainability standards. Maintain comprehensive technical documentation for all developed features, APIs, and system configurations. Utilize GitHub for version control, ensuring proper branching, merging, code management and CI/CD pipelines. Stay updated with the latest industry trends and AI advancements. Integrate with REST APIs and Supabase to build complete full-stack solutions. Handle real-time error debugging and implement robust error-handling mechanisms. Use platforms like Vercel for efficient deployments and edge optimization. Collaborate with backend developers (Python/Django) to ensure seamless integration and data flow. Leverage AI tools like Cursor, Claude Code, and strong AI prompting skills to improve development efficiency. Continuously monitor performance and make optimization improvements across the stack. Participate in code reviews and contribute to documentation and shared best practices.

What We’re Looking For
6 months of hands-on experience building and deploying applications using NextJS, ReactJS, TypeScript and NodeJS. Proficient in Tailwind CSS, PostgreSQL and modern front-end architecture. Solid understanding of REST APIs, API integration, and state management. Familiarity with front-end technologies (Tailwind CSS, NextJS, shadcn/ui) and the ability to develop consistent and user-friendly web interfaces based on Figma designs. Expertise in PostgreSQL, including advanced querying and performance optimization. Deep understanding of web application architecture and system design. Excellent algorithmic problem-solving abilities, focusing on efficiency and performance. Exposure to platforms like Vercel for hosting and deployment. Proficient with GitHub and collaborative development workflows. Strong understanding of how to construct effective prompts to guide AI tools, ensuring precise, relevant, and optimized code output. Exposure to AI-powered coding tools and services such as Cursor and Claude Code. Ability to excel in highly collaborative and dynamic, fast-paced environments. Quick to learn and adapt to evolving technologies and methodologies. Exceptional verbal and written English communication skills are a must.

Benefits
Work directly with founders and the leadership team. Drive projects that create real business impact, not busywork. Gain practical skills that traditional education misses. Experience rapid growth as you tackle meaningful challenges. Fuel your career journey with continuous learning and advancement paths. Thrive in a workplace where collaboration powers innovation daily.

For any queries, please contact careers@vegastack.com.

Job Types: Full-time, Permanent
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Work Location: In person

Posted 4 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role
We are looking for a Power BI Developer with over 2 years of hands-on experience in building dashboards, reports, and data models. The candidate should have solid knowledge of DAX, Power Query (M), and SQL, and be capable of delivering insightful and interactive business reports based on requirements from business stakeholders.

Job Location – Hyderabad

Key Responsibilities
Develop interactive Power BI dashboards and reports based on business needs. Create and manage data models using Power BI Desktop and Power BI Service. Write DAX measures and calculated columns to support reporting requirements. Perform data transformation and shaping using Power Query (M). Connect to various data sources such as Excel, SQL Server, and cloud platforms. Work with business analysts or product owners to understand reporting requirements. Ensure report performance and usability across devices (responsive design). Publish reports to Power BI Service and schedule data refreshes. Assist with implementing row-level security (RLS) and access control.

What You’ll Bring
Bachelor’s or master’s degree in Engineering, Computer Applications, or a related field. Proven experience delivering high-quality dashboards.

Mandatory Skills
2–3 years of experience with Power BI development. Strong understanding of DAX and Power Query (M). Proficiency in SQL for querying and validating data. Experience in creating interactive visuals, slicers, and drill-through reports. Ability to understand data relationships and design appropriate data models.

Nice to Have
Exposure to Power BI Service administration. Experience with Power Platform (Power Apps, Power Automate). Knowledge of Azure Data Factory or basic ETL pipelines. Familiarity with Agile/Scrum development process.
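One of the mandatory skills above is SQL for querying and validating data before it reaches Power BI. As a rough, hedged sketch (SQLite stands in for whatever source database the reports actually read from, and the table, column, and check names are invented), a pre-refresh validation pass might look like:

```python
# Hedged sketch: validating report source data with SQL before a scheduled Power BI refresh.
# SQLite is used as a stand-in for the real source database; all names are invented.
import sqlite3

VALIDATIONS = {
    "duplicate_order_ids": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM fact_sales GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
    "orders_missing_date": "SELECT COUNT(*) FROM fact_sales WHERE order_date IS NULL",
}

def validate(db_path: str) -> None:
    with sqlite3.connect(db_path) as conn:
        for name, sql in VALIDATIONS.items():
            bad_rows = conn.execute(sql).fetchone()[0]
            print(f"{name}: {'OK' if bad_rows == 0 else f'{bad_rows} offending rows'}")

if __name__ == "__main__":
    validate("sales.db")
```

Catching duplicate keys or missing dates at this stage is usually cheaper than debugging broken relationships or wrong DAX totals after the report has refreshed.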

Posted 4 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Senior Analyst, Big Data Analytics & Engineering

Overview
Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)

About Mastercard
Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.

Position Overview
This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5–7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.

Role Responsibilities
Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts.
Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations.
Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage.

All About You
Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx.
Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions.
Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI.
Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making.
Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams.
Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value.

Education
Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.

Why Us?
At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives.

Location: Gurgaon/Pune, India
Employment Type: Full-Time

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
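The pipeline-development responsibilities above map naturally onto a small PySpark job. The following is an illustrative sketch only, not Mastercard’s pipeline: the file paths, column names, and aggregation are invented, and it simply shows the extract-transform-load shape in Python (PySpark), one of the stacks the posting names.

```python
# Minimal PySpark ETL sketch (illustrative only; paths and column names are invented).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("value-quantification-pipeline").getOrCreate()

# Extract: raw transaction extracts landed as CSV.
raw = spark.read.csv("/data/raw/transactions.csv", header=True, inferSchema=True)

# Transform: basic cleansing plus a per-customer monthly aggregate.
clean = (
    raw.dropna(subset=["customer_id", "amount"])
       .withColumn("month", F.date_trunc("month", F.col("transaction_date")))
)
monthly = (
    clean.groupBy("customer_id", "month")
         .agg(F.sum("amount").alias("total_spend"), F.count("*").alias("txn_count"))
)

# Load: write a partitioned Parquet table for downstream BI tools (Tableau, Power BI).
monthly.write.mode("overwrite").partitionBy("month").parquet("/data/curated/monthly_spend")

spark.stop()
```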

Posted 4 days ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Over 12 years of extensive experience in AI/ML, with a proven track record of architecting and delivering enterprise-scale machine learning solutions across the Retail and FMCG domains. Demonstrated ability to align AI strategy with business outcomes in areas such as customer experience, dynamic pricing, demand forecasting, assortment planning, and inventory optimization.

Deep expertise in Large Language Models (LLMs) and Generative AI, including OpenAI’s GPT family, ChatGPT, and emerging models like DeepSeek. Adept at designing domain-specific use cases such as intelligent product search, contextual recommendation engines, conversational commerce assistants, and automated customer engagement using Retrieval-Augmented Generation (RAG) pipelines.

Strong hands-on experience developing and deploying advanced ML models using modern data science stacks, including:
Python (advanced programming with a focus on clean, scalable codebases)
TensorFlow and Scikit-learn (for deep learning and classical ML models)
NumPy, Pandas (for data wrangling, transformation, and statistical analysis)
SQL (for structured data querying, feature engineering, and pipeline optimization)

Expert-level understanding of Deep Learning architectures (CNNs, RNNs, Transformers, BERT/GPT) and Natural Language Processing (NLP) techniques such as entity recognition, text summarization, semantic search, and topic modeling, with practical application in retail-focused scenarios like product catalog enrichment, personalized marketing, and voice/text-based customer interactions.

Strong data engineering proficiency, with experience designing robust data pipelines, building scalable ETL workflows, and integrating structured and unstructured data from ERP, CRM, POS, and social media platforms. Proven ability to operationalize ML workflows through automated retraining, version control, and model monitoring.

Significant experience deploying AI/ML solutions at scale on cloud platforms such as AWS (SageMaker, Bedrock), Google Cloud Platform (Vertex AI), and Azure Machine Learning. Skilled in designing cloud-native architectures for low-latency inference, high-volume batch scoring, and streaming analytics. Familiar with containerization (Docker), orchestration (Kubernetes), and CI/CD for ML (MLOps).

Ability to lead cross-functional teams, translating technical concepts into business impact and collaborating with marketing, supply chain, merchandising, and IT stakeholders. Comfortable engaging with executive leadership to influence digital and AI strategies at an enterprise level.
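Since the profile leans heavily on Retrieval-Augmented Generation for retail use cases such as product search, a deliberately simplified sketch of the retrieval half of a RAG pipeline follows. Everything in it is assumed for illustration: embed() is a stand-in for a real embedding model, the catalogue snippets are invented, and a production system would use a vector store and a proper embedding service rather than in-memory NumPy arrays.

```python
# Simplified retrieval step of a RAG pipeline (illustrative; embed() is a placeholder
# for a real embedding model, and the catalogue snippets are invented).
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: a real pipeline would call an embedding model here."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.normal(size=384)
    return vec / np.linalg.norm(vec)

CATALOGUE = [
    "Waterproof trail running shoes with reinforced toe cap",
    "Slim-fit organic cotton t-shirt, available in six colours",
    "Insulated stainless steel water bottle, 750 ml",
]
DOC_VECTORS = np.stack([embed(doc) for doc in CATALOGUE])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k catalogue snippets most similar to the query (cosine similarity)."""
    scores = DOC_VECTORS @ embed(query)          # vectors are unit-normalised
    top = np.argsort(scores)[::-1][:k]
    return [CATALOGUE[i] for i in top]

# The retrieved snippets are injected into the LLM prompt as grounding context,
# which is what keeps generation anchored to the catalogue rather than to guesses.
context = retrieve("running shoes for wet weather")
prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQuestion: ..."
print(prompt)
```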

Posted 4 days ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Palantir

Job Overview
Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience, displaying strong analytical, problem-solving, programming, business KPI understanding and communication skills. They should be self-learning, detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders in the healthcare/Life Science/Pharmaceutical domains.

Responsibilities and Duties
Technical
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and Exploratory Data Analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages, queries or scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation using Spark APIs (e.g., PySpark, Spark SQL, RDD/DF) to automate processes and workflows within Palantir with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design based on different scenarios (e.g., batch/real-time flow, incremental load, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, end-user documentation.
Strong experience with Data Warehousing, Data Engineering, and Data Modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.

Non-Technical
Collaborate with stakeholders to identify opportunities for continuous improvement, understanding business needs and innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies being adopted per organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
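The duties in the listing above distinguish batch flows from incremental loads. As a rough, tool-agnostic sketch in PySpark (the dataset paths, the updated_at watermark column, and the JSON state file are invented, and Palantir Foundry’s own transform and incremental mechanisms are deliberately not shown), a watermark-based incremental load works like this:

```python
# Rough sketch of a watermark-based incremental load in PySpark (paths, columns,
# and the watermark bookkeeping are invented; Palantir Foundry specifics are omitted).
import json
from pathlib import Path
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental-load-sketch").getOrCreate()
STATE_FILE = Path("/tmp/last_watermark.json")

def read_watermark() -> str:
    """Return the last processed updated_at value, or a date far in the past."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["updated_at"]
    return "1970-01-01 00:00:00"

def write_watermark(value: str) -> None:
    STATE_FILE.write_text(json.dumps({"updated_at": value}))

last_seen = read_watermark()
source = spark.read.parquet("/data/source/patient_events")

# Only rows changed since the previous run are processed.
delta = source.filter(F.col("updated_at") > F.lit(last_seen))

if delta.count() > 0:
    delta.write.mode("append").parquet("/data/target/patient_events")
    new_mark = delta.agg(F.max("updated_at")).first()[0]
    write_watermark(str(new_mark))

spark.stop()
```

The design choice being illustrated is simply that incremental loads trade a full rescan for a cheap filter against a stored high-water mark, which is why the watermark bookkeeping has to be as reliable as the data write itself.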

Posted 4 days ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Palantir

Job Overview
Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience, displaying strong analytical, problem-solving, programming, business KPI understanding and communication skills. They should be self-learning, detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders in the healthcare/Life Science/Pharmaceutical domains.

Responsibilities and Duties
Technical
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and Exploratory Data Analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages, queries or scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation using Spark APIs (e.g., PySpark, Spark SQL, RDD/DF) to automate processes and workflows within Palantir with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design based on different scenarios (e.g., batch/real-time flow, incremental load, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, end-user documentation.
Strong experience with Data Warehousing, Data Engineering, and Data Modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.

Non-Technical
Collaborate with stakeholders to identify opportunities for continuous improvement, understanding business needs and innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies being adopted per organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Palantir

Job Overview
Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience, displaying strong analytical, problem-solving, programming, business KPI understanding and communication skills. They should be self-learning, detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders in the healthcare/Life Science/Pharmaceutical domains.

Responsibilities and Duties
Technical
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and Exploratory Data Analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages, queries or scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation using Spark APIs (e.g., PySpark, Spark SQL, RDD/DF) to automate processes and workflows within Palantir with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design based on different scenarios (e.g., batch/real-time flow, incremental load, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, end-user documentation.
Strong experience with Data Warehousing, Data Engineering, and Data Modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.

Non-Technical
Collaborate with stakeholders to identify opportunities for continuous improvement, understanding business needs and innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies being adopted per organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

4.0 years

0 Lacs

India

Remote

Job Title: Data Research Engineer
Location: Remote (Hybrid for Chennai & Mumbai)
Experience: 4 Years

Responsibilities:
● Develop methods to leverage the potential of LLMs and AI within the team.
● Be proactive at finding new solutions to engage the team with AI/LLM and streamline processes in the team.
● Be a visionary with AI/LLM tools and predict how future technologies could be harnessed early on, so that when these technologies come out, the team is ahead of the game regarding how they could be used.
● Assist in acquiring and integrating data from various sources, including web crawling and API integration.
● Stay updated with emerging technologies and industry trends.
● Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
● Contribute to cross-functional teams in understanding data requirements.
● Assume accountability for achieving development milestones.
● Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
● Collaborate with and assist fellow members of the Data Research Engineering Team as required.
● Leverage online resources effectively, like Stack Overflow, ChatGPT, Bard, etc., while considering their capabilities and limitations.

Skills and Experience
● Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications are a plus.
● Think proactively and creatively about upcoming AI/LLM technologies and how to use them to the team’s and company’s benefit.
● A “think outside the box” mentality.
● Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information.
● Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.).
● Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.).
● Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models.
● Strong proficiency in Python programming.
● Proficiency in SQL and data querying is a plus.
● Familiarity with web crawling techniques and API integration is a plus but not a must.
● Experience in AI/ML engineering and data extraction.
● Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
● Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
● Design and build AI models using LLMs.
● Integrate LLM solutions with existing systems via APIs.
● Collaborate with the team to implement and optimize AI solutions.
● Monitor and improve model performance and accuracy.
● Familiarity with Agile development methodologies is a plus.
● Strong problem-solving and analytical skills with attention to detail.
● Creative and critical thinking.
● Ability to work collaboratively in a team environment.
● Good and effective communication skills.
● Experience with version control systems, such as Git, for collaborative development.
● Ability to thrive in a fast-paced environment with rapidly changing priorities.
● Comfortable with autonomy and ability to work independently.

Notice period: Immediate to 30 days
Email to: poniswarya.m@aptita.com
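Data acquisition through API integration is one of the responsibilities listed above. The snippet below is a generic, hedged sketch of pulling records from a paginated JSON API: the endpoint URL, pagination parameters, and the results field are invented for illustration and will differ for any real API.

```python
# Generic sketch of paginated API data acquisition (endpoint, parameters, and
# response fields are invented; real APIs will differ).
import requests

def fetch_all(base_url: str, page_size: int = 100) -> list[dict]:
    """Collect every record from a hypothetical page-numbered JSON API."""
    records, page = [], 1
    while True:
        resp = requests.get(
            base_url, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:          # an empty page signals the end of the dataset
            break
        records.extend(batch)
        page += 1
    return records

if __name__ == "__main__":
    data = fetch_all("https://api.example.com/v1/companies")
    print(f"Fetched {len(data)} records")
```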

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
