2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Software Engineer, Chennai

Our NielsenIQ Technology teams are working on revamping multiple platforms into a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on NielsenIQ’s data and insights to innovate and grow. As a Python Engineer, you’ll be part of a team of smart, highly skilled technologists who are passionate about learning and prototyping cutting-edge technologies. As a market research company, we have extensive data analytics and machine learning requirements implemented across various applications. Our CDAR platform has various use cases that incorporate AI/ML and data analytics. Our team is co-located and agile, with central technology hubs in Chicago, Madrid, Toronto, Chennai and Pune.

Responsibilities
Understanding user needs and how they fit into the overall, global solution design
Prototyping new features and integrations aligned to business strategy by introducing innovation through technology
Following source and test-driven development best practices
Troubleshooting and performing root cause analysis while resolving issues
Writing complex, maintainable code to develop scalable, flexible, and user-friendly applications
Importing/collecting, cleaning, converting and analyzing data for the purpose of finding insights and drawing conclusions
Training models, fine-tuning parameters for maximum efficiency, and deploying models
Actively participating in building algorithms for solving complex problems with design and development
Taking ownership of projects and ensuring timely deliveries
Collaborating with diverse teams across time zones

Qualifications
Minimum of 2 years of experience in large-scale production systems and in languages such as Python/R
Minimum B.S. degree in Computer Science, Computer Engineering or related field with a focus in machine learning
Strong software engineering skills and understanding of the ML lifecycle, with a minimum of 2 years' experience in ML production systems and in software development
Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas
Fluent in processing data with pandas (e.g., querying, transforming, joining, cleaning, etc.), including experience debugging logic and performance issues
Strong understanding of machine learning algorithms with experience writing, debugging, and optimizing ML data structures, pipelines, and transformations
Knowledge of statistics, probability, or a related discipline
Extensive data modelling and data architecture skills
Strong knowledge of version control tools, preferably Bitbucket
Basic knowledge of Linux/Unix environments (basic commands, shell scripting, etc.)
Demonstrated ability to work as part of a global team
Strong troubleshooting and problem-solving skills
Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business

Preferred Qualifications
Bachelor’s degree or equivalent in Computer Science or a related field with a focus in machine learning
Experience using collaboration technologies: Azure DevOps, TFS, Jira, Confluence
Experience using the Atlassian tool suite, including Jira, Confluence, Bitbucket

Additional Information

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth.
In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
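The posting above asks for fluency in pandas (querying, transforming, joining, cleaning) alongside basic scikit-learn usage across the ML lifecycle. As a rough, hypothetical sketch of that workflow, the snippet below cleans a small extract, joins a lookup table, and fits a simple classifier; the file names, columns, and target are assumptions for illustration, not details from the role.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load and clean a hypothetical retail-panel extract (path and columns are assumed).
df = pd.read_csv("panel_extract.csv")
df = df.dropna(subset=["units_sold", "price"])            # drop incomplete rows
df["price"] = pd.to_numeric(df["price"], errors="coerce")

# Join a small reference table and derive a feature.
stores = pd.read_csv("store_master.csv")                  # assumed lookup table
df = df.merge(stores, on="store_id", how="left")
df["revenue"] = df["units_sold"] * df["price"]

# Train a simple classifier on the engineered features (illustrative only).
X = df[["revenue", "price", "units_sold"]].fillna(0)
y = (df["promo_flag"] == 1).astype(int)                   # assumed binary target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```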
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About Mytholog Innovations At Mytholog Innovations, we turn visionary ideas into robust digital realities. Leveraging deep expertise in backend development, microservices, cloud-native architectures, and DevOps, we partner with clients to design scalable systems, enhance existing infrastructures, and deliver impactful engineering solutions. Our rigorous talent screening and commitment to excellence empower companies to build high-performing tech teams that drive sustained innovation and business growth. Job Description We’re looking for a seasoned Senior Full Stack Developer with strong proficiency in React, React Native, Java Spring Boot, and cloud-native databases such as Supabase and MongoDB. In this role, you will architect and deliver secure, scalable, and high-performance solutions tailored to meet business goals and technical needs. If you are passionate about writing clean, maintainable code and excel in agile environments, this opportunity offers both challenge and reward. Location: Remote Employment Type: Full-Time (Contract) Experience: Minimum 5 years Probation: 15 days Compensation: ₹70,000 – ₹95,000 per month Note: This is a full-time contractor role. It does not include traditional employee benefits (insurance, PF, etc.). Standard TDS will be deducted from payments, and tax filing is the contractor’s responsibility. Key Responsibilities Collaborate closely with client stakeholders to gather requirements, provide technical insights, and deliver end-to-end full stack solutions. Build intuitive, responsive front-end interfaces using React (web) and React Native (mobile). Develop scalable backend services with Java Spring Boot, including REST APIs, authentication, business logic, and integrations. Implement data storage and querying strategies using Supabase and MongoDB, optimizing for performance and reliability. Engage in the full software lifecycle: design, development, testing, deployment, and maintenance. Ensure cross-browser and cross-platform compatibility and responsiveness. Apply best practices for code quality, security, performance, and testing. Troubleshoot and resolve complex full stack technical issues. Participate in architectural planning and collaborate with client technical teams. Requirements 5+ years hands-on experience in full stack development. Expertise in React and React Native, including component design, state management (Redux/Context API), and navigation. Advanced skills in Java Spring Boot for backend services, microservices, and APIs. Proven experience with Supabase, MongoDB, or similar cloud-native NoSQL databases. Strong understanding of REST API design, secure authentication (OAuth2, JWT), and integration patterns. Familiarity with CI/CD, Git workflows, and containerization (Docker). Experience with cloud platforms (AWS, GCP, Azure) is a plus. Flexibility to work aligned with client time zones. Proactive problem-solving and self-driven development approach. Excellent communication skills to collaborate professionally with client teams. Performance Evaluation Plan Days 1–15: Probation & Onboarding Set up full-stack application environments and submit your first clean, maintainable pull request. Deliver a concise technical assessment of the existing codebase. Demonstrate velocity, ownership, and effective communication. Days 16–30: Deep Integration & Impact Deliver a complete user story to production. Resolve an integration challenge (e.g., React/React Native ↔ backend API). Maintain 90%+ test coverage and adhere to quality standards. 
Days 31–45: Optimization & Ownership Release a high-impact, production-ready feature. Address critical performance bottlenecks (Spring Boot, Supabase, MongoDB). Establish yourself as a trusted, proactive technical partner. Benefits 🏡 Fully Remote — Work from your preferred location 🌍 Global Exposure — Collaborate with fast-moving startups worldwide 🤝 Supportive Culture — Transparent, collaborative, and growth-oriented team 🎓 Certification Support — Timely reimbursement programs to boost your credentials 🚀 Performance-Focused Growth — Advancement based on impact, not tenure
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings! One of our esteemed clients is a Japanese multinational information technology (IT) service and consulting company headquartered in Tokyo, Japan. The company acquired Italy-based Value Team S.p.A. and launched Global One Teams. Join this dynamic, high-impact firm where innovation meets opportunity — and take your career to new heights!

🔍 We Are Hiring: Python, PySpark and SQL Developer (8-12 years)
Relevant Exp – 8-12 Years

JD -
• Python, PySpark and SQL
• 8+ years of experience in Spark, Scala, PySpark for big data processing
• Proficiency in Python programming for data manipulation and analysis.
• Experience with Python libraries such as Pandas, NumPy.
• Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL).
• Strong knowledge of SQL for querying databases.
• Experience with database systems like Lakehouse, PostgreSQL, Teradata, SQL Server.
• Ability to write complex SQL queries for data extraction and transformation.
• Strong analytical skills to interpret data and provide insights.
• Ability to troubleshoot and resolve data-related issues.
• Strong problem-solving skills to address data-related challenges.
• Effective communication skills to collaborate with cross-functional teams.

Role/Responsibilities:
• Work on development activities along with lead activities
• Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently
• Collaborate with other teams to understand data requirements and deliver solutions.
• Design, develop, and maintain scalable data pipelines using Python and PySpark.
• Utilize PySpark and Spark scripting for data processing and analysis.
• Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored.
• Develop and maintain Power BI reports and dashboards.
• Optimize data pipelines for performance and reliability.
• Integrate data from various sources into centralized data repositories.
• Ensure data quality and consistency across different data sets.
• Analyze large data sets to identify trends, patterns, and insights.
• Optimize PySpark applications for better performance and scalability.
• Continuously improve data processing workflows and infrastructure.

Interested candidates, please share your updated resume along with the following details:
Total Experience:
Relevant Experience in Python, PySpark and SQL:
Current Location:
Current CTC:
Expected CTC:
Notice Period:

🔒 We assure you that your profile will be handled with strict confidentiality.
📩 Apply now and be part of this incredible journey.

Thanks,
Syed Mohammad!!
syed.m@anlage.co.in
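To give a feel for the PySpark and Spark SQL skills listed above, here is a minimal, hypothetical pipeline sketch: it reads a raw extract, cleans it with the DataFrame API, aggregates through a Spark SQL view, and writes a curated output. The paths, table names, and columns are assumptions, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read a raw extract (path and schema are assumptions for illustration).
orders = spark.read.parquet("s3://raw-zone/orders/")

# DataFrame API transformation: filter bad rows and derive a date column.
cleaned = (
    orders
    .filter(F.col("order_amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

# Spark SQL over a temporary view for a daily aggregate.
cleaned.createOrReplaceTempView("orders_clean")
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_cnt, SUM(order_amount) AS revenue
    FROM orders_clean
    GROUP BY order_date
""")

# Write to the curated zone, partitioned for downstream querying.
daily.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/daily_orders/")
```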
Posted 1 week ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Associate

Job Description & Summary
A career within SAP Consulting services will provide you with the opportunity to help our clients maximise the value of their SAP investment with offerings that address sales, finance, supply chain, engineering, and human capital. We provide comprehensive consulting, system integration and implementation services across multiple SAP applications, products and technologies. Simply put, we focus on delivering business-led, technology-enabled change for our clients, including industry-specific enterprise resource planning and the latest in mobile, analytics and cloud solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Requirements
Key Skills: ETL tools (BODS, Syniti, SNP Crystal Bridge), Information Steward, LTMC, Data Migration concepts, SAP S/4HANA, Oracle, MS SQL, R, Python

Job Description
We are seeking ETL Specialists with specialized experience in managing and executing data migration projects within SAP S/4HANA environments. The ideal candidate will demonstrate hands-on expertise in data mapping, transformation, and loading processes, specifically between SAP ECC/other legacy systems and SAP S/4HANA systems, coupled with a strong grasp of data migration concepts.

Key Responsibilities
Minimum 2+ years of experience with at least two end-to-end Data Migration implementations within SAP S/4HANA environments.
In-depth understanding of ETL architecture and methodologies, with a strong focus on SAP S/4HANA data migration.
Proficient in designing and implementing Data Migration solutions using ETL tools and best practices tailored for SAP S/4HANA.
Expertise in data extraction, transformation, and loading processes between SAP ECC/other legacy systems and SAP S/4HANA, leveraging SQL and other relevant technologies.
Strong SQL skills for data manipulation, querying, and performance tuning within SAP environments.
Experience in data profiling, validation, and reconciliation during SAP S/4HANA migration projects.
Familiarity with Data Quality and Data Governance principles as applied to SAP data migration.
Exceptional problem-solving and analytical abilities with a proactive approach to issue resolution.
Excellent communication skills with the ability to collaborate effectively with clients and internal teams.

Mandatory Skill Sets: ETL tools (BODS, Syniti, SNP Crystal Bridge), Information Steward, LTMC, Data Migration concepts, SAP S/4HANA, Oracle, MS SQL, R, Python
Preferred Skill Sets: Certification(s) in relevant ETL tools (e.g., SAP Data Services, Syniti, SNP Crystal Bridge)
Years of experience required: 2-4 years
Education Qualification: BE/B.Tech/MBA/MCA; Bachelor’s degree in Computer Science, Information Technology, or a related field. Minimum 60% throughout career without any gaps and not in distance mode of learning.

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor Degree, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SAP ABAP (Advanced Business Application Programming)
Optional Skills: Accepting Feedback, Active Listening, Audit Documentation, Auditing, Business Administration, Communication, Compliance Advisement, Compliance and Governance, Compliance Auditing, Compliance Awareness, Compliance Oversight, Compliance Program Implementation, Compliance Record Keeping, Compliance Review, Compliance Technology, Corporate Governance, Corrective Actions, Data Analytics, Developing Policies and Guidelines, Emotional Regulation, Empathy, Governance Framework, Inclusion, Intellectual Curiosity {+ 16 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
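As a rough illustration of the validation and reconciliation work described above, the hypothetical snippet below compares row counts, business keys, and one mapped field between a legacy extract and an S/4HANA load file. File names and columns are assumptions; real migrations would normally lean on the ETL tool's own validation features.

```python
import pandas as pd

# Hypothetical extracts: legacy (ECC) source vs. target (S/4HANA) load file.
legacy = pd.read_csv("ecc_customer_extract.csv", dtype=str)
target = pd.read_csv("s4_customer_load.csv", dtype=str)

# Row-count reconciliation.
print(f"legacy rows: {len(legacy)}, target rows: {len(target)}")

# Key-level reconciliation: which business keys were dropped or added?
legacy_keys = set(legacy["customer_id"])
target_keys = set(target["customer_id"])
print("missing in target:", sorted(legacy_keys - target_keys)[:10])
print("unexpected in target:", sorted(target_keys - legacy_keys)[:10])

# Field-level spot check on a mapped attribute (assumed 1:1 mapping).
merged = legacy.merge(target, on="customer_id", suffixes=("_ecc", "_s4"))
mismatches = merged[merged["country_ecc"] != merged["country_s4"]]
print(f"country mismatches: {len(mismatches)}")
```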
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description TOC (Transportation Operation Center) is the central command and control center for ‘Transportation Execution’ across the Amazon Supply Chain network supporting multiple geographies like NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FC) and from Amazon FCs to carrier hubs. In case of any exceptions, TOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving, TOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, TOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon’s ability to serve its customers on time.

Purview of a Trans Ops Specialist
A Trans Ops Specialist at TOC facilitates the flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at TOC works across two verticals – Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that the freight is picked up on time and is delivered at the FC as per the given appointment. A Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle of pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as per promise. A Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises. A Trans Ops Specialist provides timely resolution to the issue at hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements, analyze data and notice trends, and be able to drive Customer Experience without compromising on time. The candidate should have a basic understanding of Logistics and should be able to communicate clearly in written and oral form. A Trans Ops Specialist should be able to ideate process improvements and should have the zeal to drive them to conclusion.

Key job responsibilities
Responsibilities Include, But Are Not Limited To
Communication with external customers (Carriers, Vendors/Suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers)
Ability to pull data from numerous databases (using Excel, Access, SQL and/or other data management systems) and to perform ad hoc reporting and analysis as needed is a plus.
Develop and/or understand performance metrics to assist with driving business results.
Ability to scope out business and functional requirements for the Amazon technology teams who create and enhance the software systems and tools used by TOC.
Must be able to quickly understand the business impact of the trends and make decisions that make sense based on available data.
Must be able to systematically escalate problems or variance in the information and data to the relevant owners and teams and follow through on the resolutions to ensure they are delivered.
Work within various time constraints to meet critical business needs, while measuring and identifying activities performed.
Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and the variances to goals. Providing real-time customer experience by working in a 24*7 operating environment.

Basic Qualifications
Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics
Experience with Excel

Preferred Qualifications
Experience with SQL

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2985239
Posted 1 week ago
3.0 years
4 - 16 Lacs
Gurgaon
On-site
Job Title: ETL + SQL Developer

Skills: ETL + SQL
· Experience with SQL and data querying languages.
· Knowledge of data governance frameworks and best practices.
· Familiarity with programming/scripting languages (e.g., SparkSQL).
· Strong understanding of data integration techniques and ETL processes.
· Experience with data quality tools and methodologies.
· Strong communication and problem-solving skills.

Detailed JD:
Data Integration: Manage the seamless integration of various data lakes, ensuring that jobs run as expected; validate the data ingested, track the DQ checks, and rerun/reprocess jobs in case of failures after identifying the root causes (RCAs).
Data Quality Assurance: Monitor and validate data quality during and after the migration process, implementing checks and corrective actions as needed.
Documentation: Maintain comprehensive documentation related to data issues encountered during weekly/monthly processing and operational procedures.
Continuous Improvement: Recommend and implement improvements to data processing, tools, and technologies to enhance efficiency and effectiveness.

Experience: 4-6 yrs
Job Types: Full-time, Permanent
Pay: ₹422,962.77 - ₹1,635,383.54 per year
Benefits: Health insurance, Paid sick time, Provident Fund
Schedule: Day shift
Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): What is your current Annual CTC in INR Lacs? What is your notice period in terms of days?
Experience: total work: 3 years (Preferred)
Work Location: In person
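The data-integration and data-quality duties above can be pictured with a small, hypothetical check harness: it validates an ingested batch against a couple of thresholds and flags the job for reprocessing when a check fails. The path, column names, and thresholds are assumptions for illustration only.

```python
import pandas as pd
from datetime import datetime, timedelta

def run_dq_checks(df: pd.DataFrame) -> list[str]:
    """Return the list of failed checks for an ingested batch (illustrative thresholds)."""
    failures = []
    if df.empty:
        failures.append("empty_batch")
    elif df["customer_id"].isna().mean() > 0.01:          # more than 1% null keys
        failures.append("null_key_threshold")
    if not df.empty:
        latest = pd.to_datetime(df["load_ts"]).max()
        if latest < datetime.utcnow() - timedelta(days=1):  # stale batch
            failures.append("freshness")
    return failures

batch = pd.read_parquet("landing/customers/batch_2024-01-15/")  # assumed landing path
failed = run_dq_checks(batch)
if failed:
    # In a real pipeline this would raise an alert and trigger a rerun/reprocess
    # after root-cause analysis; here we only log the outcome.
    print("DQ failures, flag batch for rerun:", failed)
else:
    print("DQ checks passed")
```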
Posted 1 week ago
5.0 years
3 - 9 Lacs
Mumbai
On-site
JOB DESCRIPTION Join our dynamic team as a Sr. Lead Software Engineer, where you will have the opportunity to solve complex problems and contribute to our innovative projects. With us, you can enhance your skills in Python, PySpark, and cloud architecture, while working in an inclusive and respectful team environment. This role offers immense growth potential and a chance to work with cutting-edge technologies. As a Sr. Lead Software Engineer - Python/Spark Big Data at JPMorgan Chase within the Capital Reporting product, you will be executing software solutions, designing, developing, and troubleshooting technical issues. We value diversity, equity, inclusion, and respect in our team culture. This role provides an opportunity to contribute to software engineering communities of practice and events that explore new and emerging technologies. You will have the chance to proactively identify hidden problems and patterns in data and use these insights to promote improvements to coding hygiene and system architecture.

Job responsibilities:
Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
Contributes to software engineering communities of practice and events that explore new and emerging technologies
Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
Formal training or certification on Python or PySpark concepts and 5+ years applied experience
Demonstrated knowledge of software applications and technical processes within a cloud or microservices architecture
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Overall knowledge of the Software Development Life Cycle
Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security

Preferred qualifications, capabilities, and skills:
Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka)
Experience with Big Data solutions or relational databases
Experience in the Financial Services industry is nice to have

ABOUT US
Posted 1 week ago
0 years
5 - 7 Lacs
Noida
On-site
We are seeking a proactive and results-driven Lead – Data Ops to drive operational efficiency, support cross-functional alignment, and ensure transparent communication of key metrics across the organization. The ideal candidate will be responsible for adopting new product features, ensuring support needs are translated into actionable product improvements, and developing performance indicators to track the health and efficiency of operations. Proficiency in SQL, Power BI & Excel-based reporting is essential.

Requirements
Collaborate with the Product team to adopt newly released efficiencies aimed at improving unit economics and process optimization.
Liaise with the Platform team to document and share support requirements. Ensure they are incorporated in Product Requirement Documents (PRDs) and prioritized based on business impact.
Own the end-to-end communication and reporting of OKRs and other key operational metrics to leadership, ensuring transparency and accountability.
Track progress and follow up with Data Ops Managers to ensure timely updates, alignment, and delivery of key objectives.
Drive a culture of ownership and collaboration across teams to ensure commitments are delivered with quality and within timelines.
Define and develop KPIs to measure the efficiency and effectiveness of operational programs, and use data insights to inform improvements.
Identify opportunities for improving operational processes and implement incremental changes to enhance performance tracking and team efficiency.
Utilize SQL for querying and analyzing operational datasets; proficiency with Power BI for creating dashboards and visual reports.

Required Skills
Good SQL, Power BI & Excel knowledge
An ambitious person who can work in a flexible startup environment with only one thing in mind - getting things done.
Excellent written and verbal communication skills.
Demonstrated ability to work with cross-functional teams in a fast-paced, dynamic environment.
Strong organizational and stakeholder management skills.
Ability to translate business requirements into actionable operational plans.

Benefits
Industry-Focused Certifications: Meet leading healthcare experts, discuss innovative strategies, and become a subject matter expert with our comprehensive set of certifications.
Rewards and Recognition: Feeling like you’re outperforming on your projects? Get recognition for your dedicated efforts and demonstrated work ethic.
Health Insurance and Mental Well-being: We offer health benefits and insurance to you and your family for hospital-related expenses pertaining to any illness, disease, or injury. We also have Employee Assistance Programs (EAPs) to give you 24X7 access to certified therapists and psychologists.
Sabbatical Leave Policy: Do you want to focus on skill development, pursue an academic career, or just reset? We’ve got you covered.
Open Floor Plan: Cubicles are a thing of the past and to modernize our office space, we have open floor sittings at every office location. Share ideas with your peers and bond better in an open floor office where there are no barriers and you are inspired to be creative.
Paternity and Maternity Leave: Enjoy the industry’s best parental leave policy to welcome your bundle of joy and enjoy quality time with them.
Paid Time Off: Maintain a healthy work–life balance and take time off from work to focus on your well-being and big life moments.
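To make the KPI-definition responsibility above concrete, here is a small, hypothetical sketch that computes an on-time completion rate per team from a task log and exports it for a Power BI dashboard; the dataset, columns, and weekly grain are assumptions rather than details of the role.

```python
import pandas as pd

# Hypothetical operations task log (file and columns are assumed for illustration).
tasks = pd.read_csv("dataops_tasks.csv", parse_dates=["due_at", "completed_at"])

tasks["on_time"] = tasks["completed_at"] <= tasks["due_at"]

# KPI: weekly on-time completion rate per team, plus volume handled.
kpi = (
    tasks
    .groupby([pd.Grouper(key="due_at", freq="W"), "team"])
    .agg(tasks_handled=("task_id", "count"), on_time_rate=("on_time", "mean"))
    .reset_index()
)

# Export for a Power BI dashboard refresh (a CSV hand-off is one simple option).
kpi.to_csv("kpi_on_time_rate.csv", index=False)
print(kpi.tail())
```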
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description Job Description for L2 Associates

Overview
The L2 associate acts as the primary interface between Amazon and our delivery partners, so to our delivery partners, you ARE Amazon. L2 Associates are expected to identify DP concerns, work on troubleshooting delivery partner issues, and provide process improvement suggestions.

Summary Of Responsibilities
Effectively communicate in a clear and professional manner at all times
Provide/expedite prompt and efficient service to Amazon customers/delivery partners
Effectively manage sensitive cases by reporting up the escalation matrix
Demonstrate excellent time-management skills
Maintain or exceed targeted performance metrics
Actively seek solutions through logical reasoning and identify trends to suggest process improvements

Key job responsibilities
A Transportation Representative provides timely resolution to the issue at hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements, analyze data and notice trends, and be able to drive Customer Experience without compromising on time. The candidate should have a basic understanding of Logistics and should be able to communicate clearly in written and verbal form.

Basic Qualifications
Qualitative Requirements
Graduation in any specialization from a recognized university.
Excellent communication skills (written and verbal) in the English language.
Ability to communicate correctly and clearly with all customers
Good comprehension skills – ability to clearly understand and state the issues customers present
Ability to concentrate – follow customers' issues without distraction to resolution
Work successfully in a team environment as well as independently
Familiarity with Windows XP, Microsoft Outlook, Microsoft Word and Internet Explorer
Excellent typing skills
Demonstrates an ability to successfully navigate websites
Demonstrates a proficient knowledge of email applications

Preferred Qualifications
Logistics background and experience in a similar role
Proficient in Excel

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ASSPL - Telangana Job ID: A2985240
Posted 1 week ago
4.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
Remote
What You Can Expect The Marketing Data Operations Manager bridges marketing goals with technical data capabilities. You’ll define and implement audience segmentation, campaign targeting, and data structures using marketing automation tools (Pardot, Marketo, Eloqua), SFDC CRM, advanced SQL, and Tableau. This role demands both strategic insight and hands-on technical execution, enabling scalable, personalized marketing initiatives. About The Team The Marketing Data Operations Team covers the management and maintenance of the infrastructure that provides the Marketing Database for the organization. This also involves the processes and procedures used to collect, ingest, process, store, analyse and distribute data effectively and efficiently for Enterprise Marketing. What We’re Looking For Have a Bachelor's in computer science, data science, marketing technology, or a related discipline. 4+ years of experience working with B2B/B2C marketing data across CRM systems, automation tools, and data environments. Have experience with Pardot, Marketo, or Eloqua for building and executing targeted marketing campaigns. Demonstrate proficiency in SFDC (Salesforce CRM), including data model understanding and campaign integration. Have advanced SQL skills for querying, data transformation, and audience building in complex datasets. Have skills in Tableau or similar tools for data visualization and storytelling with marketing performance data. Have advanced Excel proficiency including pivot tables, VLOOKUPs, formulas, and data manipulation techniques. Have excellent stakeholder management, communication, and presentation skills to bridge marketing, analytics, and technical teams. Ways of Working Our structured hybrid approach is centered around our offices and remote work environments. The work style of each role, Hybrid, Remote, or In-Person is indicated in the job description/posting. Benefits As part of our award-winning workplace culture and commitment to delivering happiness, our benefits program offers a variety of perks, benefits, and options to help employees maintain their physical, mental, emotional, and financial health; support work-life balance; and contribute to their community in meaningful ways. Click Learn for more information. About Us Zoomies help people stay connected so they can get more done together. We set out to build the best collaboration platform for the enterprise, and today help people communicate better with products like Zoom Contact Center, Zoom Phone, Zoom Events, Zoom Apps, Zoom Rooms, and Zoom Webinars. We’re problem-solvers, working at a fast pace to design solutions with our customers and users in mind. Find room to grow with opportunities to stretch your skills and advance your career in a collaborative, growth-focused environment. Our Commitment At Zoom, we believe great work happens when people feel supported and empowered. We’re committed to fair hiring practices that ensure every candidate is evaluated based on skills, experience, and potential. If you require an accommodation during the hiring process, let us know—we’re here to support you at every step. If you need assistance navigating the interview process due to a medical disability, please submit an Accommodations Request Form and someone from our team will reach out soon. This form is solely for applicants who require an accommodation due to a qualifying medical disability. Non-accommodation-related requests, such as application follow-ups or technical issues, will not be addressed.
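Since the role above centers on advanced SQL for audience building in CRM data, the sketch below shows the general shape of such a segmentation query, run against an in-memory SQLite table purely for illustration; the schema, field names, and segment rules are assumptions, not Salesforce or Marketo specifics.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE contacts (
        contact_id INTEGER PRIMARY KEY,
        industry TEXT, country TEXT,
        last_engaged_days INTEGER, opted_in INTEGER
    );
    INSERT INTO contacts VALUES
        (1, 'Healthcare', 'US', 12, 1),
        (2, 'Retail',     'IN', 200, 1),
        (3, 'Healthcare', 'US', 45, 0);
""")

-- comment syntax note: the segmentation logic itself lives in the SQL string below
segment = conn.execute("""
    SELECT contact_id
    FROM contacts
    WHERE opted_in = 1
      AND industry = 'Healthcare'
      AND last_engaged_days <= 90
""").fetchall()

print("audience size:", len(segment))   # -> 1 in this toy dataset
```

In practice the same WHERE-clause logic would be expressed against the CRM or marketing-automation data model and the resulting list synced back as a campaign audience.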
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description Come be a part of a rapidly expanding $35 billion global business. At Amazon Business, a fast-growing startup passionate about building solutions, we set out every day to innovate and disrupt the status quo. We stand at the intersection of tech & retail in the B2B space developing innovative purchasing and procurement solutions to help businesses and organizations thrive. At Amazon Business, we strive to be the most recognized and preferred strategic partner for smart business buying. Bring your insight, imagination and a healthy disregard for the impossible. Join us in building and celebrating the value of Amazon Business to buyers and sellers of all sizes and industries. Unlock your career potential.

Key job responsibilities
Own generating actionable insights through the development of metrics and dashboards.
Analyze relevant business information, and uncover trends and correlations to develop insights that can materially improve our product and strategy decisions.
Provide insights to our Canada business and Product Management teams as new initiatives are being identified, prioritized, implemented and deployed.
Develop clear communications for recommended actions.
Establish new, scalable, efficient, automated processes for tracking and reporting on progress of initiatives.
Work to accelerate Amazon Business machine learning efforts to scale globally.

About The Team
We are the central Amazon Business Marketing Analytics (ABMA) team for the WW AB Marketing team. Our vision is to simplify and accelerate data-driven decision making for AB Marketing by providing cost effective, easy & timely access to high quality data. Our core responsibilities towards AB Marketing include a) providing systems (infrastructure) & workflows that allow ingestion, storage, processing and querying of data, b) building ready-to-use datasets for easy and faster access to the data, c) automating standard business analysis/reporting/dashboarding, and d) empowering the business with self-service tools for deep dives & insight seeking. The value construct of these capabilities is to enhance the speed of business decision-making by reducing the effort & time to access actionable data.

Basic Qualifications
3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience in Statistical Analysis packages such as R, SAS and Matlab
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad - A85 Job ID: A2985257
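The responsibilities above include uncovering trends and correlations in marketing data. A minimal, hypothetical pandas sketch of that kind of analysis might look like the following; the dataset and columns are assumptions, and in practice the data would typically be pulled from Redshift or a similar warehouse via SQL first.

```python
import pandas as pd

# Hypothetical daily marketing metrics pulled from a warehouse extract.
metrics = pd.read_csv("ab_marketing_daily.csv", parse_dates=["day"]).set_index("day")

# Trend: 7-day rolling average of new registrations vs. raw daily values.
metrics["registrations_7d"] = metrics["registrations"].rolling(7).mean()

# Correlation between spend and registrations, aggregated to weeks.
weekly = metrics[["ad_spend", "registrations"]].resample("W").sum()
print("spend vs. registrations correlation:",
      round(weekly["ad_spend"].corr(weekly["registrations"]), 2))

# Flag weeks where registrations dropped more than 20% from the prior week.
weekly["wow_change"] = weekly["registrations"].pct_change()
print(weekly[weekly["wow_change"] < -0.2])
```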
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Overview
JB Poindexter (India) Private Limited is a subsidiary of J.B. Poindexter & Co., Inc., a privately held diversified manufacturing company forecasting $2.4B in annual revenue and 8,000 team members in 2024. The eight operating subsidiaries, covering over 50 locations, are engaged in the production of commercial truck bodies, step-vans, utility trucks, funeral coaches, limousines, pickup truck bed enclosures, precision machining, and expandable foam plastic packaging. For more information, visit www.jbpoindexter.com. JB Poindexter (India) Private Limited is the captive shared services unit of J.B. Poindexter & Co., Inc. The company is wholly owned by J. B. Poindexter & Co., Inc. and is headquartered in Houston, Texas, USA.

Position Overview
This position will be responsible for all technical aspects of Power BI report development, either as an individual contributor or as a team player supporting the Power BI lead in the creation and ongoing refinement of reports for various departments across business units.

Responsibilities
Power BI report development.
Building Analysis Services reporting models.
Developing visual reports, KPI scorecards, and dashboards using Power BI Desktop.
Connecting data sources, importing data, and transforming data for business intelligence.
Analytical thinking for translating data into informative reports and visuals.
Capable of implementing row-level security on data along with an understanding of application security layer models in Power BI.
Should have an edge over making DAX queries in Power BI Desktop.
Expert in using advanced-level calculations on the data set.
Responsible for design methodology and project documentation.
Should be able to develop tabular and multidimensional models that are compatible with data warehouse standards.
Very good communication skills; must be able to discuss the requirements effectively with client teams and internal teams.

Requirements
Should have 12-16 years of experience with BI tools and systems such as Power BI, Tableau, and ERP (JD Edwards/SAP, etc.); experience with JDE will be preferred.
Knowledge in the Microsoft BI stack
Grip over data analytics
Should possess software development skills
Data Source Connectivity: Knowledge of connecting to various data sources, such as databases, cloud services, and APIs
Power Query: Ability to use Power Query for data extraction and transformation
SQL: Familiarity with SQL for querying databases.
Nice to have Microsoft Certified: Data Analyst Associate: This certification covers Power BI in detail and demonstrates your proficiency in data analysis using Power BI.
Nice to have Microsoft Certified: Power BI Certified: This certification specifically focuses on Power BI and certifies your skills in data modelling, visualizations, and DAX.

Code Of Ethics
JB Poindexter (India) Private Limited requires the highest standard of ethics in all business dealings, with customers, suppliers, advisors, employees, and authorities. This position shall actively ensure that his/her own activities and those of all employees within the project meet this obligation. JBPCO critical standards and procedures related to expected conduct are detailed on the company website. This position is expected to be familiar with these policies and ensure that they are implemented in all areas of control.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We're looking for a Principal Software Engineer. This role is office based, Hyderabad office. We are looking for a Principal Software Engineer for our Product Engineering team.

In this role, you will…
Develop, maintain and enhance .NET applications and services to contribute to our legacy and cloud platform. Analyze product and technical user stories and convey technical specifications in a concise and effective manner. Code & deliver a working product, with a ‘first time right’ approach. Participate in release planning, sprint planning, and technical design reviews; provide input as appropriate. Partner with engineers, product managers, and other team members as appropriate and be the go-to person for technical matters. Develop and maintain thorough knowledge and understanding of products. Lead projects as necessary, increasing team productivity and effectiveness by sharing your deep knowledge and experience. Drive key architectural decisions and design considerations. Partner with other Architects and Managers to set technical guidelines, and participate in code reviews to mentor other engineers on best practices. Partner with Product to do early feasibility of technical architecture. Partner with other Architects to build necessary frameworks to improve productivity of the engineers by driving automation. Introduce newer technologies as needed along with a strong POC and build a strong use case for wider adoption. Troubleshoot complex production issues and provide detailed RCA. Participate in agile activities like sprint planning and technical design reviews; provide input as appropriate. AI-Driven Software Architecture: Design, develop, and implement scalable, maintainable, and high-performance AI-powered software systems. Integrate AI models and algorithms into software applications to deliver intelligent solutions.

You’ve Got What It Takes If You Have…
Bachelor’s or master’s degree in computer science or a related field. 8+ years of experience with active hands-on development experience in the Microsoft technology stack using C#. Strong experience developing microservices and RESTful services (preferably on AWS). Experience with AWS core services: Lambda, ECS (Elastic Container Service), SNS (Simple Notification Service), SQS (Simple Queue Service), DynamoDB. Expertise in CloudWatch (monitoring and logging), cost management tools, and IAM for managing user permissions and security protocols. Knowledge of Elasticsearch and querying logs in Splunk. Exposure to ORMs like Entity Framework, NHibernate or similar. Strong TDD approach and hands-on experience with tools like NUnit, xUnit or any other testing tools or frameworks, and CI/CD practices. Strong in OOP and SOLID design principles. Understand AWS core services and basic architecture best practices. Experience working on projects with public cloud providers like Amazon Web Services, Azure, Google Cloud, etc. Highly efficient data persistence design techniques. Strong understanding of data retrieval performance (queries, caching). Able to optimize designs/queries for scale. Proficient experience with relational databases such as Microsoft SQL Server/Postgres. Exposure to other non-relational DBs like MongoDB is a plus! Good understanding of how to deal with concurrency and parallel work streams. Should have work experience with Agile practices. Should be very good at analyzing and debugging/troubleshooting functional and technical issues. Should have good insight on performance/optimization techniques.
Good understanding of secure development practices; proactively codes to avoid security issues and is able to resolve all findings. Excellent analytical, quantitative and problem-solving abilities. Conversant in algorithms, software design patterns and microservices, and their best usage. Ability to build frameworks and POCs from scratch that can be used across the teams. Self-motivated, requiring minimal oversight. Good team player with the ability to handle multiple concurrent priorities in a fast-paced environment. Strong interpersonal, written, and oral communication skills. Passion for continuous process and technology improvement. AWS experience is a must and certification is preferable.

Our Culture
Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now – is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone – anywhere – to learn, grow and advance. To be better tomorrow than they are today.

Who We Are
Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and in nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today. Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description: Data Engineer

RESPONSIBILITIES/TASKS:
Work with Technical Leads and Architects to analyse solutions.
Translate complex business requirements into tangible data requirements through collaborative work with both business and technical subject matter experts.
Develop/modify data models with an eye towards high performance, scalability, flexibility, and usability.
Ensure data models are in alignment with the overall architecture standards.
Create source-to-target mapping documentation.
Provide subject matter knowledge and guidance in source system analysis and ETL build.
Responsible for overseeing data integrity in the Smart Conductor system.
Serve as data flow and enrichment “owner” with deep expertise in data dynamics, capable of recognizing and elevating improvement opportunities early in the process.
Work with product owners to understand business reporting requirements and deliver appropriate insights on a regular basis.
Responsible for system configuration to deliver reports, data visualizations, and other solution components.

SKILLS REQUIRED:
More than 5 years of software development experience
Proficient in Power BI/Tableau, Google Data Studio, R, SQL, Python
Strong knowledge of cloud computing and experience in Microsoft Azure – Azure ML Studio, Azure Machine Learning
Strong knowledge in SSIS
Proficient in Azure services - Azure Data Factory, Synapse, Data Lake
Experience querying, analysing, or managing data required!
Experience within the healthcare insurance industry with payer data strongly preferred
Experience in data cleansing, data engineering, data enrichment, data warehousing/Business Intelligence preferred
Strong analytical, problem solving and planning skills
Strong organizational and presentation skills
Excellent interpersonal and communication skills
Ability to multi-task in a fast-paced environment
Flexibility to adapt readily to changing business needs in a fast-paced environment
Team player who is delivery-oriented and takes responsibility for the team’s success
Enthusiastic, can-do attitude with the drive to continually learn and improve
Knowledge of Agile and/or SCRUM methodologies

Skills: ETL, SQL, Azure
Posted 1 week ago
10.0 - 13.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things. Job Title: Sr Data Engineering Location: Bangalore Business & Team: Technology Team is responsible for the world leading application of technology and operations across every aspect of CommBank, from innovative product platforms for our customers to essential tools within our business. We also use technology to drive efficient and timely processing, an essential component of great customer service. CommBank is recognised as leading the industry in IT and operations with its world-class platforms and processes, agile IT infrastructure, and innovation in everything from payments to internet banking and mobile apps. The Group Security (GS) team protects the Bank and our customers from cyber compromise, through proactive management of cyber security, privacy, and operational risk. Our team includes: Cyber Strategy & Performance Cyber Security Centre Cyber Protection & Design Cyber Delivery Cyber Data Engineering Cyber Data Security Identity & Access Technology The Group Security Senior Data Engineering team provides specialised data services and platforms for the CommBank group & is accountable for developing Group’s data strategy, data policy & standards, governance and set requirements for data enablers/tools. The team is also accountable to facilitate a community of practitioners to share best practice and build data talent and capabilities. Impact & contribution :- To ensure the Group achieves a sustainable competitive advantage through data engineering, you will play a key role in supporting and executing the Group's data strategy. We are looking for an experienced Data Engineer to join our Group Security Team, which is part of the wider Cyber Security Engineering practice. In this role, you will be responsible for setting up the Group Security Data Platform to ingest data from various organizations' security telemetry data, along with additional data assets and data products. This platform will provide security controls and services leveraged across the Group. Roles & Responsibilities You will be expected to perform the following tasks in a manner consistent with CBA’s Values and People Capabilities. CORE RESPONSIBILITIES: Possesses hands-on technical experience working in AWS. The individual should have knowledge about AWS services like EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, SecretManager, KMS, Step functions, SQS,SNS, Cloud Watch. The individual should possess a robust set of technical and soft skills and be an excellent AWS Data Engineer with a focus on complex Automation and Engineering Framework development. Being well-versed in Python is mandatory, and experience in developing complex frameworks using Python is required. Passionate about Cloud/DevSecOps/Automation and possess a keen interest in solving complex problems systematically. Drive the development and implementation of scalable data solutions and data pipelines using various AWS services. Possess the ability to work independently and collaborate closely with team members and technology leads. 
Exhibit a proactive approach, constantly seeking innovative solutions to complex technical challenges. Can take responsibility for nominated technical assets related to areas of expertise, including roadmaps and technical direction. Can own and develop technical strategy, overseeing medium to complex engineering initiatives. Essential Skills:- About 10-13 years of experience as a Data Engineering professional in a data-intensive environment. The individual should have strong analytical and reasoning skills in the relevant area. Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, SecretManager, Step functions, SQS,SNS, Cloud Watch. Excellent skills in Python-based framework development are mandatory. Proficiency in SQL for efficient querying, managing databases, handling complex queries, and optimizing query performance. Excellent automation skills are expected in areas such as Automating the testing framework using tools such as PyPy, Pytest, and various test cases including unit, integration, functional tests, and mockups. Automating the data pipeline and expediting tasks such as data ingestion and transformation. API-based automated and integrated calls(REST, cURL, authentication & authorization, tokens, pagination, openApi, Swagger) Implementing advanced engineering techniques and handling ad hoc requests to automate processes on demand. Implementing automated and secured file transfer protocols like XCOM, FTP, SFTP, and HTTP/S Experience with Terraform, Jenkins, Teracity and Artifactory is essential as part of DevOps. Additionally, Docker and Kubernetes are also considered. Proficiency in building orchestration workflows using Apache Airflow. Strong understanding of streaming data processing concepts, including event-driven architectures. Familiarity with CI/CD pipeline development, such as Jenkins. Extensive experience and understanding in Data Modelling, SCD Types, Data Warehousing, and ETL processes. Excellent experience with GitHub or any preferred version control systems. Expertise in data pipeline development using various data formats/types. Mandatory knowledge and experience in big data processing using PySpark/Spark and performance optimizations of applications Proficiency in handling various file formats (CSV, JSON, XML, Parquet, Avro, and ORC) and automating processes in the big data environment. Ability to use Linux/Unix environments for development and testing. Should be aware of security best practices to protect data and infrastructure, including encryption, tokenization, masking, firewalls, and security zones. Well-structured documentation skills and the ability to create a well-defined knowledge base. Certifications such as AWS Certified Data Analytics/Engineer/Developer – Specialty or AWS Certified Solutions Architect. Should be able to perform extreme engineering and design a robust, efficient, and cost-effective data engineering pipelines which are highly available and dynamically scalable on demand. Enable the systems to effectively respond to high demands and heavy loads maintaining the high throughput and high I/O performance with no data loss Own and lead E2E Data engineering life cycle right from Requirement gathering, design, develop, test, deliver and support as part of DevSecOPS process. 
Must demonstrate the skills and mindset to implement encryption methodologies such as SSL/TLS, data encryption at rest and in transit, and other data security best practices. Hands-on work experience with data design tools such as Erwin, with demonstrated ability to build data models, data warehouses, data lakes, data assets, and data products. Must be able to constructively challenge the status quo, lead the establishment of data governance and metadata management, ask the right questions, and design with the right principles. Education Qualification: A Bachelor's or Master's degree in Engineering, specialising in Computer Science, Information Technology, or a relevant qualification. If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696. Advertising End Date: 29/06/2025
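To illustrate the orchestration skills this posting highlights (Python-based automation and Apache Airflow workflows), here is a minimal, hedged sketch of an Airflow 2.x DAG. The DAG id, task names, and the placeholder ingest/transform steps are hypothetical and are not taken from the posting.

```python
# Minimal sketch of an Airflow 2.x DAG wiring an ingest step to a transform step.
# All names (dag_id, task ids, callables) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_telemetry(**context):
    # Placeholder: land a day's worth of raw security telemetry in a staging zone.
    print(f"Ingesting telemetry for {context['ds']}")


def transform_telemetry(**context):
    # Placeholder: convert the raw landing files into a curated, partitioned format.
    print(f"Transforming telemetry for {context['ds']}")


with DAG(
    dag_id="security_telemetry_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow releases before 2.4
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_telemetry)
    transform = PythonOperator(task_id="transform", python_callable=transform_telemetry)

    ingest >> transform  # the transform task runs only after ingest succeeds
```

In practice the callables would hand off to services such as Glue or EMR; the point of the sketch is only the dependency wiring and the daily schedule.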
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within the Employee Platforms team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job Responsibilities Executes standard software solutions, design, development, and technical troubleshooting Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems Adds to team culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification in software engineering concepts and 2+ years of applied experience. Exposure to product engineering or production/platform support activities, with a good understanding of scalability, security, and reliability. Hands-on practical experience in system design, application development, testing, and operational stability Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Demonstrable ability to code in one or more languages Experience across the whole Software Development Life Cycle Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Exposure to cloud technologies ABOUT US
Posted 1 week ago
4.0 years
0 Lacs
India
Remote
Job Title: Data Scientist Location: Remote Job Type: Full-Time | Permanent Experience Required: 4+ Years About the Role: We are looking for a highly motivated and analytical Data Scientist with 4+ years of industry experience to join our data team. The ideal candidate will have a strong background in Python and SQL and experience deploying machine learning models using AWS SageMaker. You will be responsible for solving complex business problems with data-driven solutions, developing models, and helping scale machine learning systems into production environments. Key Responsibilities: Model Development: Design, develop, and validate machine learning models for classification, regression, and clustering tasks. Work with structured and unstructured data to extract actionable insights and drive business outcomes. Deployment & MLOps: Deploy machine learning models using AWS SageMaker, including model training, tuning, hosting, and monitoring. Build reusable pipelines for model deployment, automation, and performance tracking. Data Exploration & Feature Engineering: Perform data wrangling, preprocessing, and feature engineering using Python and SQL. Conduct exploratory data analysis (EDA) to identify patterns and anomalies. Collaboration: Work closely with data engineers, product managers, and business stakeholders to define data problems and deliver scalable solutions. Present model results and insights to both technical and non-technical audiences. Continuous Improvement: Stay updated on the latest advancements in machine learning, AI, and cloud technologies. Suggest and implement best practices for experimentation, model governance, and documentation. Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field. 4+ years of hands-on experience in data science, machine learning, or applied AI roles. Proficiency in Python for data analysis, model development, and scripting. Strong SQL skills for querying and manipulating large datasets. Hands-on experience with AWS SageMaker, including model training, deployment, and monitoring. Solid understanding of machine learning algorithms and techniques (supervised/unsupervised). Familiarity with libraries such as Pandas, NumPy, Scikit-learn, Matplotlib, and Seaborn.
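This posting leans on pandas, scikit-learn, and SageMaker-based deployment. As a hedged illustration of the local modelling portion only, the sketch below trains a classifier on a synthetic dataframe; every column name and hyperparameter is made up, and in this role the fitted model would typically be packaged for an AWS SageMaker endpoint rather than scored locally.

```python
# Minimal sketch: feature engineering + model training with pandas and scikit-learn.
# The dataframe columns, target definition, and hyperparameters are illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "tenure_months": rng.integers(1, 60, size=1_000),
    "monthly_spend": rng.gamma(2.0, 50.0, size=1_000),
    "support_tickets": rng.poisson(1.5, size=1_000),
})
# Synthetic target: churn is more likely for short-tenure, high-ticket customers.
df["churned"] = ((df["support_tickets"] > 2) & (df["tenure_months"] < 12)).astype(int)

X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```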
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Key Responsibilities Reporting and Planning Prepare management reporting packs in a timely and clear fashion, with risks, opportunities, and insights identified Provide comprehensive review and commentary at the cost-center level Provide senior leadership with monthly financial metrics spanning revenue, P&L, customer margin, product margin, and cash flow, covering forecast, actuals, and budget comparisons Drive annual budgets and forecasts, including roll-out of regular reforecasting cycles and processes Manage financial aspects of special projects by determining key drivers, driving the right analyses, and presenting the conclusions Produce models to project long-term growth and determine the impacting business factors Leadership Manage a team of FP&A analysts Build cross-functional excellence among departments Be a trusted source of truth to business leaders, partnering to support business needs Drive understanding of business metrics across both revenue and expense trends Present to senior leadership, sharing analysis to enable superior business decisions Process Improvement Support the FP&A transformation agenda with a focus on continuous improvement in efficiency through automation, standardization, and optimization, including choice of tools and methodologies Develop and implement process improvements that scale and amplify the impact of the team’s workstreams Actively participate in the implementation of and adherence to new finance requirements, systems, and processes Requirements Bachelor’s degree in Finance, Accounting, or a business-related field with a minimum of 7 years of experience in financial analysis, financial planning, forecasting, or related roles. Experience leading and coaching FP&A teams Intellectually curious with enthusiasm for solving problems and working collaboratively Superior analytical skills and attention to detail, with experience in complex financial modelling Excellent business partner who can build advocacy with stakeholders and then drive implementation and track your and your team’s progress Critical thinker who is able to identify solutions to complex problems both independently and as part of a team. Excellent written and verbal communication skills with the ability to present findings clearly to finance and non-finance stakeholders. Capability to lead global projects and initiatives, meeting tight deadlines in a fast-paced environment. Ability to excel in a highly matrixed organization Experience with automation and data visualization tools Preferred Qualifications CA, CPA, or CIMA financial qualifications. Experience in the SaaS or technology industry. Knowledge of securities, financial markets, or fintech, or a strong desire to learn. Experience with accounting software (NetSuite, SAP, etc.) and cloud-based financial planning software (Adaptive Insights), or a willingness to learn. Knowledge of SQL and database querying, or an eagerness to develop these skills.
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As an Android Developer (Software Engineer II) at JPMorgan Chase within the Commercial & Investment Bank Payments Technology team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job Responsibilities Executes standard software solutions, design, development, and technical troubleshooting Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems Adds to team culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 2+ years applied experience Extensive experience using common Android libraries such as Jetpack Compose and Kotlin Coroutines Hands-on practical experience in system design, application development, testing, and operational stability Advanced in one or more programming language(s) Proficiency in automation and continuous delivery methods Proficient in all aspects of the Software Development Life Cycle Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Practical knowledge of Android platform security best practices Experience building reusable libraries, components, and APIs Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Preferred Qualifications, Capabilities, And Skills Experience working with CI/CD tools like Jenkins Knowledge of cloud technologies (AWS, Firebase, etc.). Familiarity with Kotlin, Compose, and dependency management. Familiarity with web application development. ABOUT US
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer & Community Banking Team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 3+ years applied experience in Java, AWS, microservices, and Spring Boot Hands-on experience with AWS EC2, AWS Lambda, AWS KMS, AWS S3, SQS, and EventBridge Proficient in coding in one or more languages Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Hands-on experience with Cassandra and PostgreSQL Hands-on experience with cloud infrastructure provisioning tools such as Terraform and CloudFormation Overall knowledge of the Software Development Life Cycle Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Exposure to cloud technologies ABOUT US
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer and Community Banking Technology team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 3+ years applied experience Hands-on practical experience in system design, application development, testing, and operational stability Proficient in coding in one or more languages Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Overall knowledge of the Software Development Life Cycle Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Exposure to cloud technologies ABOUT US
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description You thrive on diversity and creativity, and we welcome individuals who share our vision of making a lasting impact. Your unique combination of design thinking and experience will help us achieve new heights. As a Data Engineer II at JPMorgan Chase within the Consumer & Community Banking Team, you are part of an agile team that works to enhance, design, and deliver the data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As an emerging member of a data engineering team, you execute data solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job Responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Supports review of controls to ensure sufficient protection of enterprise data Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request Updates logical or physical data models based on new use cases Frequently uses SQL and understands NoSQL databases and their niche in the marketplace Adds to team culture of diversity, equity, inclusion, and respect Contributes to software and data engineering communities of practice and events that explore new and emerging technologies Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Required Qualifications, Capabilities, And Skills Formal training or certification on data engineering concepts and 3+ years applied experience in AWS and Kubernetes Proficiency in one or more large-scale data processing distributions such as Spark (Java) or PySpark, along with knowledge of Data Pipeline (DPL), data modeling, data warehousing, data migration, and so on. Experience across the data lifecycle, along with expertise in consuming data in any of: batch (file), near real-time (IBM MQ, Apache Kafka), or streaming (AWS Kinesis, MSK) Good at SQL (e.g., joins and aggregations) Working understanding of NoSQL databases Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Significant experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis Experience customizing changes in a tool to generate a product Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Experience designing and building REST API services using Java Exposure to cloud technologies - knowledge of hybrid cloud architectures is highly desirable. AWS Developer/Solutions Architect Certification is highly desired About Us JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. 
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
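The Data Engineer II posting above asks for PySpark plus SQL-style joins and aggregations. A minimal, hedged PySpark sketch of a batch join and aggregation follows; the S3 paths, dataset names, and columns are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark sketch of a batch join + aggregation over Parquet files.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_enrichment").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")        # placeholder path
customers = spark.read.parquet("s3://example-bucket/raw/customers/")  # placeholder path

daily_spend = (
    orders.join(customers, on="customer_id", how="inner")
          .groupBy("customer_id", "order_date")
          .agg(
              F.sum("order_amount").alias("total_spend"),
              F.count("order_id").alias("order_count"),
          )
)

# Write back partitioned by date so downstream readers can prune partitions.
daily_spend.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_spend/"
)
```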
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Join us as we embark on a journey of collaboration and innovation, where your unique skills and talents will be valued and celebrated. Together we will create a brighter future and make a meaningful difference. As a Lead Data Engineer at JPMorgan Chase within CCB (Connected Commerce), you are an integral part of an agile team that works to enhance, build, and deliver data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As a core technical contributor, you are responsible for maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Architect and oversee the design of complex data solutions that meet diverse business needs and customer requirements. Guide the evolution of logical and physical data models to support emerging business use cases and technological advancements. Build and manage end-to-end cloud-native data pipelines in AWS, leveraging your hands-on expertise with AWS components. Build analytical systems from the ground up, providing architectural direction, translating business issues into specific requirements, and identifying appropriate data to support solutions. Work across the Service Delivery Lifecycle on engineering major/minor enhancements and ongoing maintenance of existing applications. Conduct feasibility studies, capacity planning, and process redesign/re-engineering of complex integration solutions. Help others build code to extract raw data, coach the team on techniques to validate its quality, and apply your deep data knowledge to ensure the correct data is ingested across the pipeline. Guide the development of data tools used to transform, manage, and access data, and advise the team on writing and validating code to test the storage and availability of data platforms for resilience. Oversee the implementation of performance monitoring protocols across data pipelines, coaching the team on building visualizations and aggregations to monitor pipeline health. Coach others on implementing solutions and self-healing processes that minimize points of failure across multiple product features. Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 5+ years applied experience Extensive experience in managing the full lifecycle of data, from collection and storage to analysis and reporting. Proficiency in one or more large-scale data processing distributions such as Spark (Java), along with knowledge of Data Pipeline (DPL), data modeling, data warehousing, data migration, and so on. Hands-on practical experience in system design, application development, testing, and operational stability Proficient in coding in one or more modern programming languages Should have good hands-on experience with AWS services and their components, along with a good understanding of Kubernetes. Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. 
Strong understanding of domain-driven design, microservices patterns, and architecture Overall knowledge of the Software Development Life Cycle, along with experience with IBM MQ and Apache Kafka Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, LLMs, etc.) Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Experience designing and building REST API services using Java Exposure to cloud technologies - knowledge of hybrid cloud architectures is highly desirable. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
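The lead data engineering role above emphasises near-real-time consumption from Apache Kafka and monitoring of pipeline health. As a hedged, Python-only sketch (the production stack described is likely JVM-based), the snippet below reads JSON events with the kafka-python client; the topic, brokers, and consumer group are hypothetical.

```python
# Minimal sketch of near-real-time consumption from Apache Kafka using kafka-python.
# Topic name, brokers, and group id are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "transactions",                              # placeholder topic
    bootstrap_servers=["broker1:9092"],          # placeholder broker list
    group_id="pipeline-health-monitor",          # placeholder consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Placeholder processing step: a real pipeline might validate the record,
    # emit a health metric, and hand the event to a downstream writer.
    print(f"partition={message.partition} offset={message.offset} keys={list(event)}")
```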
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description As a Software Engineer III - Java & AWS at JPMorgan Chase within Consumer and Community Banking, specifically on the Payments team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Execute software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Create secure and high-quality production code and maintain algorithms that run synchronously with appropriate systems. Produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. Proactively identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture. Contribute to software engineering communities of practice and events that explore new and emerging technologies. Add to team culture of diversity, equity, inclusion, and respect. Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 3+ years applied experience. Hands-on practical experience in system design, application development, testing, and operational stability. Proficiency in Java/J2EE, REST APIs, and web services, and experience in building event-driven microservices and Kafka streaming. Experience with the Spring Framework, Spring Boot, and AWS services in public cloud infrastructure. Experience in developing standard unit testing frameworks, automated functional tests, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Working proficiency in development toolsets such as Git/Bitbucket, JIRA, and Maven Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred Qualifications, Capabilities, And Skills Finance domain experience with the United States banking and payments industry. Knowledge of and working experience with card network payments, fraud, and settlement. Development experience with Java microservices applications on AWS/public cloud platforms. Certifications in Java programming or related technologies (e.g., Oracle Certified Professional, Spring Certification). Certifications in AWS (e.g., AWS Certified Solutions Architect – Associate).
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description We have an exciting and rewarding opportunity for you to take your data engineering career to the next level. Be part of a dynamic team where your distinctive skills will contribute to a winning culture and team. As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Team, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Supports review of controls to ensure sufficient protection of enterprise data Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request Updates logical or physical data models based on new use cases Frequently uses SQL and understands NoSQL databases and their niche in the marketplace Adds to team culture of diversity, equity, inclusion, and respect Contributes to software and data engineering communities of practice and events that explore new and emerging technologies Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Required Qualifications, Capabilities, And Skills Formal training or certification on data engineering concepts and 3+ years applied experience in AWS and Kubernetes Proficiency in one or more large-scale data processing distributions such as Spark (Java) or PySpark, along with knowledge of Data Pipeline (DPL), data modeling, data warehousing, data migration, and so on. Experience across the data lifecycle, along with expertise in consuming data in any of: batch (file), near real-time (IBM MQ, Apache Kafka), or streaming (AWS Kinesis, MSK) Advanced at SQL (e.g., joins and aggregations) Working understanding of NoSQL databases Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Significant experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis Experience customizing changes in a tool to generate a product Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Experience designing and building REST API services using Java Exposure to cloud technologies - knowledge of hybrid cloud architectures is highly desirable. AWS Developer/Solutions Architect Certification is highly desired About Us JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. 
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
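The Data Engineer III posting above lists streaming ingestion via AWS Kinesis/MSK among the expected data-consumption patterns. Below is a minimal, hedged boto3 sketch that publishes a JSON event to a Kinesis data stream; the stream name, region, and record shape are assumptions, and credentials are expected to come from the environment (for example, an IAM role).

```python
# Minimal sketch of streaming ingestion into AWS Kinesis with boto3.
# Stream name, region, and event fields are illustrative placeholders.
import json
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # placeholder region


def publish_event(event: dict) -> None:
    """Push a single JSON event onto a Kinesis data stream."""
    kinesis.put_record(
        StreamName="telemetry-events",           # placeholder stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["account_id"]),   # keeps one account's events ordered
    )


if __name__ == "__main__":
    publish_event({"account_id": 42, "action": "login", "ts": time.time()})
```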
Posted 1 week ago
The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the querying domain, a typical career progression runs from Junior Querying Analyst to Querying Specialist, then Senior Querying Consultant, Querying Team Lead, and finally Querying Manager.
Apart from strong querying skills, professionals in this field are often expected to have expertise in database management, data visualization tools, SQL optimization techniques, and data warehousing concepts.
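To make the skills above concrete, here is a small, self-contained Python example of the join-plus-aggregation querying these roles revolve around, using the standard-library sqlite3 module. The tables, columns, and data are invented for illustration, and EXPLAIN QUERY PLAN is shown only as a simple way to check whether an index is being used.

```python
# Minimal sketch of a join + aggregation and a quick look at the query plan,
# using Python's built-in sqlite3 module; tables and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "South"), (2, "North")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 120.0), (2, 1, 80.0), (3, 2, 200.0)])

query = """
    SELECT c.region, COUNT(o.id) AS order_count, SUM(o.amount) AS revenue
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.region
"""
for row in conn.execute(query):
    print(row)  # e.g., ('South', 2, 200.0)

# EXPLAIN QUERY PLAN shows whether the join can use idx_orders_customer.
for step in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(step)
```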
As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!