2.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
Candidates must have a strong background in data warehousing concepts, ETL development, data modeling, metadata management, and data quality, along with hands-on Snowflake, SQL, and Python skills.
Implement ETL pipelines within a data lake using Python and Snowflake's SnowSQL and Snowpipe (see the sketch after this posting).
Write and debug complex SQL stored procedures.
Administer Snowflake, including creating user accounts and security groups and loading data from a data lake/S3.
Query Snowflake using SQL.
Develop scripts using Unix, Python, etc. for loading, extracting, and transforming data.
Assist with production issues in the Enterprise Data Lake, such as reloading data, transformations, and translations.
Develop database and reporting designs based on business intelligence and reporting requirements.

Mandatory Skill Sets: Snowflake, SQL, Python
Preferred Skill Sets: Snowflake, SQL, Python
Years of Experience Required: 2-4 years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Fields of Study Required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Required Skills: Snowflake Schema
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 7 more}
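For context on the Snowpipe/COPY INTO workflow named above, here is a minimal sketch of a staged-file load driven from Python with the snowflake-connector-python package; the connection parameters, the @raw_stage stage, and the SALES_RAW table are hypothetical placeholders, not details from the posting.

```python
# Hedged sketch of a data-lake-to-Snowflake load step.
# All names (stage, table, warehouse) are illustrative assumptions.
import snowflake.connector

def load_from_stage(account: str, user: str, password: str) -> None:
    conn = snowflake.connector.connect(
        account=account,
        user=user,
        password=password,
        warehouse="ETL_WH",    # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO ingests files already staged from the data lake/S3;
        # Snowpipe automates the same statement when new files arrive.
        cur.execute("""
            COPY INTO SALES_RAW
            FROM @raw_stage/sales/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'CONTINUE'
        """)
        print(cur.fetchall())  # per-file load status rows
    finally:
        conn.close()
```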
Posted 1 week ago
5.0 years
1 - 10 Lacs
Hyderābād
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking Team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
Contributes to software engineering communities of practice and events that explore new and emerging technologies
Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
Formal training or certification on software engineering concepts and 5+ years of applied experience in Java, AWS, and Spring Boot
Hands-on practical experience in system design, application development, testing, and operational stability
Proficiency in coding in one or more languages
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Overall knowledge of the Software Development Life Cycle
Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)

Preferred qualifications, capabilities, and skills:
Familiarity with modern front-end technologies
Exposure to cloud technologies
Posted 1 week ago
5.0 years
10 - 12 Lacs
Hyderābād
On-site
SQL Developer with SSIS (ETL Developer)
Location: Hyderabad (Hybrid Model)
Experience Required: 5+ Years
Joining Timeline: Immediate to 20 Days
Role Type: Individual Contributor (IC)

Position Summary: We are seeking a skilled SQL Developer with strong SSIS expertise to join a dynamic team supporting a leading US-based banking client. This is a hybrid role based in Hyderabad, suited for professionals experienced in building scalable, auditable ETL pipelines and collaborating within Agile teams.

Must-Have Skills:
SQL Development - Expert in writing complex T-SQL queries, stored procedures, joins, and transactions. Proficient in handling error logging and audit logic for production-grade environments (see the sketch after this posting).
ETL using SSIS - Strong experience in designing, implementing, and debugging SSIS packages using components like script tasks, event handlers, and nested packages.
Batch Integration - Hands-on experience in managing high-volume batch data ingestion from various sources using SSIS, with performance and SLA considerations.
Agile Delivery - Actively contributed to Agile/Scrum teams; participated in sprint planning, code reviews, and demos, and met sprint commitments.
Stakeholder Collaboration - Proficient in engaging with business/product owners for requirement gathering, transformation validation, and output review. Excellent communication skills required.

Key Responsibilities:
Design and develop robust, auditable SSIS workflows based on business and data requirements.
Ensure efficient deployment and maintenance using CI/CD tools like Jenkins or UCD.
Collaborate with stakeholders to align solutions with business needs and data governance standards.
Maintain and optimize SQL/SSIS packages for production environments, ensuring traceability, performance, and error handling.

Nice-to-Have Skills:
Cloud ETL (ADF) - Exposure to Azure Data Factory or equivalent ETL tools.
CI/CD (Jenkins/UCD) - Familiarity with DevOps deployment tools and pipelines.
Big Data (Spark/Hadoop) - Understanding of, or integration experience with, big data systems.
Other RDBMS (Oracle/Teradata) - Experience in querying and integrating data from additional platforms.

Job Types: Full-time, Permanent
Pay: ₹1,000,000.00 - ₹1,200,000.00 per year
Benefits: Health insurance, paid sick time
Schedule: Day shift, Monday to Friday
Supplemental Pay: Performance bonus
Experience: SQL: 5 years (Preferred); ETL using SSIS: 5 years (Preferred); CI/CD (Jenkins, UCD): 5 years (Preferred)
Work Location: In person
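As a rough illustration of the error-logging and audit pattern the must-have skills describe, the sketch below wraps a stored-procedure batch call in a transaction and records the outcome in an audit table via pyodbc; dbo.sp_load_batch, etl.audit_log, and the connection string are hypothetical names, not details from the posting.

```python
# Hedged sketch of an auditable batch step; procedure, audit table,
# and connection string are illustrative assumptions.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=etl;Trusted_Connection=yes"
)

def run_batch(batch_id: int) -> None:
    conn = pyodbc.connect(CONN_STR, autocommit=False)
    try:
        cur = conn.cursor()
        try:
            cur.execute("{CALL dbo.sp_load_batch (?)}", batch_id)
            cur.execute(
                "INSERT INTO etl.audit_log (batch_id, status) VALUES (?, ?)",
                batch_id, "SUCCESS",
            )
            conn.commit()
        except pyodbc.Error as exc:
            conn.rollback()  # undo the partial batch, keep an audit row
            cur.execute(
                "INSERT INTO etl.audit_log (batch_id, status, message) "
                "VALUES (?, ?, ?)",
                batch_id, "FAILED", str(exc),
            )
            conn.commit()
            raise
    finally:
        conn.close()
```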
Posted 1 week ago
3.0 years
6 - 9 Lacs
Hyderābād
On-site
Key Responsibilities:
Design, construct, and maintain efficient and reliable data pipelines to support both traditional data warehousing and real-time data processing needs.
Implement ETL (Extract, Transform, Load) processes and data workflows that are scalable and optimized for performance.
Manage data lakes, data warehouses, and databases, ensuring data quality, consistency, and security.
Index and catalog unstructured data to enable advanced analytics and machine learning capabilities.
Administer vector databases for AI/ML applications, optimizing for query performance and scalability (a sketch of the underlying lookup follows this posting).
Securely integrate REST APIs and external data sources into our systems, maintaining compliance with data governance standards.
Work with AI/ML teams to leverage OpenAI, Azure, and other LLM endpoints, ensuring seamless data integration and pipeline compatibility.
Contribute to the creation of synthetic data sets for robust model training, employing data anonymization and augmentation techniques where necessary.
Utilize prompt engineering to refine the inputs and outputs of generative AI models, enhancing their utility and accuracy.
Monitor, troubleshoot, and optimize data systems to ensure high availability and minimal downtime.
Research and implement best practices in data engineering, staying abreast of new technologies and methodologies in the field.

Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
Minimum of 3 years of experience in data engineering, with a proven track record of successful data pipeline development.
Expertise in SQL and NoSQL databases, data modeling, and data warehousing solutions.
Proficiency in programming languages such as Python and scripting languages for automation.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and understanding of distributed computing principles.
Familiarity with vector databases and their role in supporting AI/ML workloads.
Demonstrated experience with REST API development and consumption, with a focus on security and data protection.
Practical knowledge of cloud services and infrastructure as they relate to data engineering (e.g., AWS, Azure, GCP).
Understanding of machine learning concepts and experience with LLMs is a plus.
Excellent analytical and problem-solving abilities, with meticulous attention to detail.
Strong communication and collaboration skills, with the ability to work effectively in a team environment.

Additional Desirable Skills:
Certifications in cloud platforms, big data technologies, or related fields.
Experience with data governance, compliance, and privacy regulations.
Familiarity with DevOps practices and tools for data operations.
Active engagement with the data engineering community through blogs, talks, or open-source contributions.

Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Strong experience with Python and Node.js in a production environment.
Proficiency in working with vector embeddings and understanding their role in AI applications.
Experience with Elastic or similar databases for storing and querying large datasets.
Familiarity with AI frameworks and libraries, particularly those related to NLP and chatbot development.
Solid understanding of RESTful API design and development.
Knowledge of cloud services and deployment (AWS, Azure, GCP).
Proficiency in using version control systems like Git.
Prior experience building conversational AI or chatbots.
Understanding of machine learning operations (MLOps) and deployment of AI models.
Experience with containerization and orchestration tools (Docker, Kubernetes).
Familiarity with CI/CD pipelines and automated testing frameworks.
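The vector-database requirement above boils down to nearest-neighbor search over embeddings. A minimal NumPy sketch of that lookup follows; real systems delegate it to a dedicated store with approximate indexes, and the 384-dimension stand-in embeddings here are purely illustrative.

```python
# Hedged sketch of cosine-similarity top-k retrieval; a vector
# database performs this at scale with approximate indexes.
import numpy as np

def cosine_top_k(query_vec: np.ndarray, index: np.ndarray, k: int = 5) -> np.ndarray:
    # Normalize rows so a dot product equals cosine similarity.
    index_norm = index / np.linalg.norm(index, axis=1, keepdims=True)
    q = query_vec / np.linalg.norm(query_vec)
    scores = index_norm @ q
    return np.argsort(scores)[::-1][:k]  # indices of the best matches

# Usage with random stand-in embeddings (e.g., 384-dim vectors):
docs = np.random.rand(1000, 384).astype("float32")
query = np.random.rand(384).astype("float32")
print(cosine_top_k(query, docs))
```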
Posted 1 week ago
2.0 years
3 - 9 Lacs
Mumbai
On-site
You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within the Consumer & Community Banking Rewards Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities:
Executes standard software solutions, design, development, and technical troubleshooting
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
Designs and develops data pipelines end to end using PySpark, Java, Python, and AWS services (a pipeline sketch follows this posting)
Utilizes container orchestration services, including Kubernetes, and a variety of AWS tools and services
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
Formal training or certification on software engineering concepts and 2 years of applied experience
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Hands-on practical experience in developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components like Spark and Spark Streaming
Proficiency in coding in one or more languages: Core Java, Python, and PySpark
Experience with relational and data warehouse databases
Cloud implementation experience with AWS, including: AWS data services (proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, and Airflow or Lambda + Step Functions + EventBridge); data de/serialization (expertise in at least two of the formats Parquet, Iceberg, Avro, JSON); AWS data security (good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager)
Proficiency in automation and continuous delivery methods

Preferred qualifications, capabilities, and skills:
Experience with Snowflake is nice to have
Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
In-depth knowledge of the financial services industry and its IT systems
Practical cloud-native experience, preferably AWS
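To make the pipeline bullet concrete, here is a minimal PySpark sketch of the read-clean-write pattern such a role typically involves; the S3 bucket names and columns are hypothetical, not taken from the posting.

```python
# Hedged sketch of an end-to-end ETL step with PySpark on AWS;
# paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rewards-etl").getOrCreate()

orders = spark.read.json("s3://example-raw/orders/")  # raw landing zone
cleaned = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
)
# Partitioned Parquet is a common curated-lake layout; Glue Catalog
# and Athena can then register and query it.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/orders/"
)
```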
Posted 1 week ago
3.0 - 6.0 years
3 - 4 Lacs
Bengaluru
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: We are seeking a passionate data analyst to transform data into actionable insights and support decision-making in a global organization focused on pricing and commercial strategy. This role spans business analysis, requirements gathering, data modeling, solution design, and visualization using modern tools. The analyst will also maintain and improve existing analytics solutions, interpret complex datasets, and communicate findings clearly to both technical and non-technical audiences.

Essential Functions of the Job:
Analyze and interpret structured and unstructured data using statistical and quantitative methods to generate actionable insights and ongoing reports.
Design and implement data pipelines and processes for data cleaning, transformation, modeling, and visualization using tools such as Power BI, SQL, and Python.
Collaborate with stakeholders to define requirements, prioritize business needs, and translate problems into analytical solutions.
Develop, maintain, and enhance scalable analytics solutions and dashboards that support pricing strategy and commercial decision-making.
Identify opportunities for process improvement and operational efficiency through data-driven recommendations.
Communicate complex findings in a clear, compelling, and actionable manner to both technical and non-technical audiences.

Analytical/Decision Making Responsibilities:
Apply a hypothesis-driven approach to analyzing ambiguous or complex data and synthesizing insights to guide strategic decisions.
Promote adoption of best practices in data analysis, modeling, and visualization, while tailoring approaches to meet the unique needs of each project.
Tackle analytical challenges with creativity and rigor, balancing innovative thinking with practical problem-solving across varied business domains.
Prioritize work based on business impact and deliver timely, high-quality results in fast-paced environments with evolving business needs.
Demonstrate sound judgement in selecting methods, tools, and data sources to support business objectives.

Knowledge and Skills Requirements:
Proven experience as a data analyst, business analyst, data engineer, or similar role.
Strong analytical skills with the ability to collect, organize, analyze, and present large datasets accurately.
Foundational knowledge of statistics, including concepts like distributions, variance, and correlation.
Skilled in documenting processes and presenting findings to both technical and non-technical audiences.
Hands-on experience with Power BI for designing, developing, and maintaining analytics solutions.
Proficient in both Python and SQL, with strong programming and scripting skills.
Skilled in using Pandas, T-SQL, and Power Query M for querying, transforming, and cleaning data (a short Pandas example follows this posting).
Hands-on experience in data modeling for both transactional (OLTP) and analytical (OLAP) database systems.
Strong visualization skills using Power BI and Python libraries such as Matplotlib and Seaborn.
Experience with defining and designing KPIs and aligning data insights with business goals.

Additional/Optional Knowledge and Skills:
Experience with the Microsoft Fabric data analytics environment.
Proficiency in using the Apache Spark distributed analytics engine, particularly via PySpark and Spark SQL.
Exposure to implementing machine learning or AI solutions in a business context.
Familiarity with Python machine learning libraries such as scikit-learn, XGBoost, PyTorch, or transformers.
Experience with Power Platform tools (Power Apps, Power Automate, Dataverse, Copilot Studio, AI Builder).
Knowledge of pricing, commercial strategy, or competitive intelligence.
Experience with cloud-based data services, particularly in the Azure ecosystem (e.g., Azure Synapse Analytics or Azure Machine Learning).

Supervision Responsibilities:
Operates with a high degree of independence and autonomy.
Collaborates closely with cross-functional teams including sales, pricing, and commercial strategy.
Mentors junior team members, helping develop technical skills and business domain knowledge.

Other Requirements:
Collaborates with a team operating primarily in the Eastern Time Zone (UTC−04:00/−05:00).
Limited travel may be required for this role.

Job Requirements:
Education: A bachelor's degree in a STEM field relevant to data analysis, data engineering, or data science is required. Examples include (but are not limited to) computer science, statistics, data analytics, artificial intelligence, operations research, or econometrics.
Experience: 3-6 years of experience in data analysis, data engineering, or a closely related field, ideally within a professional services environment.
Certification Requirements: No certifications are required for this role.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
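As a small illustration of the Pandas/Matplotlib workflow listed above, the sketch below cleans a pricing extract and plots a monthly trend; the file name and columns are hypothetical.

```python
# Hedged sketch of a clean-aggregate-visualize loop in pandas;
# the CSV path and columns are illustrative assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("pricing_data.csv", parse_dates=["quote_date"])
df = df.dropna(subset=["price"]).query("price > 0")  # basic cleaning

# Median monthly price as a simple pricing-trend KPI.
monthly = df.groupby(df["quote_date"].dt.to_period("M"))["price"].median()

fig, ax = plt.subplots()
monthly.plot(ax=ax)
ax.set_xlabel("Month")
ax.set_ylabel("Median quoted price")
fig.tight_layout()
fig.savefig("price_trend.png")
```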
Posted 1 week ago
2.0 years
7 - 9 Lacs
Bengaluru
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY – Global Delivery Services (GDS) – Consulting – People Consulting (PC) – Work Force Management (WFM) – Consultant

Managing the global workforce in today's fast-changing and highly disruptive environment is becoming increasingly complex. As a member of our PC practice, you'll be part of a team that supports clients in aligning their HR function with the organizational plans, while keeping employee experience as one of the core considerations. When you join us, you will gain cross-functional, multi-industry and truly global work experience.

The opportunity: We are looking for a Consultant with expertise in WFM to join the PC team. This is a fantastic opportunity to be part of a leading global professional services organisation whilst being instrumental in the growth of the PC team.

Key responsibilities:
Support client projects leveraging deep knowledge and understanding of Time Management, Attendance, Leave Management, Labour Scheduling and other components of Workforce Management processes and systems
Work on client projects as part of a globally distributed team
Ensure high-quality deliverables are produced for the project with exhaustive internal reviews, and obtain excellent feedback from the client and global project counterparts
Participate in full project life cycle activities (Discovery, Design, Configuration, Build, Testing, Knowledge Transfer, Migration and Post-production Support)
Support development of thought leadership, collateral, tools, techniques and methodologies to build and enhance Workforce Management service offerings within the practice
Manage and support EY initiatives within the practice
Support effective client communication and cadence, and build relations with client and project team counterparts across global locations

Skills and attributes for success:
Integrity and commitment to work in a new and challenging environment
Ability to manage ambiguity and to be proactive
Strong communication and presentation skills
Cross-cultural awareness and sensitivity
High energy, agility and adaptability
Ability to maintain a positive attitude towards receiving feedback and ongoing training
Open to travel for projects that are approved per EY and country-specific travel advisories

To qualify for the role, you must have:
2-4 years of experience in Workforce Management (time management, attendance, scheduling, etc.)
Experience in configuring Kronos Workforce Central modules (Timekeeping, Accruals, Attendance, Attestation, etc.) on v7 or above
Experience with Kronos Workforce Integration Manager (WIM), including designing, building and maintaining integrations
Knowledge of relational databases and database querying, with the ability to write complex SQL statements; MS SQL/Oracle
Proven ability to apply leading practices to the software development life cycle based on experience with agile and blended approaches
Experience with file transfer protocols, e.g. FTP, sFTP, EDI, etc. (a short sFTP sketch follows this posting)
Knowledge of data manipulation and transformation, with the ability to handle different formats of data including flat files, csv/txt, lookup tables, etc.
Overall knowledge of how integrations interact with the Kronos Workforce Central suite of software
Understanding of business requirements and translating them into technical requirements
Knowledge and experience of end-to-end Workforce Management processes
Experience in process mapping and documentation using Visio and other tools

Ideally, you may also have:
Certification in Dell Boomi and Workforce Dimensions
Experience with Microsoft Visual Studio, RDL, and/or SSRS for reports
Experience working on other WFM products, e.g. Ceridian, ADP E-time or any other WFM product
Experience working on policy, process and design of the overall WFM solution
Knowledge of payroll

What we look for:
Knowledge and experience of working in a cross-cultural setup
Strong desire to learn, demonstrated through examples of change management deliverables

What working at EY offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
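For the file-transfer requirement above, here is a minimal Python sketch of an sFTP pull and flat-file parse using paramiko; the host, path, and file layout are hypothetical, and in practice a Kronos WIM integration would typically own this step.

```python
# Hedged sketch of an sFTP pull plus flat-file normalization;
# host, credentials, and paths are illustrative assumptions.
import csv
import paramiko

def fetch_punch_file(host: str, user: str, password: str) -> list[dict]:
    transport = paramiko.Transport((host, 22))
    transport.connect(username=user, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        sftp.get("/outbound/punches.csv", "punches.csv")  # remote -> local
    finally:
        sftp.close()
        transport.close()
    # Normalize the flat file into dicts for downstream loading.
    with open("punches.csv", newline="") as fh:
        return list(csv.DictReader(fh))
```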
Posted 1 week ago
2.0 years
10 Lacs
Chennai
On-site
Job Description: Software Engineer, Chennai

Our NielsenIQ Technology teams are working on revamping multiple platforms into a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on NielsenIQ's data and insights to innovate and grow. As a Python Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and prototyping cutting-edge technologies. As a market research company, we have many data analytics and machine learning requirements implemented across various applications. Our CDAR platform has various use cases incorporating AI/ML and data analytics. Our team is co-located and agile, with central technology hubs in Chicago, Madrid, Toronto, Chennai and Pune.

Responsibilities:
Understand user needs and how they fit into the overall, global solution design
Prototype new features and integrations aligned to business strategy by introducing innovation through technology
Follow source-control and test-driven development best practices
Troubleshoot and identify root causes while resolving issues
Write complex, maintainable code to develop scalable, flexible, and user-friendly applications
Import/collect, clean, convert and analyze data for the purpose of finding insights and drawing conclusions
Train models, fine-tune parameters for maximum efficiency, and deploy models (a small scikit-learn sketch follows this posting)
Actively participate in building algorithms for solving complex problems with design and development
Take ownership of projects and ensure timely deliveries
Collaborate with diverse teams across time zones

Qualifications:
Minimum of 2 years of experience in large-scale production systems and in languages such as Python/R
Minimum B.S. degree in Computer Science, Computer Engineering or a related field with a focus in machine learning
Strong software engineering skills and understanding of the ML lifecycle, with a minimum of 2 years' experience in ML production systems and in software development
Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas
Fluent in processing data with pandas (e.g., querying, transforming, joining, cleaning, etc.), including experience debugging logic and performance issues
Strong understanding of machine learning algorithms, with experience writing, debugging, and optimizing ML data structures, pipelines, and transformations
Knowledge of statistics, probability, or a related discipline
Extensive data modelling and data architecture skills
Strong knowledge of version control tools, preferably Bitbucket
Basic knowledge of Linux/Unix environments (basic commands, shell scripting, etc.)
Demonstrated ability to work as part of a global team
Strong troubleshooting and problem-solving skills
Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business

Preferred Qualifications:
Bachelor's degree or equivalent in Computer Science or a related field with a focus in machine learning
Experience using collaboration technologies: Azure DevOps, TFS, Jira, Confluence
Experience using the Atlassian tool suite, including Jira, Confluence, Bitbucket

Additional Information - Our Benefits:
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ: NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth.
In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
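As a small illustration of the train/tune/deploy responsibilities above, here is a scikit-learn sketch on synthetic stand-in data; nothing in it is specific to NIQ's systems.

```python
# Hedged sketch of a train-and-tune loop with scikit-learn;
# synthetic data stands in for real features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
# Fine-tune the regularization strength with cross-validation.
search = GridSearchCV(pipe, {"clf__C": [0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)
print(search.best_params_, search.score(X_test, y_test))
```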
Posted 1 week ago
1.0 years
0 - 0 Lacs
India
On-site
Experienced Laravel Developer (West Bengal candidates only)

Job description:
· Discussing project aims with the client and development team.
· Designing and building web applications using Laravel.
· Troubleshooting issues in the implementation and debugging builds.
· Working with back-end developers on projects.
· Testing functionality for users and the backend.
· Ensuring that integrations run smoothly.
· Scaling projects based on client feedback.
· Recording and reporting on work done in Laravel.
· Maintaining web-based applications.
· Presenting work in meetings with clients and management.

Laravel Developer Requirements:
· A degree in programming, computer science, or a related field.
· Experience working with PHP, performing unit testing, and managing APIs such as REST.
· A solid understanding of application design using Laravel.
· Knowledge of database design and querying using SQL.
· Proficiency in HTML and JavaScript.
· A portfolio of applications and programs to your name.
· Problem-solving skills and a critical mindset.
· Great communication skills.
· The desire and ability to learn.
· Server maintenance skills.

Required skill sets:
· A minimum of 1-2 years of experience
· Experience with PHP 7+ and Laravel 6+
· Experience with React/Angular/Node will be an added advantage.
· Hands-on project experience in the Laravel framework.
· Good knowledge of jQuery, Ajax, REST APIs
· Proficiency in SQL scripting and MySQL 5.x. An advanced level of SQL (stored procedures/triggers/functions) will be an added advantage.
· Good understanding of object-oriented principles and MVC design patterns.

Role: Laravel Developer
Industry Type: IT Services
Education: Postgraduate or graduate (BCA, MCA, BTech, BSc, etc.)
Salary: Rs 14,000 to Rs 20,000 per month
Email id: talentacquisition@devantitsolutions.com
Contact Number: 7605083834
Immediate joining.

Job Type: Full-time
Pay: ₹14,000.00 - ₹20,000.00 per month
Schedule: Day shift, fixed shift
Supplemental Pay: Yearly bonus
Experience: Laravel: 2 years (Preferred)
Language: Bengali (Preferred), English (Preferred)
Work Location: In person
Application Deadline: 17/06/2025
Expected Start Date: 18/06/2025
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: We are seeking a skilled and motivated Software Engineer with strong experience in Google Cloud Platform (GCP) and Python programming. In this role, you will be responsible for designing, developing, and maintaining scalable and reliable cloud-based solutions, data pipelines, or applications on GCP, leveraging Python for scripting, automation, data processing, and service integration.

Responsibilities:
Design, implement, and manage scalable, secure, and reliable infrastructure on Google Cloud Platform (GCP) using Infrastructure as Code (IaC) principles, primarily with Terraform.
Develop and manage APIs or backend services in Python deployed on GCP services like Cloud Run functions, App Engine, or GKE (a minimal service sketch follows this posting).
Build and maintain robust CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitHub) to enable frequent and reliable application deployments.
Work closely with software development teams to understand application requirements and translate them into cloud-native solutions on GCP.
Implement and manage monitoring, logging, and alerting solutions (e.g., Cloud Monitoring, Prometheus, Grafana, Cloud Logging) to ensure system health and performance.
Implement and enforce security best practices within the GCP environment (e.g., IAM policies, network security groups, security scanning).
Troubleshoot and resolve production issues across various services (applications) and infrastructure components (GCP).
Work closely with product managers and business stakeholders to understand business needs and the associated systems requirements for customizations required in the SaaS solution.
Run and protect the SaaS solution in the AWS environment and troubleshoot production issues.
Actively participate in all team agile ceremonies; manage daily deliverables in Rally with proper user stories and acceptance criteria.
Provide input to product governance communications.

Qualifications:
Bachelor's degree in computer science or engineering
3+ years of software development and support experience, including analysis, design, and testing
Domain experience within automotive, manufacturing and supply chain
Strong proficiency in Python programming, including experience with standard libraries and popular frameworks/libraries (e.g., Pandas, NumPy, FastAPI, Flask, Django, Scikit-learn, TensorFlow/PyTorch, depending on the role)
Hands-on experience designing, deploying, and managing resources and services on Google Cloud Platform (GCP)
Familiarity with database querying (SQL) and understanding of database concepts
Understanding of cloud architecture principles, including scalability, reliability, and security
Proven experience working effectively within an Agile development or operations team (e.g., Scrum, Kanban)
Experience using incident tracking and project management tools (e.g., Jira, ServiceNow, Azure DevOps)
Excellent verbal and written communication skills, with the ability to explain technical issues clearly to both technical and non-technical audiences
Excellent teamwork and organizational skills; ability to solve complex problems in a global environment
Ability to multi-task effectively and prioritize work based on business impact and urgency

Nice-to-Have Skills:
GCP certifications (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer, Professional Cloud Architect)
Experience with other cloud providers (AWS, Azure)
Experience with containerization (Docker) and orchestration (Kubernetes)
Experience with database administration (e.g., PostgreSQL, MySQL)
Familiarity with security best practices and tools in a cloud environment (DevSecOps)
Experience with serverless technologies beyond Cloud Functions/Run
Contribution to open-source projects
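To ground the Cloud Run bullet above, here is a minimal Flask service of the shape GCP's serverless runtimes expect (listening on the injected PORT); the routes and payloads are hypothetical.

```python
# Hedged sketch of a small Python backend for Cloud Run;
# routes and payloads are illustrative assumptions.
import os
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/healthz")
def healthz():
    return jsonify(status="ok")

@app.route("/echo", methods=["POST"])
def echo():
    # Echo back the posted JSON body, if any.
    return jsonify(received=request.get_json(silent=True))

if __name__ == "__main__":
    # Cloud Run injects PORT; default to 8080 for local runs.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```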
Posted 1 week ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Senior Engineer, Data Modeling
Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing - What will your essential responsibilities include?
Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
Understand current and future data consumption patterns and architecture (granular level), and partner with Architects to ensure optimal design of data layers.
Apply best practices in data architecture, for example: balance between materialization and virtualization, optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology versus business benefit and the implications for data consumers.
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
Design prototypes and work in a fast-paced iterative solution delivery model.
Design, develop and maintain ETL pipelines using PySpark in Azure Databricks using delta tables (a minimal Delta upsert sketch follows this posting). Use Harness for the deployment pipeline.
Monitor performance of ETL jobs, resolve any issues that arise and improve performance metrics as needed.
Diagnose system performance issues related to data processing and implement solutions to address them.
Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
Maintain integrity and quality across all pipelines and environments.
Understand and follow secure coding practices to make sure code is not vulnerable.
You will report to the Application Manager.

What You Will Bring - We're looking for someone who has these abilities and skills:

Required Skills and Abilities:
Effective communication skills.
Bachelor's degree in computer science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.
Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
Relevant years of programming experience using Databricks.
Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse and ADLS).
Solid knowledge of network and firewall concepts.
Solid experience writing, optimizing and analyzing SQL.
Relevant years of experience with Python.
Ability to break down complex data requirements and architect solutions into achievable targets.
Robust familiarity with Software Development Life Cycle (SDLC) processes and workflows, especially Agile.
Experience using Harness.
Technical lead responsible for both individual and team deliveries.

Desired Skills and Abilities:
Worked on big data migration projects.
Worked on performance tuning at both database and big data platforms.
Ability to interpret complex data requirements and architect solutions.
Distinctive problem-solving and analytical skills combined with robust business acumen.
Excellent grasp of Parquet and Delta file basics.
Effective knowledge of the Azure cloud computing platform.
Familiarity with reporting software; Power BI is a plus.
Familiarity with DBT is a plus.
Passion for data and experience working within a data-driven organization. You care about what you do, and what we do.

Who We Are: AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We Offer:

Inclusion: AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 chapters around the globe
Robust support for flexible working arrangements
Enhanced family-friendly leave benefits
Named to the Diversity Best Practices Index
Signatory to the UK Women in Finance Charter
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards: AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability: At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars:
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
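To make the Databricks/delta-table responsibility concrete, below is a minimal PySpark upsert sketch using the Delta Lake merge API; it assumes a Databricks notebook where `spark` is predefined, and the table, path, and key column are hypothetical.

```python
# Hedged sketch of a Delta Lake upsert in Azure Databricks;
# `spark` is assumed to exist (as in a notebook), and all
# table/path/column names are illustrative assumptions.
from delta.tables import DeltaTable

updates = spark.read.parquet("/mnt/landing/policies/")  # new arrivals

target = DeltaTable.forName(spark, "silver.policies")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.policy_id = u.policy_id")
    .whenMatchedUpdateAll()      # refresh existing policies
    .whenNotMatchedInsertAll()   # add new ones
    .execute()
)
```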
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job posting will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities - What your main responsibilities are:
Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data.
Data Quality Management - Cleanse data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms (a small data-quality sketch follows this section).
Data Transformation - Process data by cleansing it and transforming it into a proper storage structure for the purpose of querying and analysis using ETL and ELT processes.
Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications:
Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent.
Strong programming skills in Python/PySpark/SAS.
Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization.
Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps.
Hands-on experience with Databricks, Delta Lake, Workflows.
Knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus.
Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs.
Experience with a BI tool like Power BI (good to have).
Cloud migration experience (good to have).
Cloud and data engineering certification (good to have).
Experience working in an Agile environment.
4-6 years of relevant work experience is required.
Experience with stakeholder management is an added advantage.

What We Are Looking For:
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.
Knowledge, Skills and Abilities: Fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills; data modeling and database design; ETL (Extract, Transform, Load) skills; programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
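As a small illustration of the data-quality-management accountability above, the sketch below computes a simple quality report with pandas; the shipment columns and rules are hypothetical.

```python
# Hedged sketch of a pre-load data-quality gate; column names
# and rules are illustrative assumptions.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    return {
        "rows": len(df),
        "duplicate_keys": int(df["shipment_id"].duplicated().sum()),
        "null_rate": df.isna().mean().round(3).to_dict(),
    }

df = pd.DataFrame({
    "shipment_id": [1, 2, 2, 4],
    "weight_kg": [1.2, None, 3.4, 0.9],
})
report = quality_report(df)
assert report["duplicate_keys"] == 1  # the repeated key is caught
print(report)
```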
Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network thanks to our outstanding team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle: we return these profits back into the business and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models, and systems for capacity planning and the transportation and fulfillment network? If so, then this is the job for you.

Our team is responsible for creating core analytics tech capabilities, platform development, and data engineering. We develop scalable analytics applications and research models to optimize operational processes. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, scientists, business intelligence engineers, and product managers, using rigorous quantitative approaches to ensure high-quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore, and the Middle East.

Amazon is growing rapidly, and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building the models and tools that improve the economics of Amazon's worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans. You will also improve the efficiency of capital investment by helping fulfillment centers improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools.

Major Responsibilities Include
- Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; producing the required data when it is not available.
- Applying statistical and machine learning methods to specific business problems and data (a minimal sketch follows this posting).
- Creating global standard metrics across regions and performing benchmark analyses.
- Ensuring data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground-truth generation, normalization, transformation, and cross-lingual alignment/mapping.
- Communicating proposals and results clearly, backed by data and coupled with actionable conclusions, to drive business decisions.
- Collaborating with colleagues from multidisciplinary science, engineering, and business backgrounds.
- Developing efficient data querying and modeling infrastructure.
- Managing your own process: prioritizing and executing high-impact projects, triaging external requests, and delivering projects on time.
- Using code (Python, R, Scala, etc.) to analyze data and build statistical models.

Basic Qualifications
- 2+ years of data scientist experience.
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, MATLAB).
- 3+ years of experience with machine learning/statistical modeling, data analysis tools and techniques, and the parameters that affect their performance.
- Experience applying theoretical models in an applied environment.

Preferred Qualifications
- Experience in Python, Perl, or another scripting language.
- Experience in an ML or data scientist role with a large technology company.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company: ATSPL - Telangana
Job ID: A3003398
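As flagged in the responsibilities above, here is a minimal scikit-learn sketch of the statistical-modeling workflow the posting describes. The features and cost target are invented placeholders on synthetic data, not Amazon data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: predicting a fulfillment cost from two
# hypothetical features (package volume, route distance).
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(500, 2))
y = 3.0 * X[:, 0] + 5.0 * X[:, 1] + rng.normal(0, 0.1, 500)

# Hold out a test set, fit a simple baseline model, and report error.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```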
Posted 1 week ago
0 years
0 Lacs
India
On-site
Google BigQuery, Apache Airflow (Must Have)
3. Cloud Platforms: GCP (primary), AWS (optional), GCS/S3 (Must Have)
4. ETL & Pipelines: ETL/ELT concepts, scalable pipeline design, automation, DAG scheduling using Airflow (Must Have)
5. Programming & Querying: Python, Web API development (Must Have)
6. Data Integration & Migration: Salesforce, MySQL, SQL Server, SFTP, SharePoint, Email integrations (Must Have)
7. Data Security & Encryption: End-to-end encryption, secure data exports/imports (Must Have)
8. Version Control & CI/CD: GitLab, Bitbucket, CI/CD pipelines (Good to Have)
9. Business Intelligence Tools: DOMO, Looker, Tableau (Good to Have)
10. Object-Oriented & Java Basics: Basic OOP concepts, Java familiarity (Good to Have)
11. Soft Skills: Communication, cross-functional collaboration, detail orientation (Must Have)
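For the "DAG scheduling using Airflow" item above, here is a minimal hedged sketch of an Airflow DAG; the DAG id, schedule, and task bodies are hypothetical placeholders, not this employer's pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies standing in for real extract/load logic.
def extract():
    print("pull rows from the source system")

def load():
    print("write transformed rows to BigQuery")

with DAG(
    dag_id="example_elt_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # extract must finish before load
```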
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Our company is seeking a seasoned Senior Data Engineer to join our dynamic team. In this role, you will concentrate on data integration and ETL projects for cloud-based systems. Your primary responsibilities will be to design and implement sophisticated data solutions while ensuring data accuracy, reliability, and accessibility.

Responsibilities
- Design and implement sophisticated data solutions for cloud-based systems.
- Develop ETL workflows using SQL, Python, and other relevant technologies (see the sketch below).
- Maintain data accuracy, reliability, and accessibility for all stakeholders.
- Collaborate with cross-functional teams to assess data integration needs and specifications.
- Create and maintain documentation, such as technical specifications, data flow diagrams, and data mappings.
- Monitor and refine data integration processes to boost performance and efficiency while ensuring data accuracy and integrity.

Requirements
- Bachelor's degree in Computer Science, Electrical Engineering, or a related field.
- 5-8 years of experience in data engineering.
- Experience using cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow.
- Strong knowledge of SQL for data querying and manipulation.
- Proficiency in Snowflake for data warehousing.
- Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing.
- Excellent problem-solving skills and attention to detail.
- Good oral and written communication skills in English at a B2 level.

Nice to have
- Proficiency in ETL using Python.
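A minimal sketch of the SQL-plus-Python ETL workflow referenced in the Responsibilities above; pandas and the standard-library sqlite3 module stand in for a cloud warehouse such as Snowflake, and the file, table, and column names are invented.

```python
import sqlite3

import pandas as pd

# sqlite3 stands in for a cloud warehouse; the CSV path, table name, and
# columns are hypothetical placeholders.
conn = sqlite3.connect("warehouse.db")

# Extract
orders = pd.read_csv("orders.csv", parse_dates=["order_ts"])

# Transform: basic cleansing and a derived column.
orders = orders.dropna(subset=["order_id"]).drop_duplicates("order_id")
orders["order_date"] = orders["order_ts"].dt.date.astype(str)

# Load, then run a quick SQL sanity check on the result.
orders.to_sql("fact_orders", conn, if_exists="replace", index=False)
print(pd.read_sql(
    "SELECT order_date, COUNT(*) AS n FROM fact_orders GROUP BY order_date",
    conn,
))
```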
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Jadavpur, Kolkata, West Bengal
On-site
Experienced Laravel Developer (West Bengal candidates only)

Job description
· Discussing project aims with the client and development team.
· Designing and building web applications using Laravel.
· Troubleshooting issues in the implementation and debugging builds.
· Working with back-end developers on projects.
· Testing functionality for users and the backend.
· Ensuring that integrations run smoothly.
· Scaling projects based on client feedback.
· Recording and reporting on work done in Laravel.
· Maintaining web-based applications.
· Presenting work in meetings with clients and management.

Laravel Developer Requirements
· A degree in programming, computer science, or a related field.
· Experience working with PHP, performing unit testing, and managing APIs such as REST.
· A solid understanding of application design using Laravel.
· Knowledge of database design and querying using SQL.
· Proficiency in HTML and JavaScript.
· A portfolio of applications and programs to your name.
· Problem-solving skills and a critical mindset.
· Great communication skills.
· The desire and ability to learn.
· Server maintenance skills.

Required skill sets
· A minimum of 1-2 years of experience.
· Experience with PHP 7+ and Laravel 6+.
· Experience with React, Angular, or Node will be an added advantage.
· Hands-on project experience in the Laravel framework.
· Good knowledge of jQuery, Ajax, and REST APIs.
· Proficiency in SQL scripting and MySQL 5.x. An advanced level of SQL (stored procedures/triggers/functions) will be an added advantage.
· Good understanding of object-oriented principles and MVC design patterns.

Role: Laravel Developer
Industry Type: IT Services
Education: Postgraduate or graduate (BCA, MCA, BTech, BSc, etc.)
Salary: Rs 14,000 to Rs 20,000 per month
Email: talentacquisition@devantitsolutions.com
Contact Number: 7605083834 (immediate joining)
Job Type: Full-time
Pay: ₹14,000.00 - ₹20,000.00 per month
Schedule: Day shift, fixed shift
Supplemental Pay: Yearly bonus
Experience: Laravel: 2 years (Preferred)
Language: Bengali (Preferred), English (Preferred)
Work Location: In person
Application Deadline: 17/06/2025
Expected Start Date: 18/06/2025
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Overview
The Snowflake Data Engineer will be responsible for designing, developing, and maintaining scalable data solutions on the Snowflake platform. The ideal candidate will have a solid understanding of data warehousing principles, advanced SQL skills, and experience with Snowflake-specific features and best practices.

Key Responsibilities
- Data Integration & ETL Processes:
  - Design and implement ETL processes to extract, transform, and load data from various sources into Snowflake.
  - Develop and maintain data pipelines using Snowflake's native features and third-party tools.
- Data Modeling:
  - Create and maintain data models and schemas that optimize performance and facilitate business intelligence.
  - Develop and implement data warehousing solutions that support analytical needs.
- Performance Optimization:
  - Monitor and optimize the performance of Snowflake queries and processes.
  - Implement best practices for data loading, querying, and indexing to ensure efficient data processing.
- Collaboration & Support:
  - Work closely with data analysts, data scientists, and other stakeholders to understand their data needs and provide effective solutions.
  - Troubleshoot and resolve data-related issues in collaboration with other team members.
- Documentation & Compliance:
  - Document data processes, schemas, and solutions to ensure clear understanding and knowledge sharing.
  - Ensure compliance with data governance policies and security standards.

Qualifications
- Experience:
  - Minimum of 4-6 years of experience as a Data Engineer or in a similar role.
  - Proven experience with Snowflake data warehousing solutions and ETL processes.
- Technical Skills:
  - Strong proficiency in SQL and Snowflake-specific SQL functions.
  - Experience with Snowflake's architecture, data modeling, and performance optimization.
  - Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their integration with Snowflake.
- Soft Skills:
  - Excellent problem-solving and analytical skills.
  - Strong communication and collaboration abilities.
  - Ability to work independently and manage multiple tasks effectively.
- Education:
  - Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

Preferred Qualifications
- Certifications related to Snowflake or cloud platforms (e.g., SnowPro Certification).
- Experience with data visualization tools such as Power BI.
- Knowledge of programming languages like Python, Java, or C#.
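As a hedged illustration of the load-and-query work described above, here is a small sketch using the snowflake-connector-python package. The account, credentials, warehouse, stage, and table names are placeholders, and the COPY INTO/SELECT statements are generic examples rather than this employer's pipelines.

```python
import snowflake.connector

# All connection values below are hypothetical placeholders.
conn = snowflake.connector.connect(
    user="ETL_USER",
    password="***",
    account="my_account",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# COPY INTO is Snowflake's bulk-load path from a stage (e.g., S3-backed);
# the stage and table names here are invented.
cur.execute("""
    COPY INTO raw_orders
    FROM @my_s3_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Quick sanity check after the load.
cur.execute("SELECT COUNT(*) FROM raw_orders")
print(cur.fetchone()[0])
conn.close()
```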
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Designation: Product Analyst - Conversion & Pricing Optimization
Location: Bangalore
Experience: 2-4 years

Job Summary
We are looking for a highly analytical and outcome-oriented Product Analyst to join our product team and drive continuous improvements in our app and web conversion funnel. You will collaborate with cross-functional teams, including Product, Engineering, Category, and Marketing, to uncover insights, design pricing and funnel experiments, and deliver measurable business impact.

Key Responsibilities
- Conversion Funnel Ownership: Analyze end-to-end user journeys to identify drop-off points, friction areas, and funnel inefficiencies across web and app platforms. Recommend and implement strategies to improve flow from discovery to checkout.
- Pricing Intelligence & Yield Strategy: Develop and refine algorithmic pricing logic to maximize yield, improve affordability perception, and increase conversion. Analyze the impact of pricing, discounts, and offers on user behavior and overall revenue.
- Marketing Conversion Analysis: Evaluate the effectiveness of marketing campaigns and landing pages in driving conversions. Collaborate with marketing teams to optimize attribution, targeting, and ROI from performance channels.
- Category Page & PDP Optimization: Monitor and improve engagement metrics and conversion on category pages and product detail pages (PDPs). Propose experiments around layout, filters, recommendation logic, and content to enhance performance.
- Experimentation & A/B Testing: Design, run, and interpret A/B tests related to funnel steps, pricing changes, and UI/UX modifications. Present findings with statistical confidence to influence product decisions (see the sketch after the skills list below).
- Analytics & Reporting: Build dashboards and automated reports to track key KPIs such as funnel performance, pricing yield, discount ROI, and engagement. Continuously monitor anomalies and opportunities using Mixpanel, GA, and internal data tools.

Skill Set Requirements
- Data Analysis & Querying: Strong command of SQL for data extraction and transformation. Working knowledge of Python or R is a plus.
- Analytics Tools: Experience with product analytics platforms like Mixpanel, Amplitude, CleverTap, or Google Analytics. Proficiency in data visualization tools such as Tableau, Looker, Power BI, or internal BI tools.
- Experimentation & Statistical Skills: Solid understanding of A/B testing, hypothesis testing, and statistical significance. Experience designing and interpreting controlled experiments and measuring impact on KPIs.
- Pricing & Business Modeling (Preferred): Understanding of pricing elasticity, yield optimization, and margins. Exposure to algorithmic or rule-based pricing systems is a strong plus.
- Product & UX Sensibility: Ability to understand user behavior across app and web journeys. Experience identifying conversion bottlenecks and proposing UX changes.
- Soft Skills: Structured problem-solving and analytical thinking. Strong communication and storytelling through data. Ability to work independently and with cross-functional stakeholders.

Qualifications
- 2-4 years of experience in product, data, or business analysis in a digital-first/e-commerce environment.
- Bachelor's degree in Engineering, Mathematics, Statistics, Economics, or related fields.
Skills: Mixpanel, structured problem-solving, Tableau, Amplitude, R, pricing elasticity, A/B testing, analytical thinking, data, CleverTap, Power BI, Python, Google Analytics, user behavior analysis, SQL, analytics, communication, storytelling through data, Looker, yield optimization, pricing analysis, statistical significance
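The A/B-testing responsibility flagged above ultimately rests on a significance test. Here is a minimal sketch of a two-proportion z-test with scipy; the visitor and conversion counts are made-up illustrative numbers.

```python
from math import sqrt

from scipy.stats import norm

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for conversion-rate lift."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))  # two-sided tail probability
    return z, p_value

# Invented numbers: 4.8% vs 5.4% conversion on 10,000 visitors per arm.
z, p = ab_test(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # compare p to 0.05 before calling the lift significant
```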
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
SQL Developer with SSIS (ETL Developer)
Location: Hyderabad (hybrid model)
Experience Required: 5+ years
Joining Timeline: Immediate to 20 days
Role Type: Individual Contributor (IC)

Position Summary
We are seeking a skilled SQL Developer with strong SSIS expertise to join a dynamic team supporting a leading US-based banking client. This is a hybrid role based in Hyderabad, suited for professionals experienced in building scalable, auditable ETL pipelines and collaborating within Agile teams.

Must-Have Skills
- SQL Development: Expert in writing complex T-SQL queries, stored procedures, joins, and transactions. Proficient in handling error logging and audit logic for production-grade environments (see the sketch below).
- ETL using SSIS: Strong experience in designing, implementing, and debugging SSIS packages using components like script tasks, event handlers, and nested packages.
- Batch Integration: Hands-on experience in managing high-volume batch data ingestion from various sources using SSIS, with performance and SLA considerations.
- Agile Delivery: Actively contributed to Agile/Scrum teams; participated in sprint planning, code reviews, and demos, and met sprint commitments.
- Stakeholder Collaboration: Proficient in engaging with business/product owners for requirement gathering, transformation validation, and output review. Excellent communication skills required.

Key Responsibilities
- Design and develop robust, auditable SSIS workflows based on business and data requirements.
- Ensure efficient deployment and maintenance using CI/CD tools like Jenkins or UCD.
- Collaborate with stakeholders to align solutions with business needs and data governance standards.
- Maintain and optimize SQL/SSIS packages for production environments, ensuring traceability, performance, and error handling.

Nice-to-Have Skills
- Cloud ETL (ADF): Exposure to Azure Data Factory or equivalent ETL tools.
- CI/CD (Jenkins/UCD): Familiarity with DevOps deployment tools and pipelines.
- Big Data (Spark/Hadoop): Understanding of, or integration experience with, big data systems.
- Other RDBMS (Oracle/Teradata): Experience in querying and integrating data from additional platforms.

Apply here: sapna@helixbeat.com
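To make the "auditable batch ingestion with error logging" requirement concrete, here is a hedged Python sketch of the pattern. The standard-library sqlite3 module stands in for SQL Server, the file and table names are invented, and a real implementation would typically live in an SSIS package with event handlers.

```python
import csv
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("bank_etl.db")
conn.execute("CREATE TABLE IF NOT EXISTS txn (id TEXT PRIMARY KEY, amount REAL)")
conn.execute(
    "CREATE TABLE IF NOT EXISTS etl_audit "
    "(batch_ts TEXT, rows_loaded INTEGER, rows_rejected INTEGER, status TEXT)"
)

loaded = rejected = 0
status = "FAILED"
try:
    with open("transactions.csv", newline="") as f:
        for row in csv.DictReader(f):
            try:
                conn.execute(
                    "INSERT INTO txn (id, amount) VALUES (?, ?)",
                    (row["id"], float(row["amount"])),
                )
                loaded += 1
            except (ValueError, sqlite3.IntegrityError):
                rejected += 1  # bad row: count it and continue, don't abort the batch
    status = "SUCCESS"
except Exception:
    conn.rollback()  # unexpected failure: undo the partial batch
    raise
finally:
    # Audit row is written whether the batch succeeded or failed,
    # giving the traceability the posting asks for.
    conn.execute(
        "INSERT INTO etl_audit VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), loaded, rejected, status),
    )
    conn.commit()
```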
Posted 1 week ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Charles Technologies is a dynamic startup based in Chennai, dedicated to creating innovative mobile applications that transform user experiences. We are looking for a talented and experienced MERN Stack Developer to join our team and lead the development of innovative web and mobile applications.

Qualifications
- Education: BE in Computer Science, Information Technology, or B.Tech in an IT-related field is required. A Master's degree is a plus. Relevant certifications are also a plus.
- Experience: Minimum of 2 years of total experience in full stack application development. Extensive experience working with startups, small teams, and in fast-paced environments is highly desirable.
- Foundational Knowledge: Strong understanding of software engineering principles, product development, and web/mobile application development best practices.

Technical Skills
- JavaScript: Expert-level proficiency in JavaScript, including ES6+ features, asynchronous programming, and modern frameworks.
- React Native: Extensive experience in developing cross-platform mobile applications using React Native, including performance optimization and native module integration.
- React: Advanced expertise in React for front-end development, including hooks, the context API, state management libraries like Redux, and component lifecycle management.
- Node.js: Solid knowledge of Node.js for backend development, including experience with Express.js, RESTful API design, and asynchronous programming patterns.
- Azure Cosmos DB: Extensive experience with Azure Cosmos DB for scalable and efficient data management, including partitioning, indexing, querying, and performance tuning.
- Azure Cloud Services: Proficiency in deploying and managing applications on Azure Cloud Services, including Azure App Services, Azure Functions, Azure Storage, and monitoring tools.
- Git: Proficient in version control systems like Git, including branching, merging strategies, pull request workflows, and conflict resolution.
- Azure DevOps: Experience with Azure DevOps for CI/CD pipelines, project management, automated testing, and release management.
- API Integration: Experience in integrating RESTful APIs and third-party services, including OAuth, JWT, and other authentication and authorization mechanisms.
- UI/UX Design: Understanding of UI/UX design principles and the ability to collaborate with designers to implement responsive, accessible, and user-friendly interfaces.

Responsibilities
- Full Stack Development: Develop and maintain high-quality web and mobile applications using React Native, React, and Node.js, ensuring code quality, performance, and scalability.
- Backend Development: Implement backend services and APIs using Node.js, ensuring scalability, security, and maintainability.
- Database Management: Manage and optimize databases using Azure Cosmos DB, including data modelling, indexing, partitioning, and performance tuning.
- Version Control: Use Git for version control, including branching, merging, and pull request workflows. Conduct peer code reviews to ensure code quality and share knowledge with team members.
- CI/CD Pipelines: Set up and maintain CI/CD pipelines using Azure DevOps, including automated testing, deployment, monitoring, and rollback strategies.
- Peer Code Reviews: Participate in peer code reviews to ensure adherence to coding standards, identify potential issues, and share best practices.
- Performance Optimization: Optimize application performance and ensure responsiveness across different devices and platforms, including profiling, debugging, and performance tuning.
- Collaboration: Work closely with designers, product owners, and other developers to deliver high-quality applications. Participate in agile development processes, including sprint planning, stand-ups, and retrospectives.
- Testing and Debugging: Conduct thorough testing and debugging to ensure the reliability and stability of applications, including unit testing, integration testing, and end-to-end testing.
- Documentation: Create and maintain comprehensive documentation for code, APIs, and development processes, including technical specifications and user guides.
- Continuous Improvement: Stay updated with the latest industry trends and technologies, and continuously improve development practices. Participate in knowledge-sharing sessions and contribute to the growth of the team.

Perks & Benefits
- Central Location: Conveniently located in the heart of the city, with parking facilities and well served by public transport, including buses and Chennai Metro.
- Meals and Refreshments: Lunch, tea/coffee, snacks, and refreshments provided throughout the day.
- Insurance: TATA AIG Family Group Insurance for INR 5.0 Lakhs (coverage: self + spouse + up to 3 children).
- Professional Development: Opportunities for continuous learning and growth.
- Team Outings and Events: Regular team-building activities and events.
- Employee Recognition: Programs to acknowledge and reward outstanding performance.

How to Apply: Interested candidates can apply through LinkedIn or email us at careers@charles-technologies.com. Join us at Charles Technologies and be a part of a team that is shaping the future of mobile applications!
Posted 1 week ago
0.0 years
0 Lacs
Telangana
On-site
Requirements:
1) Bachelor's degree
2) 12-24 months of work experience
3) Good communication skills: the Trans Ops Representative will be facilitating the flow of information between external and internal stakeholders
4) Proficiency in Excel (pivot tables, vlookups); see the pandas sketch after this posting
5) Demonstrated ability to work in a team in a very dynamic environment

NOC (Network Operation Center) is the central command and control center for 'Transportation Execution' across the Amazon supply chain network, supporting multiple geographies such as NA, India, and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FCs) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all stakeholders informed on the proceedings. Along with this tactical problem solving, NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon's ability to serve its customers on time.

Purview of a Trans Ops Representative
A Trans Ops Representative at NOC facilitates the flow of information between different stakeholders (carriers/hubs/warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at NOC works across two verticals: Inbound and Outbound operations. Inbound Operations deals with the vendor/carrier/FC relationship, ensuring that freight is picked up on time and delivered at the FC as per the given appointment; the specialist addresses any potential issues occurring during the lifecycle from pick-up to delivery. Outbound Operations deals with the FC/carrier/carrier-hub relationship, ensuring that the truck leaves the FC in time to deliver customer orders as promised; the specialist addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching the customer's premises.

A Trans Ops Representative provides timely resolution to the issue at hand by researching and querying internal tools and by making real-time decisions. An ideal candidate should be able to understand the requirements, analyze data, notice trends, and drive customer experience without compromising on time. The candidate should have a basic understanding of logistics and should be able to communicate clearly in written and oral form.

Key job responsibilities
A Trans Ops Representative should be able to ideate process improvements and should have the zeal to drive them to conclusion. Responsibilities include, but are not limited to:
- Communication with external customers (carriers, vendors/suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers).
- Systematically escalating problems or variances in information and data to the relevant owners and teams, and following through on resolutions to ensure they are delivered.
- Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and variances to goals, and present these findings in a review forum.
- Providing real-time customer experience by working in a 24x7 operating environment.
Basic Qualifications
- Graduate with a Bachelor's degree
- Good logical skills
- Good communication skills: the Trans Ops Representative will be facilitating the flow of information between different teams

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
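As flagged in the requirements above, here is a hedged pandas analog of the Excel pivot-table and VLOOKUP work a Trans Ops Representative might do; the shipment data, carrier names, and lanes are invented.

```python
import pandas as pd

# Invented shipment data standing in for NOC exception reports.
shipments = pd.DataFrame({
    "carrier": ["A", "A", "B", "B", "B"],
    "lane": ["FC1-HUB1", "FC1-HUB2", "FC1-HUB1", "FC2-HUB1", "FC2-HUB1"],
    "late": [0, 1, 0, 1, 1],
})
carriers = pd.DataFrame({"carrier": ["A", "B"], "region": ["North", "South"]})

# VLOOKUP equivalent: join each shipment to its carrier's region.
enriched = shipments.merge(carriers, on="carrier", how="left")

# Pivot-table equivalent: late-shipment rate by region and lane.
pivot = enriched.pivot_table(index="region", columns="lane", values="late", aggfunc="mean")
print(pivot)
```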
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Request Id: 304431-1
Assignment Sponsor: Ansari, Yeheya
LOB: ET
Job Position (as per Beeline): Full Stack Developer - Advanced
Qty (no. of positions): 1
Job Location (city and building): Embassy Tech Village - Parcel 5, Bangalore
Duration of Contract (min 6 months): 8 months
Shift Timing (IST): 11 AM - 8 PM
Reason for Hire: New project hire
Experience Required (total and relevant): 5+ years
Education Qualification: Bachelors
Target Organization/Domain/Industry: Finance, Risk Management (optional)
No. of Interview Rounds: 2 rounds (1 face-to-face)
Interview Panel Name and Availability:
Primary Skills (mandatory top 3): Java, ReactJS, AWS
Secondary Skills (good to have):
Hybrid (3 days WFO) / 5 days WFO: 5 days WFO
CTC / Daily Bill Rate (INR): 18,215.00
Laptop & WiFi: Laptop and WiFi mandatory for all roles

Roles and Responsibilities / Job Description
- Formal training or certification on software engineering concepts and 5+ years of applied experience with Java 8 or above, the Spring Framework (Security, Boot, JPA, REST), and microservices
- Hands-on practical experience in system design, application development, testing, and operational stability
- Hands-on experience with AWS and EKS
- Working knowledge of Unix, databases, and SQL
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Demonstrable ability to code in one or more languages
- Experience across the whole software development life cycle
- Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security
- Experience with ReactJS is a must
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills: C#, Entity Framework, SQL Server, RESTful APIs, Web API, Language Integrated Query (LINQ), Dependency Injection, AngularJS

Company Overview
Chiselon Technologies Pvt Ltd is a dynamic software development and consulting firm with a talented team of 11-50 employees. Headquartered in Hyderabad, Chiselon is recognized for delivering cutting-edge IT services and consulting solutions. Explore more about us at www.chiselontechnologies.com.

Job Overview
We are seeking a Senior Dotnet Developer to join our team in Hyderabad or Bengaluru. This is a full-time, hybrid role, ideal for a professional with expertise in developing web applications using .NET technologies.

Qualifications and Skills
- Proficiency in the C# programming language, with a strong understanding of object-oriented principles.
- Experience in designing and implementing APIs using RESTful services for scalable applications.
- Hands-on experience with Entity Framework for data manipulation and transaction management.
- Strong familiarity with SQL Server for database design, optimization, and query execution.
- In-depth knowledge of Language Integrated Query (LINQ) for effective data querying in C#.
- Expertise in using dependency injection frameworks to enhance code testability and maintenance.
- Demonstrated ability to work with Web API (mandatory skill) and architect reliable web services.
- Proficiency in AngularJS (mandatory skill) for developing dynamic client-side applications.

Roles and Responsibilities
- Design, develop, test, and maintain web applications using .NET and associated technologies.
- Collaborate with cross-functional teams to determine application requirements and specifications.
- Implement data storage solutions using SQL Server, ensuring efficiency and scalability.
- Utilize RESTful APIs to create seamless interaction between server and client-side applications.
- Provide technical oversight and mentorship to junior developers and team members.
- Ensure software quality through rigorous testing and code reviews, adhering to best practices.
- Contribute to architectural discussions, ensuring robust and efficient application design.
- Stay updated with emerging industry trends and technologies, integrating relevant advancements into projects.
Posted 1 week ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Full-time

Job Description
Software Engineer, Chennai

Our NielsenIQ Technology teams are working on revamping multiple platforms into a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on NielsenIQ's data and insights to innovate and grow. As a Python Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and prototyping cutting-edge technologies. As a market research company, we have many data analytics and machine learning requirements implemented across various applications; our CDAR platform has various use cases that incorporate AI/ML and data analytics. Our team is co-located and agile, with central technology hubs in Chicago, Madrid, Toronto, Chennai, and Pune.

Responsibilities
- Understand user needs and how they fit into the overall, global solution design.
- Prototype new features and integrations aligned to business strategy, introducing innovation through technology.
- Follow source-control and test-driven development best practices.
- Troubleshoot and identify root causes while resolving issues.
- Write complex, maintainable code to develop scalable, flexible, and user-friendly applications.
- Import/collect, clean, convert, and analyze data for the purpose of finding insights and drawing conclusions.
- Train models, fine-tune parameters for maximum efficiency, and deploy models.
- Actively participate in building algorithms for solving complex problems, from design through development.
- Take ownership of projects and ensure timely deliveries.
- Collaborate with diverse teams across time zones.

Qualifications
- Minimum of 2 years of experience with large-scale production systems and languages such as Python/R.
- Minimum B.S. degree in Computer Science, Computer Engineering, or a related field with a focus on machine learning.
- Strong software engineering skills and understanding of the ML lifecycle, with a minimum of 2 years' experience in ML production systems and software development.
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas.
- Fluency in processing data with pandas (e.g., querying, transforming, joining, cleaning), including experience debugging logic and performance issues (see the sketch at the end of this posting).
- Strong understanding of machine learning algorithms, with experience writing, debugging, and optimizing ML data structures, pipelines, and transformations.
- Knowledge of statistics, probability, or a related discipline.
- Extensive data modelling and data architecture skills.
- Strong knowledge of version control tools, preferably Bitbucket.
- Basic knowledge of Linux/Unix environments (basic commands, shell scripting, etc.).
- Demonstrated ability to work as part of a global team.
- Strong troubleshooting and problem-solving skills.
- Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business.

Preferred Qualifications
- Bachelor's degree or equivalent in Computer Science or a related field with a focus on machine learning.
- Experience using collaboration technologies: Azure DevOps, TFS, Jira, Confluence.
- Experience using the Atlassian tool suite, including JIRA, Confluence, and Bitbucket.

Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth.
In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
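To illustrate the pandas fluency called out in the Qualifications above (querying, transforming, joining, cleaning), here is a minimal sketch on invented retail-style data; the columns and values are placeholders, not NIQ data.

```python
import pandas as pd

# Invented retail sales data with a missing store id and a missing measure.
sales = pd.DataFrame({
    "store_id": [1, 1, 2, 2, None],
    "units": [10, 12, 7, None, 5],
    "week": ["2025-01-06", "2025-01-13", "2025-01-06", "2025-01-13", "2025-01-06"],
})
stores = pd.DataFrame({"store_id": [1, 2], "banner": ["Alpha", "Beta"]})

clean = sales.dropna(subset=["store_id", "units"])   # cleaning
clean = clean.astype({"store_id": int})
joined = clean.merge(stores, on="store_id", how="left")  # joining
weekly = joined.groupby(["banner", "week"], as_index=False)["units"].sum()  # transforming
print(weekly.query("units > 10"))                    # querying
```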
Posted 1 week ago
The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the querying domain, a typical career progression may look like:
- Junior Querying Analyst
- Querying Specialist
- Senior Querying Consultant
- Querying Team Lead
- Querying Manager
Apart from strong querying skills, professionals in this field are often expected to have expertise in:
- Database management
- Data visualization tools
- SQL optimization techniques (see the sketch below)
- Data warehousing concepts
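As a concrete taste of the "SQL optimization techniques" skill above, here is a minimal sketch showing how adding an index changes a query plan; sqlite3 is used for portability, and the table and data are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

# Without an index, the planner falls back to a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# After indexing the filtered column, the plan switches to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```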
As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!