Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Company Description
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population.

Job Description
Develop and implement data quality strategies and roadmaps
Establish and maintain a robust data quality framework
Implement and maintain data quality assurance processes
Monitor and report on data quality metrics
Set up automation based on data quality rules, processes, and best practices
Investigate and perform root cause analysis of data-related issues, collaborating with relevant teams to determine underlying causes and implement effective solutions
Create and maintain documentation for data quality initiatives
Collaborate with stakeholders to define data quality standards and metrics
Provide guidance, mentorship, training, and support to team members

Qualifications
3+ years of relevant work experience in data analysis, data quality assurance, or data governance
Experience implementing and maintaining data quality frameworks (Airflow DAGs, Grafana, Great Expectations, dbt data quality tools) with effort and cost-effectiveness in mind
Proficiency in SQL and Python, or similar languages, for querying and manipulating data
Experience with an automation approach for an AWS/dbt/Snowflake stack, and with automation tools
Experience setting up automation frameworks and CI/CD pipelines
Strong understanding of data quality principles and methodologies: data profiling, generation, validation, and scorecards
Experience with ETL (Extract, Transform, Load) processes and tools
Strong analytical and problem-solving skills, with attention to detail
Experience with relational and document databases
Ability to create comprehensive documentation of data quality processes, standards, and issues
Ability to convey complex technical concepts to both technical and non-technical stakeholders
Ability to work with big data technologies
Proficiency in Snowflake SQL, Databricks, or BigQuery syntax and their unique features

Additional Information

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
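The framework tools named in this posting (Great Expectations, dbt tests, Airflow-scheduled checks) all come down to declarative rules evaluated against a table. As a rough, framework-agnostic illustration only, with a hypothetical table, columns, and thresholds, a minimal rule set in Python/pandas might look like this:

```python
import pandas as pd

# Hypothetical daily sales extract; in practice this would be read from
# Snowflake or S3 rather than constructed inline.
df = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "store_id": [10, 10, 11, None],
    "amount":   [25.0, 13.5, -2.0, 40.0],
})

def check_not_null(frame: pd.DataFrame, column: str) -> bool:
    """Rule: column must contain no NULLs."""
    return frame[column].notna().all()

def check_unique(frame: pd.DataFrame, column: str) -> bool:
    """Rule: column values must be unique (primary-key style check)."""
    return frame[column].is_unique

def check_min(frame: pd.DataFrame, column: str, minimum: float) -> bool:
    """Rule: all values must be >= minimum."""
    return (frame[column] >= minimum).all()

results = {
    "order_id not null": check_not_null(df, "order_id"),
    "order_id unique":   check_unique(df, "order_id"),
    "store_id not null": check_not_null(df, "store_id"),  # fails: one NULL
    "amount >= 0":       check_min(df, "amount", 0.0),     # fails: -2.0
}

for rule, passed in results.items():
    print(f"{rule}: {'PASS' if passed else 'FAIL'}")
```

In a production setup, rules of this kind would typically live in a tool such as Great Expectations or dbt tests, scheduled via Airflow and surfaced on Grafana dashboards, as the qualifications above describe.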
Posted 2 weeks ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within Asset & Wealth Management, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job Responsibilities
Executes standard software solutions, design, development, and technical troubleshooting
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Adds to team culture of diversity, equity, inclusion, and respect

Required Qualifications, Capabilities, And Skills
Formal training or certification on software engineering concepts and 2+ years applied experience
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Demonstrable ability to code in one or more languages
Experience across the whole Software Development Life Cycle
Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security
Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)

Preferred Qualifications, Capabilities, And Skills
Familiarity with modern front-end technologies
Exposure to cloud technologies
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: ETL Tester

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: ETL Tester
Location: Pune
Experience: 5+ years

Key Responsibilities:
Extensive experience in validating ETL processes, ensuring accurate data extraction, transformation, and loading across multiple environments.
Proficient in Java programming, with the ability to understand and write Java code when required.
Advanced skills in SQL for data validation, querying databases, and ensuring data consistency and integrity throughout the ETL process.
Expertise in utilizing Unix commands to manage test environments, handle file systems, and execute system-level tasks.
Proficient in creating shell scripts to automate testing processes, enhancing productivity and reducing manual intervention.
Ensuring that data transformations and loads are accurate, with strong attention to identifying and resolving discrepancies in the ETL process.
Focused on automating repetitive tasks and optimizing testing workflows to increase overall testing efficiency.
Write and execute automated test scripts using Java to ensure the quality and functionality of ETL solutions.
Utilize Unix commands and shell scripting to automate repetitive tasks and manage system processes.
Collaborate with cross-functional teams, including data engineers, developers, and business analysts, to ensure the ETL processes meet business requirements.
Ensure that data transformations, integrations, and pipelines are robust, secure, and efficient.
Troubleshoot data discrepancies and perform root cause analysis for failed data loads.
Create comprehensive test cases, execute them, and document test results for all data flows.
Actively participate in the continuous improvement of ETL testing processes and methodologies.
Experience with version control systems (e.g., Git) and integrating testing into CI/CD pipelines.

Tools & Technologies (Good to Have):
Experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, and Spark for handling large-scale data processing and storage.
Knowledge of NiFi for automating data flows, transforming data, and integrating different systems seamlessly.
Experience with tools like Postman, SoapUI, or RestAssured to validate REST and SOAP APIs, ensuring correct data exchange and handling of errors.
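Much of the ETL validation described above reduces to reconciling a source extract against the loaded target: row counts, key completeness, and column-level aggregates. A minimal, hedged sketch of that idea in Python/pandas follows; the table layouts and tolerance are hypothetical, not Capco's actual framework:

```python
import pandas as pd

# Hypothetical source extract and target (post-load) snapshots.
source = pd.DataFrame({"txn_id": [1, 2, 3], "amount": [100.0, 50.0, 75.0]})
target = pd.DataFrame({"txn_id": [1, 2, 3], "amount": [100.0, 50.0, 74.0]})

def reconcile(src: pd.DataFrame, tgt: pd.DataFrame, key: str, measure: str,
              tolerance: float = 1e-6) -> list:
    """Return a list of human-readable discrepancies between source and target."""
    issues = []
    if len(src) != len(tgt):
        issues.append(f"row count mismatch: source={len(src)} target={len(tgt)}")
    missing = set(src[key]) - set(tgt[key])
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")
    if abs(src[measure].sum() - tgt[measure].sum()) > tolerance:
        issues.append(f"{measure} total mismatch: "
                      f"source={src[measure].sum()} target={tgt[measure].sum()}")
    return issues

for issue in reconcile(source, target, key="txn_id", measure="amount"):
    print("FAIL:", issue)  # e.g. amount total mismatch: source=225.0 target=224.0
```

The same checks are usually expressed as SQL run against both systems and wired into the CI/CD pipeline the posting mentions.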
Posted 2 weeks ago
3.0 years
0 Lacs
India
Remote
Halo believes in innovation by inclusion to solve digital problems. As an international agency of over 200 people specializing in interactive media strategy and development, we embrace equity and empowerment in a serious way. Our interdisciplinary teams of unique designers, developers and entrepreneurial minds with a variety of backgrounds, viewpoints, and skills connect to solve business challenges of every shape and size. We empathize to form deep, meaningful relationships with our clients so they can do the same with their audience. Working at Halo feels like belonging. Learn more about our philosophy, benefits, and team at https://halopowered.com/

As an AI Architect, you will lead the design of scalable, secure, and modern technology solutions, leveraging artificial intelligence, cloud platforms, and microservices—while ensuring alignment with AI governance principles, agile delivery, and platform modernization strategies.

As a Data Scientist, you'll be part of a multidisciplinary team applying advanced analytics, machine learning, and generative AI to solve real-world problems across our consulting, health, wealth, and career businesses. You will collaborate closely with engineering, product, and business stakeholders to develop scalable models, design intelligent pipelines, and influence data-driven decision-making across the enterprise.

Requirements
Design, develop, and deploy robust machine learning models and data pipelines that support AI-enabled applications
Apply exploratory data analysis (EDA) and feature engineering techniques to extract insights and improve model performance
Collaborate with cross-functional teams to translate business problems into analytical use cases
Contribute to the full machine learning lifecycle: from data preparation and model experimentation to deployment and monitoring
Work with structured and unstructured data, including text, to develop NLP and generative AI solutions
Define and enforce best practices in model validation, reproducibility, documentation, and versioning
Partner with engineering to integrate models into production systems using CI/CD pipelines and cloud-native services
Stay current with industry trends, emerging techniques (e.g., RAG, LLMs, embeddings), and relevant tools

Required Skills & Qualifications
3+ years of experience in Data Science, Machine Learning, or Applied AI roles
Proficiency in Python (preferred) and a strong grasp of pandas, NumPy, and scikit-learn
Skilled in data querying, manipulation, and pipeline development using SQL and modern ETL frameworks
Experience working with Databricks, including notebooks, MLflow, Delta Lake, and job orchestration
Experience with Git-based workflows and Agile methodologies
Strong analytical thinking, problem-solving skills, and communication abilities
Exposure to Generative AI, LLMs, prompt engineering, or vector-based search
Hands-on experience with cloud platforms (AWS, Azure, or GCP) and deploying models in scalable environments
Knowledge of data versioning, model registry, and ML lifecycle tools (e.g., MLflow, DVC, SageMaker, Databricks, or Vertex AI)
Experience working with visualization tools like Tableau, Power BI, or Qlik
Degree in Computer Science, Data Science, Applied Mathematics, or a related field

Benefits
100% Remote Work
Salary in USD
Get to work on challenging projects for the U.S.
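The role above centres on the standard pandas/scikit-learn workflow: exploratory analysis, feature engineering, model fitting, and validation. The following is a minimal illustrative sketch only, using synthetic data and hypothetical feature names rather than any actual Halo pipeline:

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a modelling dataset.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "tenure_months": rng.integers(1, 120, size=500),
    "monthly_spend": rng.normal(60, 20, size=500),
})
# Hypothetical target: churn is more likely for short tenure and low spend.
df["churned"] = ((df["tenure_months"] < 12) & (df["monthly_spend"] < 55)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["tenure_months", "monthly_spend"]], df["churned"],
    test_size=0.25, random_state=0, stratify=df["churned"])

# A Pipeline keeps preprocessing and the model together, which supports the
# reproducibility and versioning practices the posting calls for.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout ROC AUC: {auc:.3f}")
```

In practice a run like this would be logged with MLflow on Databricks and promoted through CI/CD, per the qualifications listed above.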
Posted 2 weeks ago
0 years
0 Lacs
Chandigarh, India
On-site
Job Summary
We are seeking a results-oriented and strategic team leader of Business Analytics to lead our Reporting Analytics team in harnessing data to drive business success. This role requires a blend of analytical expertise, leadership capabilities, and a deep understanding of business operations. The ideal candidate will empower teams to extract meaningful insights from data, support key initiatives, and foster a data-driven culture across the organization.

Key Responsibilities

Team Leadership and Development
Recruit, train, and develop a high-performing analytics team, providing coaching and mentorship to foster professional growth.
Conduct regular performance reviews, setting clear objectives and providing feedback to enhance team effectiveness.
Responsible for the deliverables of the Reporting Analytics team, including leading 2-3 Associate Process Managers and a span of a 10-15 member team of Reporting Analysts and Sr. Reporting Analysts.

Data Strategy and Governance
Define and execute a comprehensive data analytics strategy that aligns with organizational goals and industry best practices.
Establish data governance protocols to ensure data quality, security, and compliance with relevant regulations.

Business Insights and Analysis
Partner with stakeholders to understand their needs, challenges, and goals, translating these into analytical projects that drive value.
Conduct exploratory data analysis to uncover trends, patterns, and insights that inform strategic decisions.

Reporting, Visualization, and Communication
Design and implement effective reporting frameworks, utilizing advanced data visualization tools to present findings to diverse audiences.
Prepare and deliver presentations to senior leadership, articulating insights and recommendations in a clear and actionable manner.

Performance Metrics and Dashboards
Develop and monitor dashboards and performance metrics to track the effectiveness of business initiatives, providing timely updates to stakeholders.
Identify opportunities for improving operational efficiency through data-driven recommendations.

Project Management
Lead cross-functional analytics projects, coordinating resources and timelines to ensure successful project delivery.
Manage project budgets and resources effectively, ensuring alignment with strategic priorities.

Continuous Improvement and Innovation
Stay abreast of industry trends, emerging technologies, and best practices in analytics, applying this knowledge to enhance team capabilities.
Foster a culture of continuous improvement by encouraging team members to explore new methodologies and technologies.

Leadership and Interpersonal Skills
Strong leadership qualities with the ability to influence and inspire cross-functional teams.
Excellent interpersonal skills with a knack for building relationships at all levels of the organization.

Analytical and Problem-Solving Skills
Strong analytical mindset with the ability to think critically and strategically to solve complex business problems.

Technical Skills

Data Analysis and Reporting Tools
Must have: Proficiency in Microsoft Excel, PowerPoint, VBA and SQL, including advanced functions and data visualization features.
Good to have: Knowledge of BI tools such as SSIS packages, Tableau or Power BI for creating interactive dashboards and reports.
Good to have: Understanding of statistical methods, forecasting, and predictive analysis.

Database Management
Must have: Strong knowledge of SQL for querying databases and extracting data.
Must have: Familiarity with database management systems like Microsoft SQL Server.

Data Visualization
Must have: Proficiency in visualization tools like Excel dashboards, PowerPoint, and Tableau to create meaningful visual representations of data. Ability to create visually appealing and informative reports and dashboards that convey insights effectively. Understanding of how to design effective visualizations and best practices in data visualization design.

ETL Processes
Must have: Knowledge of Extract, Transform, Load (ETL) processes and tools (e.g., SSIS packages) for data integration and preparation.

Data Warehousing
Good to have: Understanding of data warehousing concepts and architecture.
Good to have: Experience with data warehouse technologies and methodologies.

Scripting Languages
Must have: Proficiency in scripting languages like VBA and SQL for automating reporting tasks and data manipulation.

About The Team
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 2 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course)
5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Basics of job schedulers like Autosys. Basics of entitlement management

Certification on any of the above topics would be an advantage.
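The ETL and analytical-modelling items above usually come down to a grouped transformation that turns raw events into a consumable data product. A compressed sketch of that pattern, assuming a PySpark environment is available; the bucket paths and column names are illustrative only, not this team's actual pipeline:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_balances").getOrCreate()

# Hypothetical raw transactions landed by an upstream ingestion job.
txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Analytical model: one row per account per day with net movement,
# the kind of consumer-facing data product the posting describes.
daily_balances = (
    txns
    .withColumn("txn_date", F.to_date("txn_timestamp"))
    .groupBy("account_id", "txn_date")
    .agg(
        F.sum("amount").alias("net_movement"),
        F.count("*").alias("txn_count"),
    )
)

# Partitioned write keeps downstream queries cheap and supports reprocessing.
(daily_balances
    .write.mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-bucket/curated/daily_balances/"))
```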
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a skilled and proactive Software Engineer to join our IoT engineering team, focused on building intelligent asset tracking systems using BLE beacons and RFID sensors. The ideal candidate will have solid experience in BLE technology, RFID technology, integration of sensors using GPIO terminals, backend development using Java Spring Boot, Python scripting, and algorithm design for real-time asset localization and monitoring. This role is ideal for individuals passionate about IoT systems and edge-to-cloud software integration.

Responsibilities
Design and develop BLE and RFID reader-integrated asset tracking solutions.
Interface with RFID readers and GPIO-based sensors to monitor asset movement and presence.
Develop a scalable application using Java Spring Boot to manage device communication, data ingestion, and user-facing APIs.
Implement advanced signal processing and filtering algorithms (e.g., MAD, trilateration, interference detection) for accurate asset location estimation.
Integrate the BLE gateway and RFID sensor data into a real-time asset tracking system.
Configure GPIO pins for sensor input, RFID status tracking, and alert generation.
Work with MySQL and H2 databases for data storage, querying, and analytics.
Develop automated tests and diagnostics to ensure system reliability and robustness.
Collaborate with hardware, network, and data science teams for end-to-end solution delivery.

Requirements
2+ years of software development experience with Java (Spring Boot) and Python.
Hands-on experience with BLE beacons and RFID reader integration, including working with GPIO interfaces.
Strong understanding of signal processing techniques such as RSSI filtering, trilateration, and proximity estimation.
Experience integrating hardware (RFID, sensors) with software systems and protocols.
Proficient in MySQL, and optionally H2 database for embedded or lightweight deployments.
Strong debugging, problem-solving, and algorithmic thinking skills.

This job was posted by Ruchi Banthiya, Head of Human Resources at ProdEx Technologies.
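RSSI filtering and trilateration, named in the requirements above, are standard techniques: convert each beacon's received signal strength to an approximate distance with a log-distance path-loss model, then solve for the position that best fits the distances to known anchor locations. A rough sketch follows, assuming NumPy/SciPy are available; the calibration constants (tx_power, path-loss exponent) and anchor coordinates are illustrative, not values from this product:

```python
import numpy as np
from scipy.optimize import least_squares

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0, n: float = 2.0) -> float:
    """Log-distance path-loss model: distance in metres from a filtered RSSI reading.
    tx_power_dbm is the calibrated RSSI at 1 m; n is the environment-dependent exponent."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

# Known gateway/anchor positions (metres) and median-filtered RSSI readings (dBm).
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
rssi = np.array([-65.0, -72.0, -70.0])
distances = np.array([rssi_to_distance(r) for r in rssi])

def residuals(p: np.ndarray) -> np.ndarray:
    # Gap between the distance implied by candidate position p and the measured distance.
    return np.linalg.norm(anchors - p, axis=1) - distances

# Least-squares trilateration, started from the centroid of the anchors.
estimate = least_squares(residuals, x0=anchors.mean(axis=0)).x
print(f"estimated tag position: x={estimate[0]:.2f} m, y={estimate[1]:.2f} m")
```

A median or MAD-based filter over a sliding window of RSSI samples would typically precede this step, since raw BLE RSSI is noisy.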
Posted 2 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Purpose
We, at Jet2 (UK’s third largest airline and the largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. We are looking for a Test Lead to join our team.

Responsibilities
Hands-on experience in manual testing with strong knowledge of a Content Management System – Sitecore
Strategize the UI automation scope and ensure smooth execution
Process-oriented person capable of decision-making.
Prior experience in leading QA teams with a track record of successfully validating and delivering high-quality and scalable products.
Solid experience in full stack web application testing
Experience with software testing metrics, test planning, and different levels, types and methods of testing
Strong attention to detail and quality, and the ability to direct the team to adopt best practices.
Ability to adapt quickly to changes and maintain high team morale and efficiency.
Ability to work collaboratively with teams to address their needs, motivate them throughout the project, and identify opportunities to add their support directly when needed.
Ability to provide oversight and analysis of work estimates and defend and articulate work efforts.
Ensure the team adheres to established test case standardization & best practices.
Ability to coordinate training sessions for senior and junior team members.
Ability to work collaboratively in teams with other specialized individuals.
Instill team empowerment, ownership, and accountability.
Able to work in a fast-paced, technical environment with minimal oversight and in a professional manner.
Ability to measure team and individual performance through standardized key performance indicators, and the ability to push the quality bar higher.
Staying on top of cutting-edge testing techniques and trends, implementing the technologies, and ensuring higher quality products on projects.
Build test plans, test scenarios, and test data to support development projects, project requirements, and design documents.
Identify regression testing needs and create and maintain a regression suite.
Work as part of cross-functional teams spread across India and the UK for developing and implementing Test best practices in an Agile environment.

Essential Qualifications
10+ years of QA Engineering and strong knowledge of a Content Management System (preferably Sitecore)
Proficient in any programming language (ideally TypeScript) and experience in any automation tool, preferably Playwright
2+ years managing software testing teams and writing clear, concise, and comprehensive test strategies and plans
2+ years working in a DevOps and/or Agile environment
Good knowledge of different .NET frameworks and Model/View/Controller (MVC) architecture
Good understanding of database querying using SQL Server.
Experience in Web Application Testing and System Integration Testing.
Prior experience working with UK teams.
Good analysis and troubleshooting skills.
Experience working with Agile/SCRUM/Kanban methodologies
Maintain and develop pertinent operational statistics and results reporting
Support and contribute to Business Development initiatives
Research escalated issues to deliver coaching opportunities
Bachelors/Masters/Doctorate in Computer Science or equivalent
Perform other duties as assigned

Nice To Have
Exposure to Playwright-based UI automation using TypeScript
Exposure to the SAFe methodology
Prior experience with Azure or any other cloud platform, TFS, Octopus is a plus.
Exposure to the containerized deployment model
Ability to work in a cross-functional and cross-domain environment and drive discussions with various leaders across the company
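The posting prefers Playwright driven from TypeScript; purely as an illustration of the same test shape, here is a sketch using Playwright's Python API (requires the playwright package plus installed browsers; the URL, selectors, and expected behaviour are hypothetical, not Jet2's actual site or suite):

```python
from playwright.sync_api import sync_playwright, expect

def test_search_returns_results() -> None:
    """Smoke test: a destination search surfaces at least one result card."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://www.example.com/")            # hypothetical site under test
        page.fill("input[name='destination']", "Tenerife")
        page.click("button[type='submit']")
        results = page.locator(".result-card")            # hypothetical selector
        expect(results.first).to_be_visible()
        browser.close()

if __name__ == "__main__":
    test_search_returns_results()
```

In a CI/CD setup, tests like this are usually collected by a runner such as pytest and executed on every pull request, which matches the regression-suite responsibility above.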
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Requirements
2-4 years of Analytics with predominant experience in SQL, Python, Tableau, and Google Sheets.
Experienced in writing complex SQL select queries (window functions and CTEs) with advanced SQL experience.
Strong in querying logic and data interpretation.
Solid communication, Business Acumen, and articulation skills.
Able to handle stakeholders independently with fewer interventions.
Develop strategies to solve problems in logical yet creative ways.

We Would Be Excited If You Had
Excellent communication and interpersonal skills.
Ability to meet deadlines and manage project delivery.
Excellent report-writing and presentation skills.
Critical thinking and problem-solving capabilities.

This job was posted by Aditya Vignesh from Indium Software.
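Window functions and CTEs, called out above, are easiest to see in a concrete query. A self-contained sketch using Python's built-in sqlite3 (which supports both from SQLite 3.25 onward); the table and columns are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-05', 120.0),
        ('alice', '2024-02-10',  80.0),
        ('bob',   '2024-01-20', 200.0),
        ('bob',   '2024-03-02',  50.0);
""")

# CTE aggregates spend per customer-month; the window function then ranks
# months by spend within each customer.
query = """
WITH monthly AS (
    SELECT customer,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS month_total
    FROM orders
    GROUP BY customer, month
)
SELECT customer, month, month_total,
       RANK() OVER (PARTITION BY customer ORDER BY month_total DESC) AS spend_rank
FROM monthly
ORDER BY customer, spend_rank;
"""

for row in conn.execute(query):
    print(row)   # e.g. ('alice', '2024-01', 120.0, 1)
```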
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Get to know Okta
Okta is The World’s Identity Company. We free everyone to safely use any technology—anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we’re looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We’re building a world where Identity belongs to you.

The Opportunity:
Are you a data-driven storyteller with a passion for transforming raw information into actionable insights that drive tangible business outcomes? Do you thrive on collaborating directly with business stakeholders to understand their needs and then architecting elegant data solutions? If so, we have an exciting opportunity for a highly skilled and motivated Manager-level Business Analytics professional to join our growing team in Bangalore. In this role, you will be instrumental in empowering our business users with the data and visualizations they need to make informed decisions, analyze and drive improvements to our Finance operational performance, business decisions, and strategy. You will drive the analytics lifecycle, from initial consultation to the delivery of impactful dashboards and data sets. Please note - this role operates on PST, so you will work from 5pm - 2am IST. This role is hybrid, with the expectation of working from our local office on specified days based on local expectations. We will continuously assess this arrangement, and it may be subject to change based on business needs and evolving circumstances.

What You'll Do:
Strategic Alignment: Align analytics initiatives with key business objectives and contribute to the development of data-driven strategies that lead to measurable improvements. Become a Trusted Advisor: Partner closely with business users across various departments to understand their strategic objectives, identify their analytical requirements, and translate those needs into clear and actionable data and reporting solutions. Consultative Analysis: Engage with stakeholders to explore their business questions, guide them on appropriate analytical approaches, and help them define key metrics and performance indicators (KPIs). Data Architecture & Design: Partner with our Data and Insights team to design and develop robust and efficient data models and datasets optimized for visualization and analysis, ensuring data accuracy and integrity. Expert Tableau Development: Leverage your deep expertise in Tableau to create intuitive, interactive, and visually compelling dashboards and reports that effectively communicate key insights and trends. Data Wrangling & Transformation: Utilize Fivetran and/or Python scripting to extract, transform, and load data from various sources into our data warehouse or analytics platforms. End-to-End Ownership: Take full ownership of the analytics projects you lead, from initial scoping and data acquisition to dashboard deployment, user training, and ongoing maintenance. Drive Data Literacy: Educate and empower business users to effectively utilize dashboards and data insights to drive business outcomes. Stay Ahead of the Curve: Continuously explore new data visualization techniques, analytical methodologies, and data technologies to enhance our analytics capabilities.
Collaborate and Communicate: Effectively communicate complex analytical findings and recommendations to stakeholders. Lead cross-functional collaborations to achieve project goals. Data Governance & Quality: Ensure data accuracy, consistency, and integrity in all developed datasets and dashboards, contributing to data governance efforts. Performance Monitoring & Iteration: Monitor the performance and user adoption of developed dashboards, gather feedback, and implement necessary revisions for continuous improvement. Documentation & Training: Develop comprehensive documentation for created dashboards and datasets. Provide training and support to business users to ensure effective utilization of analytics tools. What You'll Bring: 5-7+ years of experience in a Business Analytics, Data Analytics, or similar role supporting Finance teams with increasing responsibility. Proven experience working directly with business stakeholders to understand their needs and deliver data-driven solutions. Expert-level proficiency in Tableau, including advanced calculations, parameters, actions, and performance optimization. Strong hands-on experience in building and optimizing data sets for Tableau. Solid experience with data integration tools, preferably Fivetran, and the ability to design and implement data pipelines. Proficiency in Python for data manipulation, cleaning, and transformation (e.g., using libraries like Pandas). Strong understanding of data warehousing principles and experience with platforms such as Snowflake or Redshift. Advanced SQL skills for data extraction, transformation, and querying. Excellent problem-solving and analytical skills with a strong attention to detail and the ability to translate business questions into analytical frameworks. Exceptional written and verbal communication skills, with the ability to present complex data insights effectively to both technical and non-technical audiences. Proven ability to manage multiple analytics projects simultaneously, prioritize tasks, and meet deadlines effectively. Excellent collaboration and interpersonal skills with the ability to build strong working relationships with business stakeholders. Bachelor's degree in a quantitative field such as Engineering, Finance/Accounting/ Business, Economics, Statistics, Mathematics, Computer Science, or a related discipline. (Master's degree a plus). Bonus Points For: Experience in SaaS software industry is highly preferred. Experience with other data visualization tools (e.g., Power BI, Looker). Familiarity with cloud-based data platforms (e.g., BigQuery). Familiarity with basic statistical concepts and methodologies is a plus. What you can look forward to as a Full-Time Okta employee! Amazing Benefits Making Social Impact Developing Talent and Fostering Connection + Community at Okta Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/. Some roles may require travel to one of our office locations for in-person onboarding. Okta is an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
DESCRIPTION NOC (Network Operations Center) is the central command and control center for ‘Transportation Execution’ across the Amazon Supply Chain network supporting multiple geographies like NA, India and EU. It ensures hassle free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FC) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon’s ability to serve its customers on time. Within NOC’s umbrella, resides a fast-growing Last Mile support function – AMZL CO (Amazon Logistics Central Operations). AMZL CO is a team focused on driving higher quality at lower cost through standard work leveraging central management of the network. Central Operations (CO) supports daily planning and execution functions that impact Delivery Station (DS) operations across the AMZL and EDSP/XPT network. CO aims to bring efficiencies to processes through standardization, programmatic interventions and automations that improve planning, scheduling and routing efficiencies, reduce cost and free up time for station operators to focus on operational work. We cover the following functional areas with global parity: (i) Central Allocation - removes operator judgement on channel allocation by planning via O-TREAT (4 week to 1 week ahead) & 24 hour forecasting based D-1 capacity adjustments, (ii) Centralized Routing and Scheduling (CRS) – executes block scheduling (1 week ahead, D-1 block release) and route planning (D-day) of on-road capacity centrally, (iii) CO Systems Management (COSM) - performs station jurisdiction and sector configurations via JAS (Jurisdiction Authority Service), and handles sort & route planning configurations, (iv) Driver Support (CO DS) – aims to streamline the delivery process for DSPs and drivers by coordinating rescues through global tools - Rescue Planner (RP) & Mission Control (MC) and, (v) providing channel support for DSP, Flex and Hub DP along with account and payment management – WST entry validation, invoicing and weather incentives. CO team embarked on the journey of becoming operations execution partner of NA and EU COs in Jun’21 with an immediate objective of leveraging people cost benefits through targeted offshoring and in the long term, standardizing AMZL CO processes and technology in NA and EU and RoW (Rest of World) countries to establish worldwide parity, providing a platform for knowledge sharing and building a hybrid structure for local innovation and speed to market while optimizing gearing ratios and cost structures. We named the broader program MARCOPOLO. Marcopolo Vision: NOC’s vision is to build a global Center of Excellence by being the prime provider of Last Mile Central Operations (CO) execution services to NA, EU and RoW marketplaces in next 3 years. 
This org will - 1) provide 24x7 coverage to all geographies, 2) leverage centralization at scale to optimize HC through improved Operator Utilization by unlocking synergies across time zones, 3) ensure at par or better SLA and quality by closely monitoring audit performance, 4) enable operational parity and standardization across workstreams and geographies, 5) leverage in-house automation team to automate manual execution, 6) work closely with in-country program and operations teams to provide inputs on large scale process improvement programs including hands-off-the-wheel automations, 7) support global expansion and standardization, leverage learnings and best practices across geographies and 8) facilitate joint OP request submission exercises to product and tech teams by incorporating use cases across geographies. Purview of a Trans Ops Specialist A Trans Ops Specialist at NOC facilitates flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impacts customer experience and business continuity. Trans Ops Specialist at NOC works across two verticals – Inbound and Outbound operations. Inbound Operations deals with Vendor/Carrier/FC relationship, ensuring that the freight is picked-up on time and is delivered at FC as per the given appointment. Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle of pick-up to delivery. Outbound Operations deals with FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to delivery customer orders as per promise. Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises. A Trans Ops Specialist provides timely resolution to the issue in hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements/be able to analyze data and notice trends and be able to drive Customer Experience without compromising on time. The candidate should have the basic understanding of Logistics and should be able to communicate clearly in the written and oral form. Trans Ops Specialist should be able to ideate process improvements and should have the zeal to drive them to conclusion. We are open to hiring candidates to work out of Hyderabad and willing to come to office all 5 working days of the week Key job responsibilities Communication with external customers (Carriers, Vendors/Suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers) Ability to pull data from numerous databases (using Excel, Access, SQL and/or other data management systems) and to perform ad hoc reporting and analysis as needed is a plus. Develop and/or understand performance metrics to assist with driving business results. Ability to scope out business and functional requirements for the Amazon technology teams who create and enhance the software systems and tools are used by NOC. Must be able to quickly understand the business impact of the trends and make decisions that make sense based on available data. Must be able to systematically escalate problems or variance in the information and data to the relevant owners and teams and follow through on the resolutions to ensure they are delivered. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed. 
Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and the variances to goals, and present these findings in a review forum. Providing real-time customer experience by working in a 24*7 operating environment.

About The Team
NOC (Network Operation Center) is the central command and control center for ‘Transportation Execution’ across the Amazon Supply Chain network. It ensures hassle free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FC) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving, we understand trends in network exceptions and automate processes or propose process changes to streamline operations involving network monitoring and significant analysis of network data.

BASIC QUALIFICATIONS
Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics

PREFERRED QUALIFICATIONS
Experience with Excel

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Company - Amazon Dev Center India - Hyderabad
Job ID: A2873972
Posted 2 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Charles Technologies is a dynamic startup based in Chennai, dedicated to creating innovative mobile applications that transform user experiences. We are looking for a talented and experienced MERN Stack Developer to join our team and lead the development of innovative web and mobile applications.

Qualifications:
Education: BE in Computer Science, Information Technology, or B.Tech in an IT-related field is required. A Master’s degree is a plus. Relevant certifications are also a plus.
Experience: Minimum of 2 years of total experience in full stack application development. Extensive experience working with startups, small teams, and in fast-paced environments is highly desirable.
Foundational Knowledge: Strong understanding of software engineering principles, product development, and web/mobile application development best practices.

Technical Skills:
JavaScript: Expert-level proficiency in JavaScript, including ES6+ features, asynchronous programming, and modern frameworks.
React Native: Extensive experience in developing cross-platform mobile applications using React Native, including performance optimization and native module integration.
React: Advanced expertise in React for front-end development, including hooks, context API, state management libraries like Redux, and component lifecycle management.
Node.js: Solid knowledge of Node.js for backend development, including experience with Express.js, RESTful API design, and asynchronous programming patterns.
Azure Cosmos DB: Extensive experience with Azure Cosmos DB for scalable and efficient data management, including partitioning, indexing, querying, and performance tuning.
Azure Cloud Services: Proficiency in deploying and managing applications on Azure Cloud Services, including Azure App Services, Azure Functions, Azure Storage, and monitoring tools.
Git: Proficient in version control systems like Git, including branching, merging strategies, pull request workflows, and conflict resolution.
Azure DevOps: Experience with Azure DevOps for CI/CD pipelines, project management, automated testing, and release management.
API Integration: Experience in integrating RESTful APIs and third-party services, including OAuth, JWT, and other authentication and authorization mechanisms.
UI/UX Design: Understanding of UI/UX design principles and ability to collaborate with designers to implement responsive, accessible, and user-friendly interfaces.

Responsibilities:
Full Stack Development: Develop and maintain high-quality web and mobile applications using React Native, React, and Node.js, ensuring code quality, performance, and scalability.
Backend Development: Implement backend services and APIs using Node.js, ensuring scalability, security, and maintainability.
Database Management: Manage and optimize databases using Azure Cosmos DB, including data modelling, indexing, partitioning, and performance tuning.
Version Control: Use Git for version control, including branching, merging, and pull request workflows. Conduct peer code reviews to ensure code quality and share knowledge with team members.
CI/CD Pipelines: Set up and maintain CI/CD pipelines using Azure DevOps, including automated testing, deployment, monitoring, and rollback strategies.
Peer Code Reviews: Participate in peer code reviews to ensure adherence to coding standards, identify potential issues, and share best practices.
Performance Optimization: Optimize application performance and ensure responsiveness across different devices and platforms, including profiling, debugging, and performance tuning.
Collaboration: Work closely with designers, product owners, and other developers to deliver high-quality applications. Participate in agile development processes, including sprint planning, stand-ups, and retrospectives.
Testing and Debugging: Conduct thorough testing and debugging to ensure the reliability and stability of applications, including unit testing, integration testing, and end-to-end testing.
Documentation: Create and maintain comprehensive documentation for code, APIs, and development processes, including technical specifications and user guides.
Continuous Improvement: Stay updated with the latest industry trends and technologies, and continuously improve development practices. Participate in knowledge-sharing sessions and contribute to the growth of the team.

Perks & Benefits:
Central Location: Conveniently located in the heart of the city, with parking facilities and well-served by public transport including buses and Chennai Metro.
Meals and Refreshments: Lunch, tea/coffee, snacks, and refreshments provided throughout the day.
Insurance: TATA AIG Family Group Insurance for INR 5.0 Lakhs (Coverage: Self + Spouse + Up to 3 Children).
Professional Development: Opportunities for continuous learning and growth.
Team Outings and Events: Regular team-building activities and events.
Employee Recognition: Programs to acknowledge and reward outstanding performance.

How to Apply: Interested candidates can apply through LinkedIn or email us at careers@charles-technologies.com. Join us at Charles Technologies and be a part of a team that is shaping the future of mobile applications!
Posted 2 weeks ago
45.0 years
5 - 8 Lacs
Hyderābād
On-site
India - Hyderabad
JOB ID: R-213123
LOCATION: India - Hyderabad
WORK LOCATION TYPE: On Site
DATE POSTED: Apr. 29, 2025
CATEGORY: Information Systems

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 45 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today.

ABOUT THE ROLE
Role Description:
We are seeking a highly skilled and experienced Test Automation Engineering Manager to lead our automation team. The ideal candidate will have expertise in data automation, especially with Databricks and AWS, and be skilled in search-related programs, data catalogs, and UI validation. You will play a pivotal role in shaping the quality and reliability of complex, search-driven applications that handle large-scale data ingestion and real-time querying. This is a highly hands-on leadership role, ideal for someone who enjoys diving deep into technical challenges while also mentoring and guiding QA strategies at scale.

You will be responsible for defining and executing end-to-end test strategies, from backend content crawling, document indexing, and API interaction to UI presentation and search experience. You'll work closely with cross-functional teams including backend engineers, frontend developers, data engineers, DevOps, and product owners, ensuring that all components of the system—from data ingestion (via Java-based crawlers and S3 document pipelines) to frontend search display (built on React and GraphQL)—function seamlessly and perform reliably under real-world loads. In this role, you are expected to be a quality champion, not just ensuring functional correctness but also owning performance, usability, and scalability aspects of search testing. You’ll be at the intersection of search technology, cloud platforms, and UI/UX, driving excellence through hands-on implementation and strategic leadership.

Roles & Responsibilities:

Hands-On Testing & Automation
Design, implement, and maintain comprehensive test strategies across UI, backend, and data layers of search-driven platforms.
Perform hands-on testing of React-based UIs integrated with GraphQL APIs, ensuring a seamless and accurate search experience for end-users.
Develop and maintain automated test suites using tools like Cypress, Playwright, or Selenium, integrated into CI/CD pipelines.
Create robust GraphQL API test scenarios to validate search results, metadata mapping, and performance under various data loads.

Search Engine & Data Flow Testing
Validate integration of custom search engines (e.g., GCP Search Engine) with frontend interfaces.
Test and ensure end-to-end search result accuracy—from Java-based web crawlers and S3 document ingestion through to the frontend UI.
Verify the ingestion, parsing, indexing, and retrieval accuracy of documents stored in Amazon S3, including testing of content structure, metadata extraction, and search visibility.
Collaborate with developers to test the effectiveness and coverage of Java crawlers, including content freshness, crawl depth, and data completeness.

Technical Leadership, Strategy & Team Collaboration
Define and drive the overall QA and testing strategy for UI and search-related components with a focus on scalability, reliability, and performance.
Contribute to system architecture and design discussions, bringing a strong quality and testability lens early into the development lifecycle.
Lead test automation initiatives, introducing best practices and frameworks that align with modern DevOps and CI/CD environments.
Mentor and guide QA engineers, fostering a collaborative, growth-oriented culture focused on continuous learning and technical excellence.
Collaborate cross-functionally with product managers, developers, and DevOps to align quality efforts with business goals and release timelines.
Conduct code reviews, test plan reviews, and pair-testing sessions to ensure team-level consistency and high-quality standards.

Monitoring, Metrics & Continuous Improvement
Define and track key quality metrics such as search accuracy, indexing delays, UI responsiveness, and test and automation coverage to ensure high product quality.
Drive continuous improvement initiatives by identifying process gaps, enhancing test tools, and evolving testing practices, strategies, and frameworks based on production feedback.
Participate in production validations and incident reviews, and apply learnings to build more resilient systems.
Ensure robust release readiness by conducting risk assessments, regression testing, and cross-functional validation across the release cycle.
Collaborate with DevOps to maintain reliable CI/CD pipelines that support automated testing, fast feedback, and post-release monitoring.

Good-to-Have Skills:
Familiarity with distributed systems, databases, and large-scale system architectures.
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Knowledge of search-related programming and algorithms.
Experience working with agile testing methodologies such as Scaled Agile.

Must-Have Skills:
10–14 years of QA experience with a strong focus on frontend, backend, and data-centric application testing.
Hands-on experience with UI testing of modern frontend applications built in React.js.
Strong knowledge of GraphQL APIs, including schema validation, query testing, and performance benchmarking.
Proven experience testing custom search engine implementations, preferably on Google Cloud Platform (GCP) or similar.
Deep understanding of document ingestion pipelines and metadata validation using Amazon S3 or other object stores.
Familiarity with Java-based web crawlers (e.g., Apache Nutch or in-house frameworks), testing content coverage, freshness, and crawl performance.
Proficiency in test automation tools such as Cypress, Playwright, or Selenium, including scripting and CI/CD integration.
Experience with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI for integrating test automation into release pipelines.
Strong skills in debugging, log analysis, and issue triaging across distributed systems.
Excellent communication skills with the ability to collaborate cross-functionally and lead QA efforts within agile teams.

Education and Professional Certifications
Bachelor’s degree in Computer Science and Engineering preferred; other engineering fields considered. Master’s degree and 6+ years’ experience.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
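For context on the kind of GraphQL search validation this posting describes, here is a minimal, self-contained sketch in Python. The endpoint URL, query shape, and field names are assumptions for illustration only, not the actual schema under test; a real suite would typically live in Cypress, Playwright, or Selenium as the posting notes, but the assertion pattern is the same.

```python
# Minimal sketch of a GraphQL search-API check of the kind described above.
# The endpoint URL, query fields, and search term are assumptions for
# illustration only -- adapt them to the actual schema under test.
import requests

GRAPHQL_URL = "https://search.example.com/graphql"  # hypothetical endpoint

SEARCH_QUERY = """
query Search($term: String!) {
  search(term: $term) {
    totalCount
    results { id title sourceUri }
  }
}
"""

def test_search_returns_indexed_document():
    """Assert that an ingested document is retrievable via the search API."""
    response = requests.post(
        GRAPHQL_URL,
        json={"query": SEARCH_QUERY, "variables": {"term": "oncology"}},
        timeout=30,
    )
    response.raise_for_status()
    payload = response.json()

    assert "errors" not in payload, payload.get("errors")
    search = payload["data"]["search"]
    assert search["totalCount"] > 0
    # Metadata-mapping check: every hit should carry the fields the UI renders.
    for hit in search["results"]:
        assert hit["id"] and hit["title"] and hit["sourceUri"]

if __name__ == "__main__":
    test_search_returns_indexed_document()
    print("search smoke test passed")
```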
Posted 2 weeks ago
2.0 years
1 - 10 Lacs
Hyderābād
On-site
JOB DESCRIPTION
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within Corporate Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
Executes standard software solutions, design, development, and technical troubleshooting
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 2+ years applied experience
Demonstrable ability to code in Java
Experience with the Spring framework (Core, REST API, web services, messaging), including Spring Boot and microservice architecture
Experience with data storage solutions, including SQL and NoSQL databases (Cassandra), data lakes, and S3
Familiarity with logging and monitoring tools such as Kibana, Splunk, Elasticsearch, Dynatrace, AppDynamics, Grafana, CloudWatch, and Datadog
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Experience across the whole Software Development Life Cycle
Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security
Excellent communication skills, with the ability to collaborate effectively with data scientists, engineers, and other stakeholders, and a willingness to stay updated with the latest trends in ML and MLOps
Preferred qualifications, capabilities, and skills
Knowledge in Python and Scala
Exposure to cloud technologies
Relevant certifications in cloud platforms (e.g., AWS Machine Learning, DevOps, Certified Kubernetes Administrator/Developer)
Posted 2 weeks ago
3.0 years
1 - 10 Lacs
Hyderābād
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
Contributes to software engineering communities of practice and events that explore new and emerging technologies
Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 3+ years applied experience
Hands-on practical experience in system design, application development, testing, and operational stability
Proficient in coding in PySpark, data warehousing, and data engineering
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Overall knowledge of the Software Development Life Cycle
Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
Familiarity with modern front-end technologies
Exposure to cloud technologies
ETL, AWS, Databricks
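As an illustration of the PySpark and data-engineering skills this posting lists, here is a minimal sketch, assuming a working Spark installation; the S3 paths and column names are invented for illustration and are not part of the posting.

```python
# Minimal PySpark sketch: read raw transactions, cleanse, aggregate, and write
# partitioned Parquet. Paths and columns (account_id, amount, txn_date) are
# hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-aggregation").getOrCreate()

# Extract: read raw data (path is illustrative).
txns = spark.read.csv(
    "s3a://example-bucket/transactions/*.csv", header=True, inferSchema=True
)

# Transform: drop bad rows and aggregate per account per day.
daily = (
    txns.filter(F.col("amount").isNotNull())
        .withColumn("txn_date", F.to_date("txn_date"))
        .groupBy("account_id", "txn_date")
        .agg(
            F.sum("amount").alias("daily_total"),
            F.count("*").alias("txn_count"),
        )
)

# Load: write partitioned Parquet for downstream warehouse consumption.
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3a://example-bucket/curated/daily_totals/"
)

spark.stop()
```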
Posted 2 weeks ago
2.0 years
8 - 9 Lacs
Hyderābād
On-site
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within Corporate Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
Executes standard software solutions, design, development, and technical troubleshooting
Builds pipelines in Spark and tunes Spark queries
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Stays up to date with the latest advancements in GenAI and LLM technologies and incorporates them into our data engineering practices
Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 2+ years applied experience
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Background with machine learning frameworks and big data technologies such as Hadoop
Strong experience in programming languages such as Java or Python
Experience with the Python machine learning library ecosystem (Pandas, NumPy, etc.)
Experience with cloud technologies such as AWS or Azure
Experience working with databases such as Cassandra, MongoDB, or Teradata
Experience across the whole Software Development Life Cycle
Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security
Experience with Generative AI and Large Language Models, and experience integrating these technologies into data workflows
Preferred qualifications, capabilities, and skills
Familiarity with modern front-end technologies
Exposure to cloud technologies
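A small sketch of the Pandas/NumPy-style data preparation mentioned above, assuming a hypothetical events.csv with amount, channel, and label columns; it is illustrative only and not part of the posting.

```python
# Pandas/NumPy data-preparation sketch. The CSV path and column names
# (amount, channel, label) are assumptions for illustration.
import numpy as np
import pandas as pd

df = pd.read_csv("events.csv")  # hypothetical input file

# Basic cleansing: drop exact duplicates and rows missing the label.
df = df.drop_duplicates().dropna(subset=["label"])

# Simple feature engineering: log-scale a skewed numeric column and
# one-hot encode a categorical column.
df["amount_log"] = np.log1p(df["amount"].clip(lower=0))
df = pd.get_dummies(df, columns=["channel"], prefix="channel")

# Hold out a test split for downstream model evaluation.
train = df.sample(frac=0.8, random_state=42)
test = df.drop(train.index)
print(f"train={len(train)} rows, test={len(test)} rows")
```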
Posted 2 weeks ago
0.0 - 3.0 years
4 - 8 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-212785 LOCATION: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Apr. 28, 2025 CATEGORY: Information Systems
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. The role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.
Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes
Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems
Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements
Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors
Identify and resolve technical challenges effectively
Stay updated with the latest trends and advancements
Work closely with the product team, business team, and other stakeholders
What we expect of you
Bachelor’s degree and 0 to 3 years of Computer Science, IT, or related field experience OR Diploma and 4 to 7 years of Computer Science, IT, or related field experience
Basic Qualifications:
Expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging
Proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes
Understanding of data modeling concepts, various schemas, and normalization
Strong understanding of software development methodologies, including Agile and Scrum
Experience working in a DevOps environment, which involves designing, developing, and maintaining software applications and solutions that meet business needs
Preferred Qualifications:
Expertise in Informatica PowerCenter or Informatica Cloud for data integration and ETL development
Professional Certifications:
SAFe® for Teams certification (preferred)
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Shift Information:
This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
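To make the ETL steps in this posting concrete, here is a generic extract-transform-load sketch in Python. The posting centres on Informatica tooling; this standalone example only illustrates the extract, transform, and load pattern itself, with an invented CSV layout and a local SQLite target.

```python
# Generic ETL sketch. The CSV layout, column names, and the SQLite target
# are assumptions for illustration; they do not reflect any real pipeline.
import sqlite3

import pandas as pd

# Extract: pull raw records from a flat file.
raw = pd.read_csv("patients_raw.csv")  # hypothetical source file

# Transform: cleanse, standardize, and enrich.
raw["country"] = raw["country"].str.strip().str.upper()
raw = raw.dropna(subset=["patient_id"]).drop_duplicates(subset=["patient_id"])
raw["enrolled_year"] = pd.to_datetime(raw["enrolled_on"]).dt.year

# Load: write the conformed table into the target database.
with sqlite3.connect("warehouse.db") as conn:
    raw.to_sql("dim_patient", conn, if_exists="replace", index=False)
    count = conn.execute("SELECT COUNT(*) FROM dim_patient").fetchone()[0]
    print(f"loaded {count} rows into dim_patient")
```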
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
Gurgaon
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.
Grade - T5
Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.
What your main responsibilities are:
Accountabilities:
Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data
Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms
Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes
Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations
Qualifications & Specifications:
Master’s/Bachelor’s degree in Engineering, Computer Science, Math, Statistics, or equivalent
Strong programming skills in Python/PySpark/SAS
Proven experience with large data sets and related technologies – Hadoop, Hive, distributed computing systems, Spark optimization
Experience on cloud platforms (preferably Azure) and its services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps
Hands-on experience with Databricks, Delta Lake, Workflows
Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, Octopus
Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs
Experience with any BI tool like Power BI (good to have)
Cloud migration experience (good to have)
Cloud and data engineering certification (good to have)
Working in an Agile environment
4-6 years of relevant work experience is required
Experience with stakeholder management is an added advantage
What we are looking for
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred.
Knowledge, Skills and Abilities
Fluency in English
Analytical Skills
Accuracy & Attention to Detail
Numerical Skills
Planning & Organizing Skills
Presentation Skills
Data Modeling and Database Design
ETL (Extract, Transform, Load) Skills
Programming Skills
FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Clarivate is a global leader in providing trusted insights and analytics to accelerate the pace of innovation. Our vision is to improve the way the world creates, protects, and advances innovation. To achieve this, we deliver critical data, information, workflow solutions and deep domain expertise to innovators everywhere. We are a trusted, indispensable global partner to our customers, including universities, non-profits, funding organizations, publishers, corporations, government organizations and law firms.
Within the Content Technology branch of the business unit, your role is to work with and support the production of high-quality trademark data that is used for a variety of products and services. In collaboration with our experts, you will acquire profound knowledge of data structures and content. Main tasks will be data analysis, mappings, and data testing in the context of bug fixing and quality assurance. This requires intense communication with internal and external partners alongside proper documentation. A very conscientious and accurate work style, as well as reliability and a strong commitment to quality, are crucial.
We are looking for a Solutions Analyst to join our Trademark Content team in Chennai/Bangalore. This is an amazing opportunity to work on Product/Brand. The team consists of 10 colleagues and reports to the Manager, Product Management.
About You – Experience, Education, Skills, And Accomplishments
Job/IT related education or certification, or equivalent professional experience
Proven experience in data analysis
Ability to independently query relational and NoSQL databases
Ability to design JSON records and schemas
Ability to find patterns in large amounts of unstructured data using Python or a similar scripting language
Adept at MS Excel (advanced) and other MS Office suite applications
Working knowledge of Windows and Unix file systems
Strong problem-solving skills and analytical thinking
High attention to detail and accuracy, and a very systematic work style
Eagerness to permanently improve content and data structure knowledge and technical skills, partly by means of self-education
Skills and experience are desirable in these fields: AWS Cloud Computing; Unix (Linux) as well as common Unix tools like pattern matching and regular expressions
Good English speaking and writing skills
Proven ability to work both independently and as a member of a team
It would be great if you also had:
Knowledge of intellectual property, the trademark filing process, and the usage of trademark information
XML, DTDs, XML Schema, XSLT
What will you be doing in this role?
Analysis of trademark data and underlying processes
Data design, development and maintenance of XML schemas and DTDs
Data analysis of large amounts of structured and unstructured data
Querying relational and NoSQL (MongoDB) databases
Data design, development and maintenance of JSON and JSON Schema.
Assistance in authoring mapping specifications
Standardization of data and persistent quality assurance and improvement
Perform tests and develop a set of criteria/roadmap for routine tests to assess data accuracy and quality
Active cooperation in designing, building up, and maintaining a collection of representative sample documents
Help with process analysis and with improving and optimizing workflows
Provide content support for diverse products
Close collaboration with local colleagues as well as contributors at international locations
Collaboration with software engineering and quality assurance teams for data delivery
Monitoring of trademark offices and data suppliers’ sites regarding data and content, relevant news, and changes
Hours of Work
8 hours per day (full-time); workdays: Monday to Friday
At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.
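Two of the tasks above — finding patterns in unstructured data and querying a NoSQL store — can be illustrated with a short Python sketch. The application-number format, the collection and field names, and the local MongoDB connection string are assumptions for illustration only.

```python
# Sketch of two tasks named above: pattern matching over free text with a
# regular expression, and querying a MongoDB collection. The TM-number
# format, database/collection names, and connection string are assumptions.
import re

from pymongo import MongoClient

# Pattern matching: extract application numbers of a hypothetical format
# like "TM-2024-123456" from unstructured text.
TM_NUMBER = re.compile(r"\bTM-\d{4}-\d{6}\b")

def extract_tm_numbers(text: str) -> list[str]:
    return TM_NUMBER.findall(text)

sample = "Renewal filed for TM-2024-000123; see also TM-2023-987654."
print(extract_tm_numbers(sample))

# NoSQL querying: fetch a few registered marks, projecting only the mark text.
client = MongoClient("mongodb://localhost:27017")  # assumed local instance
records = client["trademarks"]["records"]
for doc in records.find({"status": "REGISTERED"}, {"_id": 0, "mark_text": 1}).limit(5):
    print(doc)
```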
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Pinnacle
Pinnacle Group is a premier staffing and workforce solutions provider, delivering innovative services to clients worldwide. Leveraging multiple enterprise-scale implementations, including Salesforce and Conga, we optimize legal and talent management processes to drive efficiency and client success, empowering our team to shape the future of staffing solutions.
Summary
Join Pinnacle Group’s Technology and Platform Services team to configure and optimize Salesforce CRM and Conga CLM platforms for our corporate operations teams. As a Salesforce Developer, you will administer, configure, and develop Salesforce for custom CRM business needs in sales, vendor management, and contract management, collaborating with stakeholders inside and outside the business.
Responsibilities
Configure Salesforce CRM with custom objects, fields, and security settings for multiple teams across the business.
Configure Conga CLM on the Salesforce platform for contract creation, workflows, security, and reporting for legal operations.
Build automations using Salesforce Flow to streamline CRM and CLM processes (e.g., contract approvals, sales outreach, compliance requirements).
Write SOQL queries to support reporting and data validation.
Manage user accounts, roles, permission sets, and sharing rules.
Create and customize reports and dashboards for business insights.
Collaborate with stakeholders to identify configuration needs, propose enhancements, and create a comprehensive development plan.
Requirements
3+ years of experience in Salesforce administration, including configuration, security, and workflows.
1+ years of experience with Conga CLM administration, including workflows and reporting.
Proficiency in Salesforce Flow for building automations and process optimization.
Experience with SOQL for querying data and building reports.
Strong understanding of Salesforce security models (profiles, permission sets, sharing rules).
Salesforce Certified Administrator required; Conga CLM Administrator or App Builder preferred.
Excellent communication and stakeholder management skills for on-site collaboration.
Bachelor’s degree in Computer Science, IT, or a related field (or equivalent experience).
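For readers unfamiliar with SOQL, here is a hedged sketch of the kind of reporting query this role involves, run from Python via the simple_salesforce package. The credentials are placeholders, and the status filter is only an example against the standard Contract object; in practice such queries would more often run inside Salesforce tooling itself.

```python
# Illustrative SOQL query against the standard Contract object using the
# simple_salesforce package. Credentials are placeholders; field and status
# values are examples, not a specification of any real org.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@example.com",
    password="********",
    security_token="********",
)

# Contracts awaiting approval, newest first -- e.g., for a review dashboard.
soql = """
    SELECT Id, ContractNumber, Status, StartDate
    FROM Contract
    WHERE Status = 'In Approval Process'
    ORDER BY CreatedDate DESC
    LIMIT 50
"""
result = sf.query(soql)
for record in result["records"]:
    print(record["ContractNumber"], record["Status"], record["StartDate"])
```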
Posted 2 weeks ago
4.0 years
5 - 7 Lacs
Thiruvananthapuram
On-site
Experience: 4-7 years
Job Summary
The candidate should be proficient in .NET Core (Web API, Swagger, EF, ADO.NET), SQL Server (functions, CTEs, indexing strategies, query optimization, and debugging and optimizing complex stored procedures), and Angular, with practical experience in Azure. This includes hands-on experience with pipeline setup, YAML- and Bicep-based deployment, Dockerization, Azure Container Apps, Azure Event Grid, and querying application telemetry from both Application Insights and Log Analytics. Additionally, knowledge of NoSQL databases would be an added advantage. Good technical communication skills are essential, and the candidate should be capable of working independently with the client.
Key Responsibilities
Application Design: Participate in feature discussions, story planning, and the development of scalable, high-performance applications using .NET and Azure.
Code Quality & Reviews: Write clean, maintainable, and well-documented code. Conduct and participate in peer code reviews to ensure adherence to coding standards, performance, and security practices.
Stakeholder Collaboration: Work closely with product owners, business analysts, QA engineers, and other stakeholders to translate business requirements into technical specifications and deliver quality features.
Legacy Modernization: Lead or support initiatives to migrate and modernize legacy applications to Azure cloud platforms, including planning, implementation, and optimization.
Technology Stack
Languages & Frameworks: C#, .NET Core (5.0+), Angular (12+)
Frontend Technologies: HTML5, CSS3, JavaScript, TypeScript, Angular
Databases: SQL Server (2019+), Entity Framework, MongoDB, Cosmos DB, OpenSearch/Elasticsearch
Cloud & Infrastructure: Microsoft Azure: Azure Storage, Azure DevOps, Load Balancing, Monitoring, Containerization (Docker/Kubernetes)
DevOps & CI/CD Tools: Git, Azure DevOps Pipelines, GitHub Actions
Security: Single Sign-On (SSO), Identity Server, JWT
Other Tools & Technologies: RabbitMQ, Redis, IIS, PowerShell, Swagger, SSIS
Testing & QA: xUnit, NUnit, Selenium
Posted 2 weeks ago
6.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Experience : 6.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: dBT tool, DBT, GCP, Data Warehousing, SQL, Analytical Skills, Snowflake, Amazon Redshift, Affiliate Marketing Forbes Advisor is Looking for: Senior Data Warehouse Analyst Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. Forbes Marketplace is seeking a highly skilled Senior Data Warehouse Analyst to play a critical role in the development, maintenance, and optimization of our enterprise data warehouse. As a Senior Analyst, you will be responsible for understanding business requirements, translating them into technical specifications, and working closely with data warehouse engineers to ensure the data warehouse effectively supports our reporting and analytical needs. You will also be involved in data quality initiatives, performance analysis, and providing insights to stakeholders. Responsibilities: Collaborate with business users, analysts, and other stakeholders to understand their data and reporting requirements. Translate business needs into clear and concise functional and technical specifications for data warehouse development. Analyze source system data to understand its structure, quality and potential for integration into the data warehouse. Work closely with developers to contribute to the design and implementation of logical and physical data models. Closely integrate with Quality Analysts to identify and troubleshoot data quality issues within the data warehouse. Assist in performance tuning efforts for SQL queries and data retrieval processes for business stakeholders. Participate in data quality initiatives, including defining data quality rules and monitoring data accuracy. Adhere to data governance/security policies and procedures. Assist in the creation and maintenance of data dictionaries and documentation. Effectively communicate technical concepts to both technical and non-technical audiences. Collaborate with data engineers, BI developers, and other team members to deliver data solutions. Stay up-to-date with the latest trends and technologies in data warehousing and business intelligence. Qualifications: Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field. 6-8 years of data experience with 3-5 years of experience as a Data Warehouse Analyst or a similar role. Strong proficiency in SQL and experience querying large datasets across various database platforms (e.g., GCP, Snowflake, Redshift). Solid understanding of data warehousing concepts, principles, and methodologies (e.g., dimensional modeling, star schema). Good understanding on Affiliate Marketing Data (GA4, Paid marketing channels like Google Ads, Facebook Ads, etc - the more the better). Experience working with ETL/ELT processes and tools. Basic DBT understanding. Hands-on experience with at least one major business intelligence and data visualization tool (e.g., Tableau, Power BI, Looker). Excellent analytical, problem-solving, and data interpretation skills. Strong communication (written and verbal) and presentation skills. 
Ability to work independently and as part of a collaborative team.
Detail-oriented with a strong focus on data accuracy and quality.
Experience with cloud-based data warehousing platforms (e.g., AWS, GCP).
Perks:
Day off on the 3rd Friday of every month (one long weekend each month).
Monthly Wellness Reimbursement Program to promote health and well-being.
Monthly Office Commutation Reimbursement Program.
Paid paternity and maternity leave.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
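As a small illustration of the dimensional-modeling and SQL skills this role calls for, the sketch below builds a toy star schema in SQLite and runs a fact-to-dimension aggregate. The table and column names are invented, and the same query shape applies on warehouse platforms such as GCP, Snowflake, or Redshift.

```python
# Toy star-schema illustration: one dimension table, one fact table, and the
# canonical join-plus-aggregate query. Names and values are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, channel_name TEXT);
    CREATE TABLE fact_clicks (
        click_date TEXT, channel_id INTEGER, clicks INTEGER, revenue REAL
    );
    INSERT INTO dim_channel VALUES (1, 'Google Ads'), (2, 'Facebook Ads');
    INSERT INTO fact_clicks VALUES
        ('2025-04-01', 1, 1200, 340.0),
        ('2025-04-01', 2,  800, 210.5),
        ('2025-04-02', 1,  950, 275.0);
""")

# Fact-to-dimension join with an aggregate, the core star-schema query shape.
rows = conn.execute("""
    SELECT d.channel_name,
           SUM(f.clicks) AS clicks,
           ROUND(SUM(f.revenue), 2) AS revenue
    FROM fact_clicks f
    JOIN dim_channel d ON d.channel_id = f.channel_id
    GROUP BY d.channel_name
    ORDER BY revenue DESC
""").fetchall()

for name, clicks, revenue in rows:
    print(f"{name}: {clicks} clicks, {revenue} revenue")
```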
Posted 2 weeks ago
2.0 years
5 - 10 Lacs
India
On-site
Key Responsibilities:
Application Development: Design and develop enterprise applications using the Joget platform, ensuring robust, scalable, and user-friendly solutions.
Customization: Customize Joget forms, workflows, plugins, and UI components to meet business requirements.
Process Automation: Analyze and implement business process automation workflows, enhancing operational efficiency and reducing manual effort.
Integration: Integrate Joget applications with third-party systems, APIs, and enterprise tools to enable seamless data exchange.
Performance Optimization: Optimize Joget applications for performance, scalability, and security.
Collaboration: Work closely with business analysts, project managers, and other stakeholders to gather and refine requirements.
Testing & Debugging: Conduct thorough testing, troubleshooting, and debugging to ensure application stability and quality.
Documentation: Maintain comprehensive technical documentation for all development activities.
Mentorship: Provide guidance and mentorship to junior developers as needed.
Core Technical Skills:
Joget Platform Expertise - Proficiency in the Joget Workflow platform for designing and developing forms, workflows, data lists, and user views. Experience in creating and managing custom Joget plugins. Expertise in workflow automation and process configuration. Knowledge of Joget’s built-in components, templates, and modular features.
Programming and Development - Strong knowledge of Java for back-end customizations and plugin development. Proficiency in JavaScript, HTML, and CSS for front-end customizations. Experience in SQL for database querying and management. Familiarity with XML and JSON for data handling.
Integration and APIs - Hands-on experience integrating Joget applications with third-party systems using REST and SOAP APIs. Knowledge of OAuth, JWT, and other authentication mechanisms for secure integrations. Experience in handling data exchange between Joget and external systems.
Database Management - Proficiency in relational databases such as MySQL, PostgreSQL, or Oracle. Experience in writing and optimizing complex SQL queries. Knowledge of database performance tuning and troubleshooting.
Deployment and Infrastructure - Familiarity with cloud platforms like AWS, Azure, or Google Cloud for Joget deployment. Experience with Docker or other containerization tools for application hosting. Joget deployment on multiple operating systems and databases. Knowledge of CI/CD pipelines and deployment automation using tools like Jenkins or GitHub Actions.
Debugging and Performance Optimization - Strong skills in troubleshooting Joget applications to identify and resolve issues. Experience in performance optimization of Joget workflows and UI components. Familiarity with Joget’s logging and monitoring tools for system analysis.
Security - Understanding of application security best practices, including data encryption, role-based access control, and user authentication. Familiarity with secure coding practices and compliance standards.
Job Type: Full-time
Pay: ₹500,000.00 - ₹1,000,000.00 per year
Benefits: Flexible schedule, Health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Mohali district, Punjab: Reliably commute or planning to relocate before starting work (Required)
Experience: Joget: 2 years (Required)
Work Location: In person
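To illustrate the third-party integration pattern described in this posting, here is a minimal sketch that pushes a record to a Joget-hosted form over REST from Python. The server URL, endpoint path, header names, and form fields are assumptions for illustration; the real values depend entirely on how the Joget app and its API access are configured.

```python
# Sketch of an external-system-to-Joget integration over REST. Every URL,
# header, and field below is a placeholder; consult the target Joget app's
# own API configuration for the real endpoint and authentication scheme.
import requests

JOGET_BASE = "https://joget.example.com/jw"          # hypothetical server
ENDPOINT = f"{JOGET_BASE}/api/form/purchaseRequest"  # hypothetical form API

payload = {
    "requestTitle": "Laptop replacement",
    "requestedBy": "jdoe",
    "amount": "1450.00",
}

response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"api_id": "API-PLACEHOLDER", "api_key": "KEY-PLACEHOLDER"},
    timeout=30,
)
response.raise_for_status()
print("Joget responded:", response.status_code, response.json())
```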
Posted 2 weeks ago
The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the querying domain, a typical career progression may look like:
- Junior Querying Analyst
- Querying Specialist
- Senior Querying Consultant
- Querying Team Lead
- Querying Manager
Apart from strong querying skills, professionals in this field are often expected to have expertise in:
- Database management
- Data visualization tools
- SQL optimization techniques
- Data warehousing concepts
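One of the skills listed above, SQL optimization, can be shown with a tiny self-contained example: the snippet below uses SQLite (so it runs anywhere) to compare the query plan for a filtered aggregate before and after adding an index on the filter column. The table and data are invented purely for illustration.

```python
# Indexing as an SQL optimization technique: compare query plans before and
# after creating an index. Table, columns, and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 1000, float(i)) for i in range(100_000)],
)

query = "SELECT COUNT(*), SUM(total) FROM orders WHERE customer_id = 42"

# Without an index the planner scans the whole table...
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# ...after indexing the filter column it can seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

print(conn.execute(query).fetchone())
```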
As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!