1697 Querying Jobs - Page 38

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About the Digital Business Unit at IndusInd: The mandate of the Digital Business Unit at IndusInd Bank is as follows:
· Build customer-centric products with human-centered design principles for retail individual and micro, small and medium enterprise (MSME) customer segments
· Build innovative products and propositions, backed by a problem-solving mindset, to discover and solve latent customer needs
· Build Embedded Finance (Banking as a Service) applications
· Ensure designs are highly available, highly modular, highly scalable and highly secure
· Drive digital business

Some of the applications managed by the Digital Business Unit include IndusMobile (the bank's mobile app for retail individual clients), IndusNet (the bank's net banking application), the IndusMerchantSolutions app, WhatsApp Banking, chatbots, easycredit for Individuals, easycredit for Business Owners, and online platforms for savings accounts, current accounts and fixed deposits. Many more innovative digital products and solutions are in the pipeline. The unit's objectives are threefold: (a) drive better customer experience and engagement, (b) transform existing lines of business, and (c) build new digital-only or Banking-as-a-Service-led digital business models.

About the role: You would be part of the asset analytics and data science team and work on cutting-edge problems for the bank. You will work closely with stakeholders across risk, business, partnerships, digital and strategy to create and refine strategies that augment profitability and growth for the bank. You will be primarily responsible for producing data-driven, actionable insights and presenting them to relevant stakeholders, working in close collaboration with the digital product, growth, and marketing teams.

Overall Job Description:
· Experience querying databases and using statistical computing languages: R, Python, SQL, etc.
· Use predictive modelling to increase and optimize customer experience, revenue generation, ad targeting and other business outcomes.
· Experienced in working with large and multiple datasets and data warehouses, with the ability to pull data using relevant programs and coding.
· Well versed in the necessary data preprocessing and feature engineering skills.
· Strong background in statistical analysis; constantly research ML algorithms and data sources for better prediction.
· Work and coordinate with multiple stakeholders to identify opportunities for leveraging company data to drive business solutions, implement models and monitor outcomes.
· Assess the effectiveness and accuracy of new data sources and data-gathering techniques, and develop processes and tools to monitor and analyze model performance and data accuracy.
· Experience in establishing and scaling up data science functions.
· Proven ability to discover solutions hidden in large datasets and to drive business results with data-based insights.
· Leverage analytics to increase customer lifetime value for digitally acquired clients by pitching the right product to the right client at the right time.
· Help define pricing models for digital value propositions for various segments of users/clients to ensure portfolio profitability and achievement of business outcomes.
· Work with product, growth, and marketing teams across the product/campaign lifecycle.
· Empower product and marketing teams by creating automated dashboards and reports using Power BI.

Skills/Capabilities:
· Candidate should be from a Tier 1/Tier 2 institute.
· 5-7 years of relevant Data Science experience in Banking/NBFC/Fintech.
· Model development experience in R, Python, SAS.
· Strong, in-depth understanding of statistics.
· Strong strategic thought leadership and problem-solving skills, with the ability to tackle unstructured and complex business problems.
· Ability to build and use relationships and influence broadly across the organization.
· Results-driven, with strong project management skills and the ability to work on multiple priorities.
· Handling Big Data, segmentation, analytics, machine learning, artificial intelligence, statistics and hypothesis testing.
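Editor's note: as a hedged illustration of the querying and predictive-modelling skills this listing names, the sketch below pulls data with pandas and fits a baseline classifier in scikit-learn. The connection string, table, and column names are invented placeholders, not details from the posting.

```python
# Minimal sketch: query a hypothetical customer table and fit a baseline
# propensity model. DSN, table, and column names are placeholders.
import pandas as pd
from sqlalchemy import create_engine
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

engine = create_engine("postgresql://user:password@host:5432/bankdb")  # placeholder DSN
df = pd.read_sql("SELECT age, balance, txn_count_90d, responded FROM customer_features", engine)

X = df[["age", "balance", "txn_count_90d"]]
y = df["responded"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.3f}")
```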

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description: We are seeking an experienced Developer with expertise in ETL, AWS Glue and data engineering, combined with strong skills in Java and SQL. The ideal candidate will have 5-8 years of experience designing, developing, and implementing ETL processes and data integration solutions. Responsibilities include developing ETL pipelines using AWS Glue, managing data workflows with Informatica, and writing complex SQL queries. Strong problem-solving abilities and experience with data warehousing are essential.

Key Skills:
Proficiency in AWS Glue and Informatica ETL tools
Strong Java programming skills
Advanced SQL querying and optimization
Experience with data integration and data warehousing
Excellent problem-solving and analytical skills

Please be aware that job seekers may be targeted by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
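Editor's note: the listing above centers on AWS Glue ETL development. A minimal Glue job skeleton in Python (PySpark) is sketched below; the catalog database, table name, field mappings, and S3 path are placeholders, not details from the posting.

```python
# Minimal AWS Glue job sketch: read a cataloged table, apply a field mapping,
# and write Parquet to S3. All object names and paths are placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (placeholder database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="media_db", table_name="raw_audience_events"
)

# Rename/cast fields with ApplyMapping: (source, sourceType, target, targetType).
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("event_id", "string", "event_id", "string"),
        ("ts", "string", "event_ts", "timestamp"),
    ],
)

# Write the curated output as Parquet to a placeholder S3 location.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/audience_events/"},
    format="parquet",
)
job.commit()
```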

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: Data Quality & Automation Engineer
Job Type: Full-time, Contractor

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We are seeking a skilled and innovative Data Quality & Automation Engineer to join our dynamic team in Hyderabad. In this role, you will leverage your expertise to ensure the quality and reliability of our data processing systems, playing a crucial role in our commitment to excellence. We are looking for a candidate who possesses a keen eye for detail and a strong ability to communicate both verbally and in writing.

Key Responsibilities:
Develop and execute automated test scripts using Python and Selenium to validate data processing systems.
Perform rigorous data validation and ensure data integrity across various data platforms.
Collaborate with data engineers and developers to identify and troubleshoot issues.
Maintain and enhance existing automation frameworks and scripts.
Utilize SQL for advanced data querying and validation tasks.
Implement and manage workflows using Apache Airflow.
Work with Databricks to test data pipelines and transformations.

Required Skills and Qualifications:
Proven experience in automation testing with a focus on data quality.
Proficiency in Python programming and Selenium automation tools.
Strong understanding of SQL for data validation and reporting.
Experience with ALM.
Knowledge of data warehousing and data lake architectures.
Experience in leading and mentoring teams.
Experience with data testing tools (dbt Test).
Experience with Apache Airflow for workflow management.
Familiarity with Databricks for data processing and analytics.
Exceptional written and verbal communication skills.
Attention to detail and a proactive approach to problem-solving.

Preferred Qualifications:
Experience with cloud platforms (AWS, Azure) and big data technologies.
Knowledge of continuous integration and deployment processes.
Certification in data engineering or software testing is a plus.
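Editor's note: since the listing above names Python, SQL-style validation, and Apache Airflow, here is a minimal, hedged sketch of a daily data-quality DAG. The DAG id, file path, table, and checks are invented for illustration; a real pipeline would query the warehouse rather than a local file.

```python
# Minimal Apache Airflow sketch: one daily DAG with a single data-quality task.
# The source path, column names, and checks are placeholders.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def check_orders_not_empty():
    # Placeholder source: a local Parquet file stands in for a warehouse query.
    df = pd.read_parquet("/data/curated/orders.parquet")
    assert len(df) > 0, "orders table is unexpectedly empty"
    assert df["order_id"].is_unique, "duplicate order_id values detected"


with DAG(
    dag_id="daily_data_quality_checks",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="validate_orders",
        python_callable=check_orders_not_empty,
    )
```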

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra

On-site

Source: Indeed

About the Role: Grade Level (for internal use): 09

Job Description:
Develop and execute test plans, test cases, and test scripts to ensure thorough coverage of software functionalities, including functional, regression, integration, and performance testing.
Collaborate with cross-functional teams, including developers, product managers, and business analysts, to understand requirements, identify test scenarios, and ensure alignment with business objectives.
Utilize strong technical expertise in automation tools and technologies to design, develop, and maintain automated test suites for continuous integration and deployment pipelines.
Perform manual testing of command-line, web-based and API-based applications, with a focus on complex scenarios and edge cases, to ensure comprehensive test coverage.
Analyse and troubleshoot issues, defects, and discrepancies, documenting and tracking them to resolution using issue-tracking systems.
Provide technical support and guidance to stakeholders regarding QA processes, tools, and methodologies.
Stay updated on industry trends such as AI and emerging technologies, and incorporate relevant knowledge into QA practices.
Develop and maintain SQL queries for data validation and verification.

Qualifications:
Bachelor's degree in computer science, engineering, or a related field.
Strong knowledge of SDLC and STLC.
Strong technical proficiency in automation tools and technologies such as Java, Selenium, JUnit, TestNG, Cucumber, etc.
In-depth knowledge of SQL for data manipulation, querying, and validation.
Experience with API testing tools such as Postman and Bruno.
Experience with Linux command-line tools.
Experience with the Python programming language.
Experience with AWS services and the AWS console.
Excellent analytical and problem-solving skills, with keen attention to detail.
Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
Experience supporting test strategy activities, particularly in the integration of multiple applications and systems.
Demonstrated skill in proactively resolving issues and escalating appropriately.
Experience testing web-based and API-based systems for user experience issues.
Experience with Agile methodologies and CI/CD pipelines.
Experience in the financial domain is preferred.
Experience with indices/benchmarks, asset management or portfolio investment modelling.

About S&P Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data and research, and home to iconic financial market indicators, such as the S&P 500® and the Dow Jones Industrial Average®. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 7.4 trillion in passively managed assets linked to our indices and over USD 11.3 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI).

S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/spdji.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316146
Posted On: 2025-06-02
Location: Mumbai, Maharashtra, India
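Editor's note: the QA responsibilities in the listing above include API testing and automated checks. Below is a minimal, hedged pytest-style sketch of an API test; the base URL, endpoint path, parameters, and response fields are hypothetical placeholders, not an S&P Global API.

```python
# Minimal API-testing sketch (pytest style). The endpoint and payload shape
# are invented placeholders used only to show the structure of such a test.
import requests

BASE_URL = "https://api.example.com"  # placeholder, not a real service


def test_index_levels_endpoint_returns_expected_shape():
    resp = requests.get(
        f"{BASE_URL}/v1/index-levels", params={"ticker": "EXAMPLE"}, timeout=10
    )
    assert resp.status_code == 200
    payload = resp.json()
    assert "levels" in payload
    # Every row should carry at least a date and a value field.
    assert all({"date", "value"} <= row.keys() for row in payload["levels"])
```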

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

Job Information
Date Opened: 06/02/2025
Industry: IT Services
Job Type: Contract
Work Experience: 4 Years
City: Pune City
State/Province: Maharashtra
Country: India
Zip/Postal Code: 411001

About Us: CCTech's mission is to transform human life by the democratization of technology. We are a well-established digital transformation company building applications in the areas of CAD, CFD, Artificial Intelligence, Machine Learning, 3D Web apps, Augmented Reality, Digital Twin, and other enterprise applications. We have two business divisions: product and consulting. simulationHub is our flagship product and the manifestation of our vision. Currently, thousands of users use our CFD app in their upfront design process. Our consulting division, with partners such as Autodesk Forge, AWS and Azure, is helping the world's leading engineering organizations, many of them Fortune 500 companies, achieve digital supremacy.

Job Description: We are looking for a skilled Power BI Developer / Analyst to join our team. The ideal candidate will be responsible for designing, developing, and maintaining interactive dashboards and reports using Power BI to support data-driven decision-making. They should have expertise in data visualization, DAX, Power Query, and integrating Power BI with various data sources.

Responsibilities:
Design and develop Power BI dashboards, reports, and data models based on business needs.
Connect Power BI to multiple data sources, including SQL databases, Excel, APIs, and cloud services (Azure, AWS, etc.).
Write DAX queries and calculations for optimized data analysis.
Use Power Query (M language) for data transformation and ETL processes.
Ensure data integrity, accuracy, and security in all reports and dashboards.
Optimize Power BI performance, including data loading and visualization speed.
Work closely with stakeholders to gather business requirements and translate them into actionable insights.
Implement row-level security (RLS) and manage Power BI service permissions.
Automate report refresh schedules and integrate Power BI with Power Automate for workflow automation.
Stay updated on Power BI best practices, updates, and new features.

Requirements:
Educational background: BE/BTech in CS, IT, Mechanical, Civil or a related field.
4+ years of hands-on experience with Power BI Desktop, Power BI Service, and Power BI Report Server.
Knowledge of DAX (Data Analysis Expressions) and Power Query (M language).
Experience in data modeling, data transformation, and ETL processes.
Proficiency in SQL for querying and database management.
Familiarity with Excel (Power Pivot, Power Query; VBA is a plus).
Strong analytical and problem-solving skills.
Must have: 4+ years of Power BI experience.

Benefits:
Flexible work hours.
Growth opportunities while working with a diversified workforce.
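Editor's note: the listing above mentions automating report refresh schedules. As a hedged sketch only, the snippet below triggers a Power BI dataset refresh through the public Power BI REST API using requests; the workspace and dataset IDs and the access token are placeholders, and token acquisition (Azure AD / service principal) is out of scope here.

```python
# Hedged sketch: queue a Power BI dataset refresh via the REST API.
# GUIDs and the bearer token are placeholders for illustration only.
import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder
GROUP_ID = "<workspace-guid>"         # placeholder
DATASET_ID = "<dataset-guid>"         # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=30)
resp.raise_for_status()
print("Refresh queued, HTTP status:", resp.status_code)  # 202 Accepted on success
```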

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Telangana

On-site

Source: Indeed

1) Bachelor's degree
2) 12-24 months of work experience
3) Good communication skills - the Trans Ops Representative will be facilitating the flow of information between external and internal stakeholders
4) Proficiency in Excel (pivot tables, VLOOKUPs)
5) Demonstrated ability to work in a team in a very dynamic environment

NOC Overview: NOC (Network Operation Center) is the central command and control center for 'Transportation Execution' across the Amazon Supply Chain network, supporting multiple geographies such as NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FCs) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all stakeholders informed on the proceedings. Along with this tactical problem solving, NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon's ability to serve its customers on time.

Purview of a Trans Ops Representative: A Trans Ops Representative at NOC facilitates the flow of information between different stakeholders (carriers/hubs/warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at NOC works across two verticals: Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that freight is picked up on time and delivered at the FC as per the given appointment; the Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle from pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as per promise; the Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises. A Trans Ops Representative provides timely resolution to the issue at hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements, analyze data, notice trends, and drive customer experience without compromising on time. The candidate should have a basic understanding of logistics and be able to communicate clearly in written and oral form.

Key job responsibilities: A Trans Ops Representative should be able to ideate process improvements and have the zeal to drive them to conclusion. Responsibilities include, but are not limited to:
Communication with external customers (carriers, vendors/suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers).
Systematically escalating problems or variances in information and data to the relevant owners and teams, and following through on resolutions to ensure they are delivered.
Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and variances to goals, and present these findings in a review forum.
Providing real-time customer experience by working in a 24x7 operating environment.

Graduate with a Bachelor's degree.
Good logical skills.
Good communication skills - the Trans Ops Representative will be facilitating the flow of information between different teams.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
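Editor's note: the listing above asks for Excel proficiency (pivot tables, VLOOKUPs) and data analysis. Purely as an illustrative analogue, the pandas sketch below shows the programmatic equivalents of those two operations; the file names and columns are invented placeholders.

```python
# Illustrative analogue of Excel pivot tables and VLOOKUP using pandas.
# File names and column names are placeholders, not Amazon data.
import pandas as pd

shipments = pd.read_csv("shipments.csv")  # columns: shipment_id, carrier, status, delay_minutes
carriers = pd.read_csv("carriers.csv")    # columns: carrier, region

# VLOOKUP-style enrichment: bring each carrier's region onto its shipments.
enriched = shipments.merge(carriers, on="carrier", how="left")

# Pivot-table-style summary: average delay by region and shipment status.
summary = enriched.pivot_table(
    index="region", columns="status", values="delay_minutes", aggfunc="mean"
)
print(summary)
```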

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra

On-site

Source: Indeed

SENIOR QUALITY ENGINEER
Mumbai, India | Information Technology | 316146

About the Role: Grade Level (for internal use): 09

Job Description:
Develop and execute test plans, test cases, and test scripts to ensure thorough coverage of software functionalities, including functional, regression, integration, and performance testing.
Collaborate with cross-functional teams, including developers, product managers, and business analysts, to understand requirements, identify test scenarios, and ensure alignment with business objectives.
Utilize strong technical expertise in automation tools and technologies to design, develop, and maintain automated test suites for continuous integration and deployment pipelines.
Perform manual testing of command-line, web-based and API-based applications, with a focus on complex scenarios and edge cases, to ensure comprehensive test coverage.
Analyse and troubleshoot issues, defects, and discrepancies, documenting and tracking them to resolution using issue-tracking systems.
Provide technical support and guidance to stakeholders regarding QA processes, tools, and methodologies.
Stay updated on industry trends such as AI and emerging technologies, and incorporate relevant knowledge into QA practices.
Develop and maintain SQL queries for data validation and verification.

Qualifications:
Bachelor's degree in computer science, engineering, or a related field.
Strong knowledge of SDLC and STLC.
Strong technical proficiency in automation tools and technologies such as Java, Selenium, JUnit, TestNG, Cucumber, etc.
In-depth knowledge of SQL for data manipulation, querying, and validation.
Experience with API testing tools such as Postman and Bruno.
Experience with Linux command-line tools.
Experience with the Python programming language.
Experience with AWS services and the AWS console.
Excellent analytical and problem-solving skills, with keen attention to detail.
Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
Experience supporting test strategy activities, particularly in the integration of multiple applications and systems.
Demonstrated skill in proactively resolving issues and escalating appropriately.
Experience testing web-based and API-based systems for user experience issues.
Experience with Agile methodologies and CI/CD pipelines.
Experience in the financial domain is preferred.
Experience with indices/benchmarks, asset management or portfolio investment modelling.

About S&P Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data and research, and home to iconic financial market indicators, such as the S&P 500® and the Dow Jones Industrial Average®. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 7.4 trillion in passively managed assets linked to our indices and over USD 11.3 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI).

S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/spdji.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316146
Posted On: 2025-06-02
Location: Mumbai, Maharashtra, India

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Source: Indeed

- 2+ years of data scientist experience
- 3+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python) or statistical/mathematical software (e.g. R, SAS, Matlab)
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment
- Knowledge of relevant statistical measures such as confidence intervals, significance of error measurements, development and evaluation data sets, etc.

We are seeking an exceptional Data Scientist to join a team of experts in the field of machine learning and work together to tackle challenging problems across diverse compliance domains. We leverage risk models (including boosted trees and graph neural networks) as well as vision and large language models (LLMs) to detect illegal and unsafe products across the Amazon catalog. We work on machine learning problems for multi-modal classification, intent detection, information retrieval, anomaly and fraud detection, and generative AI. This is an exciting and challenging position to deliver scientific innovations into production systems at Amazon scale to make immediate, meaningful customer impacts while also pursuing ambitious, long-term research. You will work in a highly collaborative environment where you can analyze and process large amounts of image, text and tabular data. You will work on hard science problems that have not been solved before, conduct rapid prototyping to validate your hypotheses, and deploy your algorithmic ideas at scale. There will be something new to learn every day as we work in an environment with rapidly evolving regulations and adversarial actors looking to outwit your best ideas.

Key job responsibilities
- Explore and evaluate state-of-the-art algorithms and approaches in risk modeling and vision/language models
- Translate product and CX requirements into measurable science problems and metrics
- Collaborate with product and tech partners and customers to validate hypotheses, drive adoption, and increase business impact
- Evaluate model performance in production and refresh/implement necessary updates to maintain optimal system performance

A day in the life
- Understanding customer problems, project timelines, and team/project mechanisms
- Proposing science formulations and brainstorming ideas with the team to solve business problems
- Writing code and running experiments with re-usable science libraries
- Reviewing labels and audit results with investigators and operations associates
- Sharing science results with science, product and tech partners and customers
- Contributing to team retrospectives for continuous improvements
- Participating in science research collaborations and attending study groups with scientists across Amazon

About the team
We are a team of scientists building AI/ML solutions to make Amazon Earth's most trusted shopping destination for safe and compliant products.

- Experience in Python, Perl, or another scripting language
- Experience in an ML or data scientist role with a large technology company

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
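Editor's note: the listing above describes classifying catalog text to flag unsafe or non-compliant products. Below is a hedged, toy-scale sketch of a baseline text classifier in scikit-learn; the example titles and labels are invented, and a production system of the kind described would use far richer features and models (boosted trees, GNNs, LLMs).

```python
# Toy baseline: TF-IDF features plus logistic regression to score product
# titles for compliance review. Data and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

titles = [
    "stainless steel kitchen knife set",
    "children's toy with small magnets",
    "USB-C charging cable 2m",
    "high-powered laser pointer 5000mW",
]
labels = [0, 1, 0, 1]  # 1 = needs compliance review (toy labels)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(titles, labels)

# Score a new, unseen title (probability of needing review).
print(clf.predict_proba(["desk lamp with laser engraving"])[:, 1])
```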

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

You possess experience in end-to-end development, implementation, and/or support activities covering areas such as design of customizations, coding and unit testing, completing test cycle rounds including End of Day runs, migrations, and integrations for Oracle FLEXCUBE / core banking products. You possess knowledge and skills in software programming in Core Java, J2EE, microservices-related technologies, Spring Boot, REST APIs, JavaScript, and XML. Experience in Oracle SQL, PL/SQL and Oracle Database (18c or 12c) is a plus. Experience of 6 to 12 years. Should hold a bachelor's degree in computer science or an equivalent degree.

You have a solid understanding of release management and source control tools. You should be able to perform issue tracking on the application and follow up with collaborators for resolution. You possess good client interaction skills, including presentation of solutions. You have exposure to software deployment and troubleshooting on application server software, especially Oracle WebLogic. You have exposure to analysis of Oracle Database AWR/ADDM reports and resolution of database performance issues. You have awareness of banking terminology and concepts. You possess IT skills including Microsoft Office and basic SQL querying.

Should have superb communication and presentation skills and be willing to go the extra mile to attain precision. Effective verbal and written communication skills. Proactive, willing to take ownership, with the ability to quickly learn new technologies and take up new tasks and initiatives. Should have excellent problem-solving, analytical and technical troubleshooting skills. Should be willing to work offshore as well as travel to client locations. Should be willing to take up FLEXCUBE technical certifications in functional areas as and when required. Ability to work in a high-pressure, fast-moving and exciting environment.

You have exposure to the banking domain, software development processes and practices, DevOps tools, and testing tools. You are aware of the newest technologies in banking.
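Editor's note: the listing above calls for basic SQL querying against an Oracle database. A minimal hedged sketch using the python-oracledb driver follows; the connection details, table, and columns are placeholders, not FLEXCUBE objects.

```python
# Minimal sketch: run a grouped SQL query against Oracle with python-oracledb.
# User, password, DSN, and table/column names are placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()
cur.execute("SELECT branch_code, COUNT(*) FROM accounts GROUP BY branch_code")
for branch_code, account_count in cur.fetchall():
    print(branch_code, account_count)
cur.close()
conn.close()
```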

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

You possess experience in end-to-end development, implementation, and/or support activities covering areas such as design of customizations, coding and unit testing, completing test cycle rounds including End of Day runs, migrations, and integrations for Oracle FLEXCUBE / core banking products. You possess knowledge and skills in software programming in Core Java, J2EE, microservices-related technologies, Spring Boot, REST APIs, JavaScript, and XML. Experience in Oracle SQL, PL/SQL and Oracle Database (18c or 12c) is a plus. Experience of 6 to 12 years. Should hold a bachelor's degree in computer science or an equivalent degree.

You have a solid understanding of release management and source control tools. You should be able to perform issue tracking on the application and follow up with collaborators for resolution. You possess good client interaction skills, including presentation of solutions. You have exposure to software deployment and troubleshooting on application server software, especially Oracle WebLogic. You have exposure to analysis of Oracle Database AWR/ADDM reports and resolution of database performance issues. You have awareness of banking terminology and concepts. You possess IT skills including Microsoft Office and basic SQL querying.

Should have superb communication and presentation skills and be willing to go the extra mile to attain precision. Effective verbal and written communication skills. Proactive, willing to take ownership, with the ability to quickly learn new technologies and take up new tasks and initiatives. Should have excellent problem-solving, analytical and technical troubleshooting skills. Should be willing to work offshore as well as travel to client locations. Should be willing to take up FLEXCUBE technical certifications in functional areas as and when required. Ability to work in a high-pressure, fast-moving and exciting environment.

You have exposure to the banking domain, software development processes and practices, DevOps tools, and testing tools. You are aware of the newest technologies in banking.

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

You possess experience in end-to-end development, implementation, and/or support activities covering areas such as design of customizations, coding and unit testing, completing test cycle rounds including End of Day runs, migrations, and integrations for Oracle FLEXCUBE / core banking products. You possess knowledge and skills in software programming in Core Java, J2EE, microservices-related technologies, Spring Boot, REST APIs, JavaScript, and XML. Experience in Oracle SQL, PL/SQL and Oracle Database (18c or 12c) is a plus. Experience of 6 to 12 years. Should hold a bachelor's degree in computer science or an equivalent degree.

You have a solid understanding of release management and source control tools. You should be able to perform issue tracking on the application and follow up with collaborators for resolution. You possess good client interaction skills, including presentation of solutions. You have exposure to software deployment and troubleshooting on application server software, especially Oracle WebLogic. You have exposure to analysis of Oracle Database AWR/ADDM reports and resolution of database performance issues. You have awareness of banking terminology and concepts. You possess IT skills including Microsoft Office and basic SQL querying.

Should have superb communication and presentation skills and be willing to go the extra mile to attain precision. Effective verbal and written communication skills. Proactive, willing to take ownership, with the ability to quickly learn new technologies and take up new tasks and initiatives. Should have excellent problem-solving, analytical and technical troubleshooting skills. Should be willing to work offshore as well as travel to client locations. Should be willing to take up FLEXCUBE technical certifications in functional areas as and when required. Ability to work in a high-pressure, fast-moving and exciting environment.

You have exposure to the banking domain, software development processes and practices, DevOps tools, and testing tools. You are aware of the newest technologies in banking.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Source: LinkedIn

In the age of digital transformation, data has become increasingly vital to core business operations. But with so many cloud applications and platforms available today, data has become more decentralized than ever. CData is the real-time data connectivity company. Our easy-to-use integration products allow users to work with their data where, when, and how they need it. With a robust library of real-time data connectors, users can access data from hundreds of applications, tools, and systems - on-premises or in the cloud. CData is a global company, headquartered in Chapel Hill, NC, with about 350 team members worldwide. More than 10,000 organizations rely on CData technologies to overcome data fragmentation challenges and unlock value from diverse, dispersed data assets. This position will join our India team, operating out of our Bangalore office, where we have nearly 100 team members.

Note: Only immediate joiners and candidates who can join within a month of offer need apply / will be considered.

Technical Support Engineer II: CData's ODBC/JDBC/ADO.NET drivers are used in enterprise data applications for Data Integration, Master Data Management (MDM), Low-Code Development, Business Intelligence, and Artificial Intelligence, with 250+ drivers and connectors for databases, file systems, and SaaS application APIs. We are seeking an experienced Technical Support Engineer II to help assist with technical issues through email, chat support, and on the phone. This position requires exceptional problem-solving skills, as you will be tackling new and unique challenges every day with no set script to resolve an issue. The ability to research, both alone and in a team, to reproduce and debug customer issues is essential. As the first point of contact for the company, applicants should have excellent communication skills and be able to quickly learn new topics. Programming knowledge is required, as well as a bachelor's degree in a STEM field (Computer Science, Engineering, etc.).

Key Duties & Responsibilities (include but are not limited to):
Diagnose and troubleshoot technical issues, including account setup and network configuration.
Ask customers targeted questions to quickly understand the root of the problem.
Track computer system issues through to resolution, within agreed time limits.
Talk clients through a series of actions, either via phone, email, or chat, until they have solved a technical issue.
Escalate ongoing support issues to higher management in the Technical Support or Engineering Department.
Refer to internal or external resources to provide accurate tech solutions.
Ensure all issues are properly logged.
Prioritize and manage several open issues at one time.
Prepare accurate and timely reports.
Document technical knowledge in the form of notes and manuals.
Perform basic code analysis or set up debug projects to troubleshoot low-level coding issues.
Understand customer requirements and design a mapping flow accordingly.

Requirements - Knowledge, Skills, & Experience:
Excellent communication skills (both verbal and written).
Exceptional problem-solving skills.
Ability to read and write code to solve basic problems in at least one language.
Ability to work independently and in a team.
Experience with C#, Java, or Python preferred, but any other previous programming experience is a plus.
A good understanding of relational databases.
Strong knowledge of SQL concepts such as JOINs, aggregation, etc.
Experience in any Linux environment, cloud infrastructure, or DevOps is a plus.
2 to 3 years of experience in a client-facing (technical) role.
Knowledge of data models and querying.
Able to prioritize projects/tasks based on SLAs and client requirements.
Solid understanding of APIs.

Important Notice: The CData recruitment team does not use SMS or WhatsApp to communicate with job applicants. If you receive a message from these platforms claiming to be from CData regarding a job opportunity, please be aware that it is a scam. All current open positions are listed on our CData Careers page. Please use the Apply Now link or apply via LinkedIn.
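Editor's note: the listing above emphasizes ODBC drivers and SQL concepts such as JOINs and aggregation. The hedged sketch below runs such a query through a generic ODBC DSN with pyodbc; the DSN name and table/column names are placeholders, not a specific CData driver configuration.

```python
# Minimal sketch: query through an ODBC driver with a JOIN and aggregation.
# The DSN and schema are placeholders for illustration only.
import pyodbc

conn = pyodbc.connect("DSN=ExampleSource")  # placeholder DSN
cursor = conn.cursor()
cursor.execute(
    """
    SELECT c.Region, COUNT(o.OrderId) AS Orders
    FROM Customers c
    JOIN Orders o ON o.CustomerId = c.CustomerId
    GROUP BY c.Region
    """
)
for region, orders in cursor.fetchall():
    print(region, orders)
cursor.close()
conn.close()
```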

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description: We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Lead Software Engineer at JPMorgan Chase within the Liquidity and Account Solutions team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities:
Works closely with Solution & Technical Architects and Senior Engineers to develop the best technical design and approach for new product development.
Manages daily activities of the development team with a Scrum and agile approach, instills best practices for software development and documentation, assures designs meet requirements, and delivers high-quality work on tight schedules.
Assesses compliance, risks, and vulnerabilities to ensure all systems and baselines are operationally sound, perform at scale, and exceed customer expectations.
Provides technical guidance to coders and identifies infrastructure and process optimization opportunities.
Drives architectural reviews, code reviews and business demos.
Manages the ongoing development of the team, including recruitment, performance management, coaching, and mentoring.
Collaborates with peer teams on complex, global engineering efforts to ensure architecture agreement, resource coordination, and implementation timelines.
Translates business requirements into technical solutions, recommends alternative technical and business approaches, and leads engineering efforts to meet ambitious timelines with optimal solutions.
Ensures proper communication concerning changes in established milestones or challenges that may affect the outcome of a project's completion date.
Provides out-of-hours application support and coordination of production releases.

Required Qualifications, Capabilities, and Skills:
Formal training or certification in software engineering concepts and 5+ years of applied experience.
Deep understanding of architectural concepts, issues and trends.
Demonstrable experience in people management as well as strong written and verbal communication skills.
Proficient in Java 17+ with Spring Boot, Kafka, Kubernetes, and SQL and NoSQL databases (e.g. Oracle, PostgreSQL, CockroachDB and Cassandra).
Experience creating thread-safe concurrent code in Java or another JVM-based language.
Expertise in applying appropriate data structures and algorithms to solve business and technical problems.
Hands-on practical experience in secure system design, application development, automated regression testing, performance profiling and operational stability.
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
Solid understanding of agile methodologies such as Scrum, CI/CD, application resiliency, and security.
Expertise in application, data, and infrastructure architecture disciplines.
Ability to communicate effectively with senior management and other departments, to organize and manage multiple organizational initiatives, and to encourage coworkers to do the same.

Preferred Qualifications, Capabilities, and Skills:
Hands-on experience with a statically compiled language like C, C++, Rust or Golang.
Knowledge of data serialisation formats (e.g. Google Protocol Buffers, Apache Avro or Parquet).
Experience with gRPC and caching technologies, e.g. Redis, Valkey.
Experience with performance / non-functional testing tools and techniques (e.g. JMeter, Gatling, BlazeMeter).
Certified Kubernetes knowledge (e.g. CKAD) and certified public cloud technology knowledge (e.g. AWS).
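Editor's note: the listing above names Kafka among the required technologies. The role itself is Java/Spring Boot; the Python sketch below (kafka-python) only illustrates the general shape of a consume loop, and the topic, broker address, and message fields are placeholders.

```python
# Illustrative Kafka consume loop using kafka-python. Topic, broker, group id,
# and message schema are placeholders, not details from the posting.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "liquidity-events",                    # placeholder topic
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    group_id="balance-aggregator",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Handlers should be idempotent: the same event may be delivered more than once.
    print(message.topic, message.partition, message.offset, event.get("accountId"))
```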

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

We are seeking an experienced Lead Data Engineer to join our dynamic team. As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining data integration solutions for our clients. You will lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions. This is an exciting opportunity for a seasoned data integration professional who is passionate about technology and thrives in a fast-paced, dynamic environment.

Responsibilities:
Design, develop, and maintain data integration solutions for clients.
Lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions.
Collaborate with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements.
Ensure data integration solutions are secure, reliable, and performant.
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings.
Continuously learn and stay up to date with the latest data integration approaches and tools.

Requirements:
Bachelor's degree in Computer Science, Information Systems, or a related field.
8-13 years of experience in data engineering, data integration, or a related field.
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow.
Strong knowledge of SQL for querying and manipulating data.
Experience with Snowflake for cloud data warehousing.
Experience with at least one cloud platform such as AWS, Azure, or GCP.
Experience leading a team of engineers on data integration projects.
Good verbal and written communication skills in English at a B2 level.

Nice to have: Experience with ETL using Python.
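Editor's note: since the listing above requires Snowflake and SQL skills, here is a minimal hedged sketch of querying Snowflake from Python with the official connector. Every credential and object name below is a placeholder.

```python
# Minimal sketch: connect to Snowflake and run a grouped query.
# Account, credentials, warehouse, database, schema, and table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholder account identifier
    user="ETL_USER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```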

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility.

Responsibilities:
Create and execute sophisticated data solutions on cloud-based platforms.
Build ETL processes utilizing SQL, Python, and other applicable technologies.
Maintain data accuracy, reliability, and accessibility for all stakeholders.
Work with cross-functional teams to understand data integration needs and specifications.
Produce and maintain documentation, including technical specifications, data flow diagrams, and data mappings.
Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity.

Requirements:
Bachelor's degree in Computer Science, Electrical Engineering, or a related field.
5-8 years of experience in data engineering.
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow.
Strong knowledge of SQL for data querying and manipulation.
Experience with Snowflake for data warehousing.
Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing.
Excellent problem-solving skills and attention to detail.
Good verbal and written communication skills in English at a B2 level.

Nice to have: Background in ETL using Python.
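Editor's note: the listing above asks for Spark-based ETL experience. A minimal, hedged PySpark ETL sketch follows; the S3 paths, columns, and date format are invented placeholders.

```python
# Minimal PySpark ETL sketch: read raw CSV, clean and deduplicate,
# then write partitioned Parquet. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
)
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
spark.stop()
```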

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description We are looking for a Technical Consultant - Adobe Journey Optimizer to help businesses design, implement, and optimize personalized customer journeys using Adobe Journey Optimizer, a powerful tool in the Adobe Experience Platform (AEP) suite. Immediate joiners preferred. This is a hybrid working model.

Responsibilities
• Provide advisory, consulting, and technical implementation services to customers on Adobe Experience Platform and Adobe Journey Optimizer.
• Assess customer requirements, pain points and goals, and make recommendations on solution architecture, design and roadmap.
• Configure Adobe Experience Platform services like Identity Service, Data Storage Layer, Profile dataset, Real-time Customer Profile, etc.
• Implement data ingestion from sources like CRM, MPNs, website data and mobile apps using APIs, Experience Data Model (XDM) schemas and mappers.
• Create customer, visitor and product audiences using segments, including AI/ML-powered segments.
• Configure destinations to activate audiences in channels like email, Journey Optimizer, Ad Cloud and CRM systems.
• Implement Journey Optimizer features like journey orchestration, triggers, automations, actions, messages and offers.
• Develop custom applications, workflows and APIs to extend the platform as per customer needs.
• Troubleshoot technical issues, debug and optimize performance of customer implementations.
• Provide post go-live support, enhancements, maintenance and upgrades for customer solutions.
• Conduct product training and knowledge transfer, and share best practices.
• Continuously track product updates and improve solution recommendations for evolving customer needs.

Skills And Experience Required
• 3-4+ years of extensive implementation experience with Adobe Experience Cloud products, especially Adobe Experience Platform and Journey Optimizer.
• Expertise in deploying, configuring and optimizing all major Experience Platform services and Journey Optimizer features.
• Strong SQL skills for querying datasets, implementing data transformations, cleansing data, etc.
• Hands-on experience developing custom applications, workflows and integrations using Experience Platform APIs.
• Deep familiarity with Adobe Experience Platform and Journey Optimizer technical implementations, including:
  • Setting up source connectors and ingesting data using the API and UI.
  • Configuring Experience Events (XDM schemas) for capturing data from sources.
  • Creating audience segments using custom and AI/ML-based segments.
  • Triggering journeys and activations based on segments and profiles.
  • Implementing journey automations, actions and messages.
  • Integrating with destinations like CRM, email, etc.
• Hands-on experience developing with the Platform APIs and SDKs in a language like JavaScript (for API calls) or Java (for extending functionality with custom connectors or applications).
• Expertise in data modelling for multi-channel needs using the Experience Data Model (XDM).
• Familiarity with configuring IMS authentication to connect external systems to Platform APIs.
• Experience developing, debugging and optimizing custom server-side applications.
• Proficiency in JavaScript, JSON, REST APIs and SQL.
• Expertise ingesting data from sources like Adobe Analytics, CRM, ad servers, web and mobile.
• Strong understanding of concepts like multi-channel data management, segmentation, orchestration, automation and personalization.
• Understanding of data structures such as JSON for representing data in APIs, data tables for processing tabular data, and streaming data flows.
• Experience automating technical tasks using tools like API integration tests, Postman for API testing, and Git for source control.
• Excellent documentation and communication skills with the ability to clearly present technical recommendations to customers.
• Experience troubleshooting complex technical issues related to APIs, data ingestion, segments, destinations, etc.
• Self-starter with the ability to manage resource constraints, changing priorities and tight deadlines.
• Problem-solving mindset with the drive to continuously improve and optimize deployed solutions.
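For context on the kind of Platform API work this role involves, below is a minimal Python sketch of an IMS-authenticated REST call. It is not an official Adobe sample: the endpoint path, header names and environment variables are assumptions used to illustrate the pattern, and the real contract should be taken from the Adobe Experience Platform API documentation.

```python
# Minimal sketch (not an official Adobe sample): calling an Experience Platform
# REST endpoint with IMS credentials. The endpoint path, header names and the
# environment variables below are illustrative assumptions.
import os
import requests

BASE_URL = "https://platform.adobe.io"  # assumed AEP API host

def list_segment_definitions(limit: int = 10) -> dict:
    """Fetch a page of audience/segment definitions (illustrative endpoint)."""
    headers = {
        "Authorization": f"Bearer {os.environ['IMS_ACCESS_TOKEN']}",  # IMS token obtained separately
        "x-api-key": os.environ["ADOBE_API_KEY"],
        "x-gw-ims-org-id": os.environ["ADOBE_ORG_ID"],
        "Accept": "application/json",
    }
    resp = requests.get(
        f"{BASE_URL}/data/core/ups/segment/definitions",  # assumed path
        headers=headers,
        params={"limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for seg in list_segment_definitions().get("segments", []):
        print(seg.get("name"), seg.get("id"))
```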

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Linkedin logo

We are looking for a skilled Lead Data Engineer to become an integral part of our vibrant team. In this role, you will take charge of designing, developing, and maintaining data integration solutions tailored to our clients' needs. You will oversee a team of engineers, ensuring the delivery of high-quality, scalable, and efficient data integration solutions. This role presents a thrilling challenge for a seasoned data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities
• Design, develop, and maintain client-specific data integration solutions
• Oversee a team of engineers to guarantee high-quality, scalable, and efficient delivery of data integration solutions
• Work with cross-functional teams to understand business requirements and create suitable data integration solutions
• Ensure the security, reliability, and efficiency of data integration solutions
• Create and update documentation, including technical specifications, data flow diagrams, and data mappings
• Stay informed and up-to-date with the latest data integration methods and tools

Requirements
• Bachelor's degree in Computer Science, Information Systems, or a related field
• 8-13 years of experience in data engineering, data integration, or related fields
• Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
• Strong knowledge of SQL for data querying and manipulation
• Background in Snowflake for cloud data warehousing
• Familiarity with at least one cloud platform such as AWS, Azure, or GCP
• Experience in leading a team of engineers on data integration projects
• Good verbal and written communication skills in English at a B2 level

Nice to have
• Background in ETL using Python
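As a rough illustration of the Spark-based ETL work described above, here is a minimal PySpark sketch; the bucket paths, column names and output layout are hypothetical.

```python
# Minimal PySpark sketch of a Spark-based ETL step: read raw records, apply
# basic cleansing, and write a partitioned, query-ready dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-integration").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["order_id"])                      # de-duplicate on the business key
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)                       # drop obviously bad rows
)

(
    cleaned.withColumn("order_date", F.to_date("order_ts"))
           .write.mode("overwrite")
           .partitionBy("order_date")                     # partition for efficient querying
           .parquet("s3://example-curated-bucket/orders/")
)
```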

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Why join us? Our purpose is to design for the good of humankind. It's the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.

Role: Security Analyst
Location: Bangalore

Purpose / Profile
As a Security Analyst at MillerKnoll, you will help reduce enterprise risk by safeguarding the organization's digital assets from cyber threats. You will work closely with the Security Operations Center to continuously monitor, analyze, and respond to security alerts and events. You will collaborate directly with the greater Information Security team to ensure compliance with industry regulations, standards, and best practices, as well as educate employees on proper cyber hygiene. You will help guarantee the confidentiality, integrity, and availability of the organization's network and compute resources and aid in shaping strategies to reduce cyber risk.

Essential Functions
• Provide timely detection and identification of possible attacks/intrusions and distinguish findings from benign activities.
• Correlate incident data to identify specific vulnerabilities and make recommendations that enable prompt containment and remediation.
• Coordinate with the greater organization to resolve cyber incidents.
• Provide technical summaries of findings in accordance with established reporting procedures.
• Escalate and triage incidents that may cause an immediate impact to the organization.
• Perform analysis of log files from a variety of sources (e.g., individual host logs, network traffic logs, firewall logs, and intrusion detection system logs) to identify possible threats.
• Perform event correlation to gain situational awareness and to determine the effectiveness of an observed attack.
• Assist in the development and implementation of security policies and procedures.
• Track and document cyber incidents from initial detection through final resolution.
• Assist in reducing risk by actively identifying areas of non-compliance and making recommendations for improvement.
• This role will work either in the UK shift (12 noon to 9 pm) and/or the US shift (5:30 PM to 2:30 AM).

Additional Functions
• Stay current with cybersecurity news and trends relevant to the business and industry.
• Participate in the information security on-call rotation, providing emergency support for security-related incidents.
• Provide input into the development of security policies and procedures.
• Interface with other business units such as Governance, Risk, and Compliance to communicate program status and overall security posture.
• Promote a positive security culture through knowledge sharing, influence, and conduct.
• Create and maintain role-specific documentation.
• Participate in the Change Advisory Board (CAB).

Knowledge, Skills, And Abilities
• Knowledge of system administration concepts for operating systems such as Unix/Linux, iOS, Android, and Windows.
• Knowledge of cloud service models and cloud security best practices.
• Knowledge of procedures used for documenting and querying reported incidents, problems, and events.
• Knowledge of Intrusion Detection System (IDS)/Intrusion Prevention System (IPS) tools and applications.
• Knowledge of auditing and logging procedures (including server-based logging).
• Knowledge of common software applications and their associated vulnerabilities.
• Knowledge of host-based security products and how they reduce exploitation.
• Knowledge of the approach, strategy, and structure of exploitation tools (e.g., sniffers, keyloggers) and techniques (e.g., gaining backdoor access, collecting/exfiltrating data, conducting vulnerability analysis).
• Knowledge of MITRE ATT&CK and similar cybersecurity frameworks.
• Knowledge of what constitutes a "threat" to a network.
• Skill in identifying, capturing, containing, and reporting malware.
• Skill in using incident handling methodologies.
• Skill in using security event correlation tools.
• Skill in developing analytic approaches to problems and situations for which information is incomplete or where no precedent exists.
• Ability to identify unusual activity against a defined baseline.

Qualifications

Education/Experience
• Bachelor's degree in Computer Science, Information Systems, Cybersecurity, or Software Engineering.
• 6-8 years of relevant experience in cybersecurity or information technology.
• 3+ years of hands-on experience with an EDR/XDR solution, SEG, and SIEM.
• Experienced in a scripting language such as Python, PowerShell, or VBA.

Licenses and Certifications
• One or more technical or cybersecurity certifications preferred (e.g., CISA, CCSP, CRISC, CEH, Security+, GSEC, SSCP).

Who We Hire?
Simply put, we hire everyone. MillerKnoll is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We're committed to equal opportunity employment, including veterans and people with disabilities. MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at careers_help@millerknoll.com.
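To make the log-analysis side of the role concrete, here is a minimal Python sketch of correlating failed-login events by source IP; the log format, threshold and time window are illustrative assumptions rather than a production detection rule.

```python
# Minimal sketch: parse SSH auth-log lines and flag source IPs with repeated
# failed logins inside a short window. Format and thresholds are illustrative.
import re
from collections import defaultdict
from datetime import datetime, timedelta

FAILED = re.compile(r"^(?P<ts>\S+) sshd\[\d+\]: Failed password .* from (?P<ip>[\d.]+)")
WINDOW = timedelta(minutes=5)
THRESHOLD = 5  # failures per IP inside the window before we raise an alert

def brute_force_candidates(lines):
    """Yield (ip, count) once an IP first crosses THRESHOLD failures within WINDOW."""
    attempts = defaultdict(list)
    for line in lines:
        m = FAILED.match(line)
        if not m:
            continue
        ts = datetime.fromisoformat(m["ts"])
        ip = m["ip"]
        attempts[ip] = [t for t in attempts[ip] if ts - t <= WINDOW] + [ts]
        if len(attempts[ip]) == THRESHOLD:
            yield ip, len(attempts[ip])

sample = [
    f"2024-06-01T10:00:{s:02d} sshd[231]: Failed password for root from 203.0.113.7 port 22"
    for s in range(6)
]
for ip, count in brute_force_candidates(sample):
    print(f"possible brute force from {ip}: {count} failures in 5 minutes")
```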

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Dadri, Uttar Pradesh, India

On-site

Linkedin logo

Job Description Flexsin is hiring a Sr. Software Engineer - PHP. This is an immediate requirement, immediate joiners will be preferred, and this is a work-from-office role.

Responsibilities
• Optimize monolithic legacy codebases and newer repositories using PHP, CodeIgniter, Laravel, CakePHP and WordPress
• Work in an agile environment with Project Managers to complete user stories
• Integrate and design highly scalable, transaction-heavy API-based applications, such as RESTful APIs, and design JSON payloads
• Cooperate with QA, Application Support, and DevOps teams to triage bugs
• Coordinate with other Engineering Leads/Managers in the same or different functions
• Convert complex business rules and workflows from user story requirements into bespoke Salesforce-style CRM applications
• Work with large database schemas, including adding and updating tables and rows using MySQL
• Write unit tests and documentation

Qualification & Skills
• 3-5 years of experience with OOP, SQL and REST APIs.
• Proficient in PHP, Laravel, CodeIgniter, CakePHP, WordPress, etc.
• Knowledge of database design and querying using MySQL.
• Proficiency in HTML, Bootstrap, JavaScript and jQuery.
• Practical experience using the MVC architecture.
• Experience working directly with a project-based or service-based company, with proficiency in scalable architecture and application development.
• Experience with AWS services such as EC2, S3, RDS, etc.
• Proficient in version control using Git or SVN.
• Problem-solving skills and a critical mindset.
• Good communication skills.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

- 2+ years of data scientist experience
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python) or statistical/mathematical software (e.g., R, SAS, Matlab, etc.)
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment

Job Description
Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for capacity planning, transportation and the fulfillment network? If so, then this is the job for you. Our team is responsible for creating core analytics tech capabilities, platform development and data engineering. We develop scalable analytics applications and research modeling to optimize operation processes. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, scientists, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East.

Amazon is growing rapidly and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is in the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon's worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans. You will also improve the efficiency of capital investment by helping the fulfillment centers to improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to the fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools.

Major responsibilities include:
· Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; producing the required data when it is not available.
· Applying statistical and machine learning methods to specific business problems and data.
· Creating global standard metrics across regions and performing benchmark analysis.
· Ensuring data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
· Communicating proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions.
· Collaborating with colleagues from multidisciplinary science, engineering and business backgrounds.
· Developing efficient data querying and modeling infrastructure.
· Managing your own process: prioritizing and executing high-impact projects, triaging external requests, and ensuring projects are delivered on time.
· Utilizing code (Python, R, Scala, etc.) for analyzing data and building statistical models.

Preferred qualifications:
· Experience in Python, Perl, or another scripting language
· Experience in a ML or data scientist role with a large technology company

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
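As a rough illustration of applying statistical/ML methods to an operations question like those above, here is a minimal scikit-learn sketch that fits a logistic regression to synthetic shipment data; all features, labels and coefficients are fabricated for the example.

```python
# Minimal sketch: estimate the probability that a shipment misses its promised
# delivery date. The data here are synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 5_000
X = np.column_stack([
    rng.integers(1, 8, n),          # promised transit days
    rng.normal(50, 15, n),          # distance to customer (arbitrary units)
    rng.integers(0, 2, n),          # shipped from an automated FC?
])
# Synthetic ground truth: longer promises and distances raise the miss risk.
p_miss = 1 / (1 + np.exp(-(-3.0 + 0.4 * X[:, 0] + 0.02 * X[:, 1] - 0.5 * X[:, 2])))
y = rng.binomial(1, p_miss)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```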

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Description
IN Data Engineering & Analytics (IDEA) Team is looking to hire a rock star Data Engineer to build data pipelines and enable ML models for Amazon India businesses. IN Data Engineering & Analytics (IDEA) team is the central data engineering and analytics team for all A.in businesses. The team's charter includes 1) providing Unified Data and Analytics Infrastructure (UDAI) for all A.in teams, which includes a central petabyte-scale Redshift data warehouse, analytics infrastructure and frameworks for visualizing and automating generation of reports & insights, and self-service data applications for ingesting, storing, discovering, processing & querying of the data, and 2) providing business-specific data solutions for various business streams like Payments, Finance, Consumer & Delivery Experience.

The Data Engineer will play a key role in performing data extraction, data transformation, and building and managing data pipelines to ensure data availability for ML & LLM models of IN businesses. The role sits at the heart of the technology & business worlds and provides opportunity for growth, high business impact and working with seasoned business leaders. An ideal candidate will be someone with a sound technical background in working with SQL, scripting (Python, TypeScript, JavaScript), databases, ML/LLM models, and big data technologies such as Apache Spark (PySpark, Spark SQL). An ideal candidate will be someone who is a self-starter that can start with a requirement & work backwards to conceive and devise the best possible solution, a good communicator while driving customer interactions, a passionate learner of new technology when the need arises, a strong owner of every deliverable in the team, obsessed with customer delight, business impact and 'gets work done' in business time.

Key job responsibilities
• Build end-to-end data extraction, data transformation and data pipelines to ensure data availability for ML & LLM models that are critical to IN businesses.
• Enable ML/LLM tools by setting up all the required underlying data infrastructure, data pipelines and permissions to generate training and inference data for the ML models.
• Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, scripting and Amazon/AWS big data technologies.
• Possess strong verbal and written communication skills, be self-driven, and deliver high quality results in a fast-paced environment.
• Drive operational excellence strongly and build automation and mechanisms to reduce operations.
• Enjoy working closely with your peers in a group of very smart and talented engineers.

A day in the life
India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy & timely access to high quality data. We achieve this by providing UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities. Our core responsibilities towards the India marketplace include a) providing systems (infrastructure) & workflows that allow ingestion, storage, processing and querying of data, b) building ready-to-use datasets for easy and faster access to the data, c) automating standard business analysis / reporting / dash-boarding, and d) empowering business with self-service tools to manage data and generate insights.

Basic Qualifications
• 2+ years of data engineering experience
• Experience with SQL
• Experience with one or more scripting languages (e.g., Python, KornShell)
• Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)

Preferred Qualifications
• Experience with big data technologies such as Hadoop, Hive, Spark, EMR
• Knowledge of computer science fundamentals such as object-oriented design, operating systems, algorithms, data structures, and complexity analysis

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
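For a concrete flavour of the "ready-to-use datasets" work described above, here is a minimal Spark SQL sketch that aggregates raw order events into a daily summary table; the table names, columns and storage paths are hypothetical.

```python
# Minimal Spark SQL sketch: aggregate raw order events into a daily,
# query-ready summary dataset. Names and paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily-orders-summary").getOrCreate()

spark.read.parquet("s3://example-curated-bucket/orders/").createOrReplaceTempView("orders")

daily_summary = spark.sql("""
    SELECT order_date,
           marketplace,
           COUNT(*)                    AS order_count,
           SUM(amount)                 AS gross_amount,
           COUNT(DISTINCT customer_id) AS unique_customers
    FROM orders
    GROUP BY order_date, marketplace
""")

(daily_summary.write.mode("overwrite")
              .partitionBy("order_date")
              .parquet("s3://example-datasets-bucket/daily_orders_summary/"))
```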

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

Linkedin logo

Job Description

Key Responsibilities
• Application Development: Design and develop enterprise applications using the Joget platform, ensuring robust, scalable, and user-friendly solutions.
• Customization: Customize Joget forms, workflows, plugins, and UI components to meet business requirements.
• Process Automation: Analyze and implement business process automation workflows, enhancing operational efficiency and reducing manual effort.
• Integration: Integrate Joget applications with third-party systems, APIs, and enterprise tools to enable seamless data exchange.
• Performance Optimization: Optimize Joget applications for performance, scalability, and security.
• Collaboration: Work closely with business analysts, project managers, and other stakeholders to gather and refine requirements.
• Testing & Debugging: Conduct thorough testing, troubleshooting, and debugging to ensure application stability and quality.
• Documentation: Maintain comprehensive technical documentation for all development.

Experience
• 3-5 years of experience in Joget development.

Technical Skills
• Platform Expertise: Proficiency in the Joget Workflow platform for designing and developing forms, workflows, data lists, and user views. Experience in creating and managing custom Joget plugins. Expertise in workflow automation and process configuration. Knowledge of Joget's built-in components, templates, and modular design.
• Development: Strong knowledge of Java for back-end customizations and plugin development. Proficiency in JavaScript, HTML, and CSS for front-end customizations. Experience in SQL for database querying and management. Familiarity with XML and JSON for data handling.
• Integrations and APIs: Hands-on experience integrating Joget applications with third-party systems using REST and SOAP APIs. Knowledge of OAuth, JWT, and other authentication mechanisms for secure integrations. Experience in handling data exchange between Joget and external systems.
• Database Management: Proficiency in relational databases such as MySQL, PostgreSQL, or Oracle. Experience in writing and optimizing complex SQL queries. Knowledge of database performance tuning and optimization.
• Cloud and Infrastructure: Familiarity with cloud platforms like AWS, Azure, or Google Cloud for Joget deployment. Experience in Docker or other containerization tools for application hosting. Joget deployment on multiple operating systems and databases. Knowledge of CI/CD pipelines and deployment automation using tools like Jenkins or GitHub Actions.
• Troubleshooting and Performance Optimization: Strong skills in troubleshooting Joget applications to identify and resolve issues. Experience in performance optimization of Joget workflows and UI components. Familiarity with Joget's logging and monitoring tools for system analysis.

(ref:hirist.tech)

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

What You'll Do
• Data Analytics & Modeling: Apply strong data analytics and analytical skills to understand complex business requirements and translate them into effective data modeling solutions.
• Data Pipeline Development: Design, develop, and maintain robust ETL (Extract, Transform, Load) pipelines using Azure Data Engineering services to ingest, process, and transform large datasets.
• Data Warehousing: Build and optimize fact and dimension tables within analytical databases, contributing to scalable data warehousing solutions.
• Azure Data Engineering: Hands-on development using Azure Data Engineering tools such as Azure Data Factory (ADF), Databricks, and Fabric.
• Programming for Data: Utilize expertise in PySpark and Python to develop and maintain efficient data processing solutions, ensuring data integrity, performance, and scalability.
• BI Dashboarding: Develop and maintain compelling Business Intelligence (BI) dashboards using tools like Power BI and/or Tableau, turning raw data into actionable insights.
• Analytical Databases: Work with analytical databases such as Snowflake, Azure Synapse, and others to store and process large volumes of data.
• SQL & Programming: Demonstrate proficiency in SQL for data manipulation and querying, and possess skills in other relevant programming languages.
• Performance & Integrity: Ensure data integrity, quality, and optimal performance of data pipelines and BI solutions.
• Collaboration: Collaborate effectively with data scientists, business analysts, product managers, and other engineering teams to understand data needs and deliver comprehensive solutions.

Skills & Qualifications
• Experience: Minimum 2 years of core, hands-on experience in Azure Data Engineering and Business Intelligence (Power BI and/or Tableau).
• Data Fundamentals: Strong understanding of data analytics, analytical skills, data analysis, data management, and data modeling concepts.
• Azure Data Engineering: Mandatory hands-on experience with Azure Data Engineering services including ADF, Databricks, and Fabric.
• Programming for Data: Proficiency in PySpark and Python, with a proven ability to develop and maintain robust data processing solutions.
• SQL Expertise: Strong proficiency in SQL for complex data querying and manipulation.
• BI Tools: Practical experience building BI dashboards using Power BI and/or Tableau.
• Analytical Databases: Experience with analytical databases like Snowflake, Azure Synapse, etc.
• Problem Solving: Strong problem-solving and critical-thinking abilities to tackle complex data challenges.
• Education: Bachelor's degree in Computer Science, Information Systems, or a related field.

Qualifications (Nice-to-Have)
• Relevant Microsoft Azure certifications (e.g., Azure Data Engineer Associate, Azure Data Analyst Associate).
• Experience with real-time data streaming technologies (e.g., Kafka, Azure Event Hubs).
• Familiarity with Data Governance and Data Quality frameworks.
• Exposure to MLOps concepts and tools.

(ref:hirist.tech)
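To illustrate the fact/dimension modelling this role calls for, here is a minimal PySpark sketch that derives a date dimension and a sales fact table from cleaned order rows; the column names and Azure Data Lake paths are hypothetical.

```python
# Minimal PySpark sketch of simple star-schema modelling: a date dimension and
# a sales fact table built from cleaned order rows. Names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

orders = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")

dim_date = (
    orders.select(F.to_date("order_ts").alias("date_key"))
          .dropDuplicates()
          .withColumn("year", F.year("date_key"))
          .withColumn("month", F.month("date_key"))
)

fact_sales = (
    orders.withColumn("date_key", F.to_date("order_ts"))
          .groupBy("date_key", "product_id")
          .agg(F.sum("amount").alias("sales_amount"),
               F.count("*").alias("order_count"))
)

dim_date.write.mode("overwrite").parquet("abfss://warehouse@examplelake.dfs.core.windows.net/dim_date/")
fact_sales.write.mode("overwrite").parquet("abfss://warehouse@examplelake.dfs.core.windows.net/fact_sales/")
```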

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Role: Data Engineer
Location: Bengaluru, Karnataka, India.
Type: Contract / Freelance.

About The Role
We're looking for an experienced Data Engineer on contract (4-8 years) to join our data team. You'll be key in building and maintaining our data systems on AWS. You'll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. You'll be in charge of the whole process of our data pipelines, making sure the data is good, reliable, and fast.

What You'll Do
• Design and build efficient data pipelines using Spark / PySpark / Scala.
• Manage complex data processes with Airflow, creating and fixing any issues with the workflows (DAGs).
• Clean, transform, and prepare data for analysis.
• Use Python for data tasks, automation, and building tools.
• Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure.
• Collaborate closely with the Analytics team to understand what data they need and provide solutions.
• Help develop and maintain our Node.js backend, using TypeScript, for data services.
• Use YAML to manage the settings for our data tools.
• Set up and manage automated deployment processes (CI/CD) using GitHub Actions.
• Monitor and fix problems in our data pipelines to keep them running smoothly.
• Implement checks to ensure our data is accurate and consistent.
• Help design and build data warehouses and data lakes.
• Use SQL extensively to query and work with data in different systems.
• Work with streaming data using technologies like Kafka for real-time data processing.
• Stay updated on the latest data engineering technologies.
• Guide and mentor junior data engineers.
• Help create data management rules and procedures.

What You'll Need
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 4-8 years of experience as a Data Engineer.
• Strong skills in Spark and Scala for handling large amounts of data.
• Good experience with Airflow for managing data workflows and understanding DAGs.
• Solid understanding of how to transform and prepare data.
• Strong programming skills in Python for data tasks and automation.
• Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena).
• Experience building data solutions for Analytics teams.
• Familiarity with Node.js for backend development.
• Experience with TypeScript for backend development is a plus.
• Experience using YAML for configuration management.
• Hands-on experience with GitHub Actions for automated deployment (CI/CD).
• Good understanding of data warehousing concepts.
• Strong database skills (OLAP/OLTP).
• Excellent command of SQL for data querying and manipulation.
• Experience with stream processing using Kafka or similar technologies.
• Excellent problem-solving, analytical, and communication skills.
• Ability to work well independently and as part of a team.

(ref:hirist.tech)
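As a sketch of the Airflow orchestration mentioned above, here is a minimal daily DAG with placeholder extract, transform and validate tasks. The DAG and task names are hypothetical, and it is written for Airflow 2.4+, where `schedule` replaces the older `schedule_interval` argument.

```python
# Minimal Airflow sketch: a daily DAG that extracts, transforms and validates
# data. Task bodies are placeholders; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw files from S3")          # e.g. a boto3 download or Athena unload

def transform(**_):
    print("run Spark job / clean and reshape the data")

def validate(**_):
    print("row counts, null checks, schema checks")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    t_extract >> t_transform >> t_validate   # linear dependency chain
```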

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Key Responsibilities
• End-to-End Module Development: Take ownership of assigned modules, leading their development from conception to completion.
• Requirements Understanding & Implementation: Actively connect with functional counterparts to thoroughly understand requirements and translate them into robust and efficient technical implementations.
• Back-end Development: Design, develop, and maintain high-quality back-end components using C#, ASP.NET, Web API, and other related .NET technologies.
• Telerik Integration: Leverage expertise in Telerik Controls for developing rich and interactive user interfaces and ensuring seamless integration with back-end functionalities.
• Database Interaction: Work with SQL Server for database design, querying, optimization, and data manipulation, utilizing Entity Framework or NHibernate for ORM.
• Front-end Interaction: Collaborate on front-end development, demonstrating proficiency in HTML, CSS, jQuery, and JavaScript, especially in the context of Telerik controls and potential Blazor implementations.
• Cross-Platform Application Development: Contribute to the development and maintenance of processes for cross-platform applications.
• Code Quality & Best Practices: Write clean, maintainable, and well-documented code, adhering to established coding standards and architectural best practices.
• Troubleshooting: Identify, diagnose, and resolve technical issues and bugs within the application.

Skills Required
• C#: Strong proficiency in C# programming.
• ASP.NET: Solid experience with ASP.NET development.
• Blazor: Hands-on experience with Blazor is highly desirable.
• HTML, CSS, jQuery, JavaScript: Proficient in fundamental web technologies.
• Telerik Controls: Mandatory hands-on experience with Telerik controls.
• Web API: Strong experience in building and consuming Web APIs and RESTful services.
• ORM: Experience with Entity Framework or NHibernate.
• SQL Server: Strong skills in SQL Server, including writing queries and database design.

(ref:hirist.tech)

Posted 2 weeks ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques
  • Data warehousing concepts

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table (a worked example follows this list). (advanced)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
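
For the second-highest-salary question above, here is one worked example, run against an in-memory SQLite table so it can be executed anywhere; the employees table and its values are illustrative.

```python
# Worked example for the second-highest-salary interview question, using an
# in-memory SQLite table. The employees table and values are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Asha", 90000), ("Ravi", 120000), ("Meera", 110000), ("Karan", 120000)],
)

# One common approach: take the maximum salary strictly below the overall
# maximum, so ties at the top (two people on 120000) are skipped correctly.
query = """
    SELECT MAX(salary) AS second_highest
    FROM employees
    WHERE salary < (SELECT MAX(salary) FROM employees)
"""
print(conn.execute(query).fetchone()[0])  # prints 110000
```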

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies