1697 Querying Jobs - Page 15

JobPe aggregates these listings for convenient browsing; applications are submitted directly on the original job portal.

2.0 years

7 - 9 Lacs

Hyderābād

On-site


- 2+ years of software development, or 2+ years of technical support experience
- Experience scripting in modern programming languages
- Experience troubleshooting and debugging technical systems

Amazon WWR&R is comprised of business, product, operational, program, software engineering and data teams that manage the life of a returned or damaged product from the customer to the warehouse and on to its next best use. Our work is broad and deep: we train machine learning models to automate routing and find signals to optimize re-use; we invent new channels to give products a second life; we develop world-class product support to help customers love what they buy; we pilot smarter product evaluations; we work from the customer backward to find ways to make the return experience remarkably delightful and easy; and we do it all while scrutinizing our business with laser focus.

The WWR&R data engineering team at the Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of the Reverse Logistics data lake platform. As a member of this team, your mission will be to support a massively scalable, distributed data warehousing, querying, reporting and decision-support system. We support a fast-paced environment where each day brings new challenges and opportunities.

As a Support Engineer, you will play a pivotal role in ensuring the stability, compliance, and operational excellence of our enterprise Data Warehouse (DW) environment. In this role, you will be responsible for monitoring and maintaining production data pipelines, proactively identifying and resolving issues that impact data quality, availability, or timeliness. You'll collaborate closely with data engineers and cross-functional teams to troubleshoot incidents, implement scalable solutions, and enhance the overall resilience of our data infrastructure.

A key aspect of this role involves supporting our data compliance and governance initiatives, ensuring systems align with internal policies and external regulatory standards such as GDPR. You will help enforce access controls, manage data retention policies, and support audit readiness through strong logging and monitoring practices. You'll also lead efforts to automate manual support processes, improving team efficiency and reducing operational risk. Additionally, you will be responsible for maintaining clear, up-to-date documentation and runbooks for operational procedures and issue resolution, promoting consistency and knowledge sharing across the team. We're looking for a self-motivated, quick-learning team player with a strong sense of ownership and a 'can-do' attitude, someone who thrives in a dynamic, high-impact environment and is eager to make meaningful contributions to our data operations.

- Knowledge of web services, distributed systems, and web application development
- Experience troubleshooting and maintaining hardware and software RAID
- Experience with REST web services, XML, JSON

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago


0 years

3 - 8 Lacs

Hyderābād

On-site


Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Pune, Maharashtra, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or equivalent practical experience.
- Experience in architecting, developing, or maintaining secure cloud solutions.
- Experience with designing cloud enterprise solutions and supporting projects to completion.
- Experience with coding in one or more general purpose languages (e.g., Python, Java, Go, C or C++), including data structures, algorithms, and software design.

Preferred qualifications:
- Experience in software development, managing Operating System (OS) or Linux environments, network design and deployment, or storage systems.
- Experience with data migration and integration tools, with knowledge of data visualization tools and techniques.
- Experience in querying and managing relational and non-relational databases, with data modeling and performance optimization.
- Experience with customer-facing migration, including service discovery, assessment, planning, execution, and operations.
- Knowledge of data warehousing concepts, including dimensional modeling, Extract, Transform, and Load (ETL) processes, and data governance.

About the job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life.

As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Work with customer technical leads, client executives, and partners to manage and deliver implementations of cloud solutions and become a trusted advisor to decision makers throughout the engagement.
- Propose data solution architectures and manage the deployment of cloud data solutions according to customer requirements, implementing best practices.
- Work with internal specialists, Product, and Engineering teams to package approaches, best practices, and lessons learned into thought leadership, methodologies, and published assets.
- Interact with Business, Partners, and customer technical stakeholders to manage project scope, priorities, deliverables, risks and issues, and timelines for successful client outcomes.
- Travel 30% of the time for client engagements.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law.

If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago


7.0 years

6 - 9 Lacs

Hyderābād

On-site


Hyderabad, Telangana
Job ID: 30171359
Job Category: General Management
Country: India
Location: Building No 12D, Floor 5, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India
Role: Business Analyst
Full/Part-time: Full-time

Build a career with confidence
Carrier is a leading provider of heating, ventilating, air conditioning and refrigeration systems; building controls and automation; and fire and security systems, leading to safer, smarter, sustainable, and high-performance buildings. Carrier is on a mission to make modern life possible by delivering groundbreaking systems and services that help homes, buildings and shipping become safer, smarter, and more sustainable. Our teams exceed the expectations of our customers by anticipating industry trends and working tirelessly to master and revolutionize them.

About the role
An experienced General Finance Management professional who implements financial plans, analyzes financial processes and standards, and establishes financial indicators to forecast performance measures. Develops relationships with external financial consultants and advisors and provides technical advice to functional managers on financial matters.

Key Responsibilities:
If you thrive in a fast-paced environment and are looking for an opportunity to develop your Analytics career in Shared Services, then we have a great opportunity for you. We are seeking a motivated Business Analyst to support the Global Business Services organization. Specific responsibilities for this position include:
- Manage end-to-end deployment of reporting structures, including data collection, transformation, visualization, and distribution, ensuring alignment with business needs.
- Manage implementations of business intelligence dashboards using BI tools, ensuring that data is presented in a meaningful and visually appealing manner.
- Collaborate with Global Process Owners from the Finance team to gather requirements, design KPI visualizations, and ensure data accuracy and quality.
- Deploy integrated reporting solutions, through MS tools such as Power Query and Power Automate workflows, to streamline data collection, processing, and dissemination.
- Collaborate with IT teams to establish new database connections, optimize SQL queries, and ensure smooth data integration from various sources.
- Conduct thorough data analysis, including forecasts and projections, to identify trends, anomalies, and areas for process improvement.
- Provide recommendations to team leaders based on data insights, enabling informed decision-making and driving operational efficiencies.
- Support Continuous Improvement initiatives, including Kaizen events, by setting up performance measurement structures and tracking progress.
- Stay updated with emerging trends in business intelligence, data visualization, and project management to continually enhance reporting and analytical capabilities.

EDUCATION / CERTIFICATIONS:
- Bachelor's degree in finance or accounting required

Requirements
- 7+ years of experience in Finance processes, preferably in a Shared Service environment
- Proven experience in an analytical position, proficiently using finance concepts to deliver business findings to stakeholders
- Proven track record of successfully managing projects related to KPI definition, measurement, and deployment
- Experience in designing and developing BI dashboards using tools like Power BI, Tableau, or similar platforms
- Strong background in data integration, database management, and SQL querying for efficient data retrieval and analysis
- Proficiency in process improvement methodologies, such as Lean or Six Sigma, and the ability to drive continuous improvement initiatives
- Proven analytical and quantitative skills, with the ability to use data and metrics to identify trends

Benefits
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now!

#cbsfinance

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Posted 1 week ago


0 years

0 Lacs

Telangana

On-site


1) Bachelor's degree
2) 12-24 months of work experience
3) Good communication skills - the Trans Ops Representative will be facilitating the flow of information between external stakeholders
4) Proficiency in Excel (pivot tables, vlookups)
5) Demonstrated ability to work in a team in a very dynamic environment

Job Description for Transportation Representative - NOC

NOC Overview
NOC (Network Operation Center) is the central command and control center for 'Transportation Execution' across the Amazon Supply Chain network, supporting multiple geographies such as NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FCs) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving, NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon's ability to serve its customers on time.

Purview of a Trans Ops Representative:
A Trans Ops Representative at NOC facilitates the flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at NOC works across two verticals: Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that freight is picked up on time and delivered at the FC as per the given appointment; the Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle from pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as per promise; the Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises.

A Trans Ops Representative provides timely resolution to the issue at hand by researching and querying internal tools and by making real-time decisions. An ideal candidate should be able to understand the requirements, analyze data, notice trends, and drive customer experience without compromising on time. The candidate should have a basic understanding of logistics and should be able to communicate clearly in written and oral form.

Key job responsibilities
A Trans Ops Representative should be able to ideate process improvements and should have the zeal to drive them to conclusion. Responsibilities include, but are not limited to:
- Communication with external customers (carriers, vendors/suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers)
- Systematically escalating problems or variances in information and data to the relevant owners and teams, and following through on resolutions to ensure they are delivered
- Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and variances to goals, and present these findings in a review forum
- Providing real-time customer experience by working in a 24x7 operating environment

- Graduate with a Bachelor's degree
- Good logical skills
- Good communication skills - the Trans Ops Representative will be facilitating the flow of information between different teams

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago


4.0 - 8.0 years

0 - 0 Lacs

India

On-site


Job Title: Supply Chain Applications Specialist
Job Type: Full-Time
Experience Level: Mid-Senior Level

Job Summary:
We are seeking an experienced Supply Chain Applications Analyst to support and enhance our ERP ecosystem, including Infor M3 and legacy platforms such as VAX/VMS and AS400. The ideal candidate should have a solid understanding of supply chain processes, ERP workflows, and integration across systems, with hands-on experience supporting and optimizing business applications in a manufacturing or distribution environment.

Key Responsibilities:
- Provide application support for the Infor M3 ERP, including configuration, troubleshooting, and functional enhancements.
- Maintain and support legacy systems based on the VAX/VMS and AS400 platforms.
- Collaborate with business users to understand supply chain requirements and translate them into technical solutions.
- Monitor system performance and resolve issues to ensure continuous and efficient operations.
- Manage data migration and system integration efforts between Infor M3 and legacy systems.
- Document system configurations, support procedures, and user manuals.
- Participate in system upgrades, patch management, and testing efforts.
- Provide training and support to end-users across supply chain functions (procurement, inventory, logistics, etc.).
- Work closely with IT, vendors, and cross-functional teams for project execution and support.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Supply Chain, or a related field.
- 4-8 years of experience working with ERP systems (especially Infor M3) in a supply chain or manufacturing context.
- Strong knowledge of the VAX/VMS and AS400 (iSeries) platforms.
- Experience in supporting supply chain modules such as procurement, inventory, warehouse, logistics, and order management.
- Strong analytical and problem-solving skills.
- Familiarity with scripting, report generation, and database querying tools (SQL, Query/400, etc.).
- Excellent communication and interpersonal skills.

Preferred Skills:
- Experience with system migrations from legacy to modern ERP platforms.
- Exposure to EDI, API integrations, or middleware tools.
- Knowledge of manufacturing workflows, BOM, production planning, and MRP processes.
- Certifications in Infor M3 or related systems (a plus).

Job Type: Full-time
Pay: ₹35,000.00 - ₹60,000.00 per month
Benefits:
- Health insurance
- Paid time off
- Provident Fund
Schedule: Day shift
Work Location: In person

Posted 1 week ago


0 years

0 Lacs

Chandigarh, India

On-site


Company Profile
Oceaneering is a global provider of engineered services and products, primarily to the offshore energy industry. We develop products and services for use throughout the lifecycle of an offshore oilfield, from drilling to decommissioning. We operate the world's premier fleet of work class ROVs. Additionally, we are a leader in offshore oilfield maintenance services, umbilicals, subsea hardware, and tooling. We also use applied technology expertise to serve the defense, entertainment, material handling, aerospace, science, and renewable energy industries.

Since 2003, Oceaneering's India Center has been an integral part of operations for Oceaneering's robust product and service offerings across the globe. This center caters to diverse business needs, from oil and gas field infrastructure and subsea robotics to automated material handling and logistics. Our multidisciplinary team offers a wide spectrum of solutions, encompassing Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV operations, Field Network Management, Graphics Design & Animation, and more. In addition to these technical functions, Oceaneering India Center plays host to several crucial business functions, including Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE). Our world-class infrastructure in India includes modern offices, industry-leading tools and software, equipped labs, and beautiful campuses aligned with the future way of work. Oceaneering in India, as well as globally, has a great work culture that is flexible, transparent, and collaborative, with great team synergy. At Oceaneering India Center, we take pride in "Solving the Unsolvable" by leveraging the diverse expertise within our team. Join us in shaping the future of technology and engineering solutions on a global scale.

Position Summary
The Principal Data Scientist will develop Machine Learning and/or Deep Learning based integrated solutions that address customer needs such as topside and subsea inspection. They will also be responsible for developing machine learning algorithms for automation and data analytics programs for Oceaneering's next generation systems. The position requires the Principal Data Scientist to work with various Oceaneering business units across global time zones, but also offers the flexibility of a hybrid work-office environment.

Essential Duties And Responsibilities
- Lead and supervise a team of moderately experienced engineers on product/prototype design and development assignments or applications.
- Work both independently and collaboratively to develop custom data models and algorithms to apply to data sets that will be deployed in existing and new products.
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Build data models and organize structured and unstructured data to interpret solutions.
- Prepare data for predictive and prescriptive modeling.
- Architect solutions through selection of appropriate technologies and components.
- Determine the technical direction and strategy for solving complex, significant, or major issues.
- Plan and evaluate architectural designs and identify technical risks and associated ways to mitigate those risks.
- Prepare design proposals to reflect cost, schedule, and technical approaches.
- Recommend test controls, strategies, apparatus, and equipment.
- Develop, construct, test, and maintain architectures.
- Lead research activities for ongoing government and commercial projects and products.
- Collaborate on proposals, grants, and publications in algorithm development.
- Collect data as warranted to support the algorithm development efforts.
- Work directly with software engineers to implement algorithms in commercial software products.
- Work with third parties to utilize off-the-shelf industrial solutions.
- Develop algorithms in key research areas based on the client's technical problem. This requires constant paper reading and staying ahead of the game by knowing what is, and will be, state of the art in this field.
- Work hands-on in cross-functional teams with a strong sense of self-direction.

Non-essential
- Develop an awareness of programming and design alternatives
- Cultivate and disseminate knowledge of application development best practices
- Gather statistics and prepare and write reports on the status of the programming process for discussion with management and/or team members
- Direct research on emerging application development software products, languages, and standards in support of procurement and development efforts
- Train, manage and provide guidance to junior staff
- Perform all other duties as requested, directed or assigned

Supervisory Responsibilities
This position does not have direct supervisory responsibilities.

Reporting Relationship
Engagement Head

Qualifications
REQUIRED
- Bachelor's degree in Electronics and Electrical Engineering (or a related field) with eight or more years of experience working on Machine Learning and Deep Learning based projects, OR
- Master's degree in Data Science (or a related field) with six or more years of experience working on Machine Learning and Deep Learning based projects

DESIRED
- Strong knowledge of advanced statistical functions: histograms and distributions, regression studies, scenario analysis, etc.
- Proficiency in Object Oriented Analysis, Design and Programming
- Strong background in data engineering tools such as Python/C#, R, Apache Spark, Scala, etc.
- Prior experience handling large amounts of data, including text, shapes, sound, images and/or video
- Knowledge of SaaS platforms like Microsoft Fabric, Databricks, Snowflake, H2O, etc.
- Experience working on cloud platforms like Azure (ML), AWS (SageMaker), or GCP (Vertex), etc.
- Proficiency in querying SQL and NoSQL databases
- Hands-on experience with various databases such as MySQL/PostgreSQL/Oracle, MongoDB, InfluxDB, TimescaleDB, neo4j, Arango, Redis, Cassandra, etc.
- Prior experience with at least one probabilistic/statistical ambiguity resolution algorithm
- Proficiency in Windows and Linux operating systems
- Basic understanding of ML frameworks like PyTorch and TensorFlow
- Basic understanding of messaging and IoT protocols such as Kafka, MQTT or RabbitMQ
- Prior experience with big data platforms like Hadoop, Apache Spark, or Hive is a plus

Knowledge, Skills, Abilities, And Other Characteristics
- Ability to analyze situations accurately, utilizing a variety of analytical techniques in order to make well-informed decisions
- Ability to effectively prioritize and execute tasks in a high-pressure environment
- Skill in gathering, analyzing and interpreting data
- Ability to determine and meet customer needs
- Ensures that others involved in a project or effort are kept informed about developments and plans
- Knowledge of communication styles and techniques
- Ability to establish and maintain cooperative working relationships
- Skill in prioritizing workflow in a changing work environment
- Knowledge of applicable data privacy practices and laws
- Strong analytical and problem-solving skills

Additional Information
This position is considered OFFICE WORK, which is characterized as follows:
- Almost exclusively indoors during the day and occasionally at night
- Occasional exposure to airborne dust in the workplace
- Work surface is stable (flat)

The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
This position is considered LIGHT work:
- Occasional: lift up to 20 pounds; climbing, stooping, kneeling, squatting, and reaching
- Frequent: lift up to 10 pounds; standing
- Constant: repetitive movements of arms and hands; sitting with back supported

Closing Statement
In addition, we make a priority of providing learning and development opportunities to enable employees to achieve their potential and take charge of their future. As well as developing employees in a specific role, we are committed to lifelong learning and ongoing education, including developing people skills and identifying future supervisors and managers. Every month, hundreds of employees are provided training, including HSE awareness, apprenticeships, entry and advanced level technical courses, management development seminars, and leadership and supervisory training. We have a strong ethos of internal promotion. We can offer long-term employment and career advancement across countries and continents. Working at Oceaneering means that if you have the ability, drive, and ambition to take charge of your future, you will be supported to do so, and the possibilities are endless.

Equal Opportunity/Inclusion
Oceaneering's policy is to provide equal employment opportunity to all applicants.

Posted 1 week ago


0 years

3 - 8 Lacs

Gurgaon

On-site


Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Senior Software Engineer What is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Our Team: The AI & DPE team is responsible for product decisioning management and innovative product development under services business unit to address the evolving risk and security needs of all of Mastercard’s various customer segments. AI & DPE team focuses on defining the strategic direction for underlying platforms to enable the successful implementation of real-time, data-driven innovative products and services focused on network, security, fraud, digital identity and authentication. The team is responsible to look across all the products/services to drive efficiency, re-usability and increase speed to market for our products and services. 
The candidate for this position will focus on driving actionable insights and build innovative solutions out of multiple data sources using analytics, machine learning and reporting capabilities. The Role: We are seeking a Senior Software Engineer who will: Partner with various teams (i.e., Product Manager, Data Science, Platform Strategy, Technology) on requirement gathering in order to deliver analytics solutions that generate business value Perform data preparation by ingestion, aggregation, processing to drive and enable relevant insights from available data sets Identify and code the best suited data algorithm model for the relevant insights Manipulate and analyze complex, high-volume, high-dimensionality data from varying sources using a variety of tools and data analysis techniques. Apply knowledge of metrics, measurements, and benchmarking to complex and demanding solutions Collect and synthesize feedback from clients, development, product and sales teams for new solutions or product enhancements All about You Strong SQL knowledge for data preparation and mining Strong knowledge of writing data/machine learning algorithm in Python or R Experience in doing data analysis and extraction on Hadoop Experience of working on at least one reporting tool – Tableau and PowerBI is a plus Experience in data modeling, programming, querying, data mining and report development using large volumes of granular data to deliver business intelligence and custom reporting solutions Exposure to collecting and/or working with data including standardizing, summarizing, offering initial observations and highlighting inconsistencies Strong understanding of the application of analytical methods and data visualization to support business decisions Able to work in a fast-paced, deadline-driven environment as part of a team and as an individual contributor Ability to easily move between business, analytical, and technical teams and articulate solution requirements for each group 
Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 1 week ago


0 years

3 - 8 Lacs

Ludhiana

On-site


Technocrats Horizons is seeking a talented and experienced Business Analyst to join our team. As a pivotal member of our organization, the Business Analyst will play a crucial role in analyzing business processes, identifying opportunities for improvement, and facilitating the implementation of strategic initiatives. If you’re passionate about driving operational excellence, possess strong analytical skills, and thrive in a collaborative environment, we want to hear from you! Responsibilities: Client & Requirement Management: Conduct sessions with clients and sales teams to gather requirements, define user stories, and propose tailored solutions aligned with business goals. Project & Process Oversight: Manage project scope, timelines, and sprints; coordinate stand-ups and retrospectives, and ensure quality deliverables. System & Workflow Documentation: Help design, document, and maintain system processes while facilitating smooth implementation of changes. Team Coordination & Conflict Resolution: Guide development teams, support effective internal communication, resolve conflicts, and remove project obstacles. Client Relationship & Communication: Build and maintain strong client relationships, serving as the key contact to understand goals and challenges and to deliver consistent value. Account & Performance Management: Oversee client accounts; ensure satisfaction and retention through regular reviews and collaboration with internal teams. Growth & Business Development: Identify upsell/cross-sell opportunities, drive new business through referrals, and support backlog grooming with product owners. Behaviour and Character Attributes Required: Thinks and acts like an end-user and understands the business of customers. Provides solutions through analytical and creative thinking. Attentive and active listener. Effective relationship builder with teammates and clients. Systematic and structured approach; neat and clean documentation. Anticipates client perceptions and expectations.
Professional and positive attitude. Navigates client concerns and internal challenges with a solution-first approach. Proactively researches worldwide technology trends. Must-Have Skills: Proven Experience & Domain Knowledge: Demonstrated experience as a Business Analyst or in a similar role, with a solid understanding of software products, features, and services. Documentation Expertise: Skilled in preparing BRDs, FRDs, user stories, use cases, and process flow diagrams with adherence to documentation standards and best practices. Communication Skills: Excellent verbal and written communication abilities, including technical writing and the preparation of detailed meeting notes and reports. Agile & Scrum Familiarity: Well-versed in Agile and Scrum methodologies with practical experience in sprint-based project environments. Tool Proficiency: Proficient in using project management and collaboration tools to track progress and ensure effective team coordination. Organizational Strength: Strong organizational, time management, and multitasking skills to manage competing priorities efficiently. Collaborative & Independent Work Style: Capable of working autonomously as well as collaboratively within cross-functional teams. Good to Have: Certification in Business Analysis (e.g., CBAP, CCBA). Familiarity with industry-specific software and tools. Experience in analyzing data to draw business-relevant conclusions and in data visualization techniques and tools. Experience in a client-facing role. Knowledge of SQL or other data querying languages. Education Qualification Required: Graduate: B.Tech/B.E. in Computers or BCA in any specialization. PG: MBA, MCA in Computers, or MS/M.Sc in any specialization.

Posted 1 week ago


4.0 - 8.0 years

0 - 0 Lacs

Ahmedabad

On-site


Post Description: 4-8 years of experience developing backend web panels (admin panels) and RESTful services (APIs) for mobile apps from scratch using an MVC framework such as Laravel (Lumen) or CodeIgniter. Experience integrating the Google Maps Geofence API for multiple cities on web and mobile. Must have prior work experience with a cloud hosting platform such as Amazon Web Services (AWS) or Google Cloud. Experience with LEMP (Linux, Nginx, MySQL, PHP), JavaScript and AJAX. Experience with e-commerce and complex location-based APIs for nearby search results and request send/receive flows, as in the Uber or Ola cab mobile apps. Experience in the design and integration of each module of the backend panel. Must have knowledge of third-party API integration such as Account Kit by Facebook, Algolia, Paytm, Google Places, Zomato, etc. Develop a deep understanding of integration and dependencies with other systems and platforms within the architecture landscape. Strong knowledge of MySQL databases, including their construction and querying, stored procedures, triggers, indexing and performance management. Must be able to write clean code. Good working knowledge of JavaScript and MVC frameworks. Solid Linux background (Ubuntu). Previous use of a version control system (Git or SVN). Strong knowledge of FCM (GCM), APNS, SMS and mail API integration. Must be able to troubleshoot, test, maintain and design the core product software and databases to ensure strong optimization and functionality for faster performance. Must have experience making changes to live projects: UI/UX, new feature implementation, and performance optimization. Knowledge of database architecture and the various forms of normalization. Should be able to work both as a single developer and as part of a team. Past experience developing the cloud backend of a taxi, cab or e-commerce mobile application improves your chances of joining.
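The "nearby search" behaviour described above (as in Uber- or Ola-style apps) boils down to great-circle distance filtering around an origin point. The sketch below is a generic, framework-agnostic illustration, not the Google Maps Geofence API itself; the place names and coordinates are hypothetical:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearby(origin, places, radius_km):
    """Return names of places within radius_km of origin, nearest first."""
    scored = [
        (haversine_km(origin[0], origin[1], p["lat"], p["lon"]), p["name"])
        for p in places
    ]
    return [name for dist, name in sorted(scored) if dist <= radius_km]
```

A production backend would normally push this into the database (e.g. a MySQL spatial index or a geo-search service such as Algolia) rather than scanning in application code; the snippet just shows the core geofence logic.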
Send your updated CV with the below details to hr@youngbrainzinfotech or call 9978707079: Current Designation, Current Company, Current Location, Current CTC, Expected CTC, Notice Period. Job Type: Full-time Pay: ₹30,000.00 - ₹60,000.00 per month Benefits: Flexible schedule Schedule: Day shift Supplemental Pay: Yearly bonus Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Preferred) Experience: total work: 4 years (Preferred) Work Location: In person

Posted 1 week ago


0 years

0 Lacs

Ahmedabad

On-site


Key Responsibilities: Optimize scalable data pipelines for ingestion, transformation, and loading of large datasets from diverse sources. Utilize Azure Data Factory for orchestration and automation of ETL/ELT processes, ensuring efficient data flow. Work extensively with Azure services such as Azure Blob Storage and Azure Data Lake Storage (ADLS Gen2) for efficient data storage and management. Integrate and manage data flows using Azure Logic Apps for event-driven automation and workflow orchestration. Develop and maintain robust SQL queries, stored procedures, and data models to support reporting and analytical requirements. Implement and enforce data governance, quality, and security best practices. Troubleshoot, debug, and optimize existing data pipelines and data infrastructure. Key Skills: Proven experience as a Senior Data Engineer, with a strong emphasis on designing and implementing solutions, production support, maintenance projects, and data engineering applications on the Azure and Snowflake cloud platforms. Mandatory: Expert-level proficiency with Snowflake for data warehousing and data platform development. Mandatory: In-depth knowledge and practical experience with the Azure data ecosystem, including: Azure Data Factory (ADF) for pipeline orchestration; Azure Blob Storage and Azure Data Lake Storage (ADLS Gen2) for data storage; Azure Logic Apps for workflow automation. Exceptional command of SQL for complex querying, data manipulation, and performance tuning. Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a collaborative team in a fast-paced environment. Nice to Have (Added Advantage): Experience with dbt (data build tool) for data transformation and modeling. Knowledge of other programming languages such as Python. Familiarity with DevOps practices for data engineering (CI/CD).
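The data-quality enforcement mentioned above can be illustrated with a minimal post-load check. This is a hedged sketch only: the row layout and check names are invented, and in the stack described here such checks would typically run as SQL inside Snowflake or as an ADF validation activity rather than in plain Python:

```python
def run_quality_checks(rows, required_columns, min_rows=1):
    """Basic post-load checks: minimum row count plus a null/blank scan
    over required columns. Returns a list of human-readable issues."""
    issues = []
    if len(rows) < min_rows:
        issues.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for i, row in enumerate(rows):
        for col in required_columns:
            if row.get(col) in (None, ""):
                issues.append(f"row {i}: missing value for '{col}'")
    return issues
```

An empty issue list means the batch passes; a non-empty list would typically fail the pipeline run and trigger an alert.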

Posted 1 week ago


3.0 years

0 Lacs

Surat, Gujarat, India

On-site


Senior Symfony / Laravel PHP Developer We are seeking a Senior Symfony / Laravel PHP Developer to join our dynamic development team. As a key contributor, you will be responsible for designing, developing, and maintaining high-quality web applications using the Symfony / Laravel framework. You will play a vital role in scaling our backend systems and mentoring junior developers, ensuring our solutions meet modern coding standards and deliver exceptional performance. Contributions: - A Senior Symfony / Laravel PHP Developer's contributions span various crucial aspects of software development and web application deployment. Here are the key contributions they make: · Technical Leadership · Performance Optimization · Documentation and Knowledge Sharing · Security Awareness · Project Delivery Expectations: - · Database Management: Expertise in managing and querying databases (especially MySQL/PostgreSQL) and integrating them efficiently using Doctrine ORM within Symfony. · Frameworks and Technologies: In-depth understanding of the Symfony or Laravel PHP framework (must), with experience in using its components, bundles, and best practices. · Object-Oriented Programming (OOP): Strong grasp of OOP principles and design patterns as applied within Symfony-based applications to build modular, testable, and scalable code. · Integration and APIs: Skilled at consuming and creating RESTful and SOAP APIs. Able to design and implement APIs and ensure integration with third-party systems and services. · Troubleshooting and Debugging: Proficient in identifying performance bottlenecks, analysing logs, performing root cause analysis, and resolving complex issues. · Architecture and Design: Capable of designing software architectures that align with business needs. Understands MVC, event-driven programming, and reusable code architecture. 
· Version Control & Collaboration Tools: Hands-on experience with Git, including workflows (feature branches, merge requests), and using GitHub/GitLab for collaborative development. · Documentation and Code Quality: Adheres to clean code practices, ensures thorough inline documentation, and contributes to project-wide technical documentation for maintainability and onboarding. Capabilities: - · Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. · Problem-Solving Ability: Strong problem-solving skills to troubleshoot issues, debug code, and devise effective solutions. · Communication and Teamwork: Excellent communication skills with the ability to convey complex ideas clearly to both technical and non-technical stakeholders. · Certifications (Optional): Symfony or PHP certifications are a plus, reflecting dedication to continuous learning and expertise in the technology stack. · Proven Experience: Demonstrable 3+ years of experience as a PHP Developer (with 2+ years in the Symfony / Laravel framework), usually supported by a strong portfolio showcasing relevant projects and accomplishments. Benefits of joining Atologist Infotech 👉 Paid Leaves 👉 Leave Encashment 👉 Friendly Leave Policy 👉 5 Days Working 👉 Festivals Celebrations 👉 Friendly Environment 👉 Lucrative Salary Packages 👉 Paid Sick Off 👉 Diwali Vacation 👉 Annual Big Tour 👉 Festive Off If the above requirements suit your interest, please call us on +91 9909166110 or send your resume to hr@atologistinfotech.com

Posted 1 week ago


10.0 - 12.0 years

7 - 9 Lacs

Indore

On-site


Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India; Bangalore, Karnataka, India
Qualification:
Primary Tool & Expertise: Power BI and semantic modelling
Key Project Focus: Leading the migration of legacy reporting systems (Cognos or MicroStrategy) to Power BI solutions.
Core Responsibilities: Build and develop optimized semantic models, metrics, and complex reports and dashboards. Work closely with business analysts and BI teams to help business teams drive improvement in key business metrics and customer experience. Responsible for timely, quality, and successful deliveries. Share knowledge and experience within the team and with other groups in the organization. Lead teams of BI engineers: translate designed solutions to them, review their work, and provide guidance. Manage client communications and deliverables.
Skills Required: Power BI, Semantic Modelling, DAX, Power Query, Power BI Service, Data Warehousing, Data Modeling, Data Visualization, SQL
Role:
Core BI Skills: Power BI (Semantic Modelling, DAX, Power Query, Power BI Service), Data Warehousing, Data Modeling, Data Visualization, SQL
Data Warehouse, Database & Querying: Strong skills in databases (Oracle / MySQL / DB2 / Postgres) and expertise in writing SQL queries. Experience with cloud-based data intelligence platforms such as Databricks or Snowflake. Strong understanding of data warehousing and data modelling concepts and principles. Strong skills and experience in creating semantic models in Power BI or similar tools.
Additional BI & Data Skills (Good to Have): Certifications in Power BI and any data platform. Experience with other tools such as MicroStrategy and Cognos. Proven experience migrating existing BI solutions to Power BI or other modern BI platforms.
Experience with the broader Power Platform (Power Automate, Power Apps) to create integrated solutions. Knowledge and experience of Power BI admin features such as versioning, usage reports, capacity planning, creation of deployment pipelines, etc. Sound knowledge of various forms of data analysis and presentation methodologies. Experience in formal project management methodologies. Exposure to multiple BI tools is desirable. Experience with Generative BI implementation. Working knowledge of scripting languages like Perl, Shell, Python is desirable. Exposure to one of the cloud providers – AWS / Azure / GCP. Soft Skills & Business Acumen: Exposure to multiple business domains (e.g., Insurance, Reinsurance, Retail, BFSI, healthcare, telecom) is desirable. Exposure to the complete SDLC. Out-of-the-box thinker, not limited to the work done in current projects. Capable of working as an individual contributor and within a team. Good communication, problem-solving, and interpersonal skills. Self-starter and resourceful, skilled in identifying and mitigating risks. Experience: 10 to 12 years Job Reference Number: 13099

Posted 1 week ago


2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description Amazon WWR&R is comprised of business, product, operational, program, software engineering and data teams that manage the life of a returned or damaged product from a customer to the warehouse and on to its next best use. Our work is broad and deep: we train machine learning models to automate routing and find signals to optimize re-use; we invent new channels to give products a second life; we develop world-class product support to help customers love what they buy; we pilot smarter product evaluations; we work from the customer backward to find ways to make the return experience remarkably delightful and easy; and we do it all while scrutinizing our business with laser focus. WWR&R data engineering team at Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of Reverse Logistics data lake platform. As a member of this team, your mission will be to support massively scalable, distributed data warehousing, querying, reporting and decision-support system. We support a fast-paced environment where each day brings new challenges and opportunities. As a Support Engineer, you will play a pivotal role in ensuring the stability, compliance, and operational excellence of our enterprise Data Warehouse (DW) environment. In this role, you will be responsible for monitoring and maintaining production data pipelines, proactively identifying and resolving issues that impact data quality, availability, or timeliness. You’ll collaborate closely with data engineers and cross-functional teams to troubleshoot incidents, implement scalable solutions, and enhance the overall resilience of our data infrastructure. A key aspect of this role involves supporting our data compliance and governance initiatives, ensuring systems align with internal policies and external regulatory standards such as GDPR. 
You will help enforce access controls, manage data retention policies, and support audit readiness through strong logging and monitoring practices. You’ll also lead efforts to automate manual support processes, improving team efficiency and reducing operational risk. Additionally, you will be responsible for maintaining clear, up-to-date documentation and runbooks for operational procedures and issue resolution, promoting consistency and knowledge sharing across the team. We’re looking for a self-motivated, quick-learning team player with a strong sense of ownership and a ‘can-do’ attitude, someone who thrives in a dynamic, high-impact environment and is eager to make meaningful contributions to our data operations. Basic Qualifications 2+ years of software development, or 2+ years of technical support experience Experience scripting in modern programming languages Experience troubleshooting and debugging technical systems Preferred Qualifications Knowledge of web services, distributed systems, and web application development Experience troubleshooting & maintaining hardware & software RAID Experience with REST web services, XML, JSON Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A3005883
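The pipeline monitoring described in this role can be sketched as a small health check that flags failed or stale runs. This is a hypothetical illustration only; the pipeline names, record layout, and staleness threshold are invented and do not describe Amazon's actual tooling:

```python
from datetime import datetime, timedelta

def flag_unhealthy_runs(runs, max_age_hours=24, now=None):
    """Flag pipeline runs that failed outright, or whose last successful
    run is older than max_age_hours (i.e. the data is stale)."""
    now = now or datetime.utcnow()
    alerts = []
    for run in runs:
        if run["status"] == "FAILED":
            alerts.append((run["pipeline"], "failed"))
        elif now - run["last_success"] > timedelta(hours=max_age_hours):
            alerts.append((run["pipeline"], "stale"))
    return alerts
```

In a real support workflow, the returned alerts would feed a ticketing or paging system; a run that is both recent and succeeded produces no alert.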

Posted 1 week ago


3.0 years

0 Lacs

Borivali, Maharashtra, India

On-site


Job Summary We are looking for a detail-oriented and tech-savvy MIS Executive to support our data management and reporting needs. The ideal candidate will be responsible for maintaining and analyzing company data, generating reports, and assisting departments in data-driven decision-making. This role is key to ensuring timely, accurate, and actionable information flow within the organization. Key Responsibilities Develop, maintain, and update MIS reports and dashboards regularly (daily, weekly, monthly). Analyze data to identify trends, anomalies, and insights to support management decisions. Generate reports related to sales, operations, finance, HR, or other business units as required. Automate reporting processes and improve data handling efficiency using Excel, SQL, or BI tools. Maintain data accuracy and integrity across internal databases and systems. Work closely with departments to gather reporting requirements and ensure delivery timelines. Prepare ad-hoc data analysis and reports for senior management. Ensure compliance with data governance and security protocols. Requirements Bachelor's degree in Computer Science, IT, Statistics, Business Administration, or related field. 1–3 years of experience in an MIS, data analysis, or reporting role. Strong knowledge of MS Excel (VLOOKUP, Pivot Tables, Macros) and data visualization tools (Power BI, Tableau preferred). Familiarity with SQL and database querying. Excellent analytical, problem-solving, and organizational skills. Ability to manage multiple tasks and work under tight deadlines. Strong communication skills and attention to detail. Preferred Qualifications Experience in ERP or CRM systems (e.g., SAP, Salesforce). Knowledge of scripting languages or automation tools (e.g., Python, R, Power Automate). Exposure to business process optimization and performance tracking. Why Join Us Be part of a data-driven, forward-thinking team. Opportunity to work on impactful projects that influence business strategy. 
Dynamic and collaborative work environment. Competitive salary and growth opportunities. Skills: data management, data maintenance, attention to detail, problem-solving, communication, Excel (VLOOKUP, Pivot Tables, Macros), SQL, MIS reporting, data visualization (Power BI, Tableau), data analysis
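The pivot-table style reporting central to this MIS role can be sketched in plain Python. The field names and figures below are invented for illustration; in practice this is what Excel pivot tables or a BI tool would produce from the same records:

```python
from collections import defaultdict

def pivot_sum(records, row_key, col_key, value_key):
    """A minimal pivot table: sum value_key grouped by (row_key, col_key).
    Mirrors what an Excel pivot with a Sum aggregation would show."""
    table = defaultdict(lambda: defaultdict(float))
    for rec in records:
        table[rec[row_key]][rec[col_key]] += rec[value_key]
    return {row: dict(cols) for row, cols in table.items()}
```

Calling `pivot_sum(records, "region", "month", "sales")` on a list of sales rows yields a region-by-month grid of totals, the same shape as a standard MIS sales dashboard.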

Posted 1 week ago


6.0 years

0 Lacs

Kanayannur, Kerala, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity. Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. 
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing.
Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data tools; SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for data ops and best practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
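The data-profiling and validation work this role describes (scanning a batch of records for missing values before downstream use) can be sketched in plain Python. In the stack described here it would more typically be expressed in PySpark or SQL; the column names below are invented for the sketch:

```python
def profile_nulls(rows):
    """Compute the per-column null rate across a batch of row dicts.
    A high null rate for a required column signals a data-quality issue."""
    counts, nulls = {}, {}
    for row in rows:
        for col, val in row.items():
            counts[col] = counts.get(col, 0) + 1
            if val is None:
                nulls[col] = nulls.get(col, 0) + 1
    return {col: round(nulls.get(col, 0) / counts[col], 3) for col in counts}
```

A DataOps engineer would typically compare these rates against per-column thresholds and fail or quarantine the batch when a threshold is breached.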

Posted 1 week ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity. Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. 
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing.
Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data handling SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for data ops and practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. 
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
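The data profiling and validation responsibilities described in this posting can be sketched in plain Python; the record layout, column names, and rules below are illustrative assumptions, and a real pipeline would typically run equivalent checks in SQL or PySpark:

```python
# Minimal data-quality check: flag records with missing keys or out-of-range
# values. Column names ("order_id", "amount") and rules are assumptions.
rows = [
    {"order_id": "A1", "amount": 120.5},
    {"order_id": None, "amount": 75.0},    # missing key -> should be flagged
    {"order_id": "A3", "amount": -10.0},   # negative amount -> should be flagged
]

def validate(row):
    """Return a list of rule violations for one record."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    if row.get("amount") is None or row["amount"] < 0:
        errors.append("invalid amount")
    return errors

bad_rows = [(i, validate(r)) for i, r in enumerate(rows) if validate(r)]
print(bad_rows)  # -> [(1, ['missing order_id']), (2, ['invalid amount'])]
```

The same rule set would scale out as a PySpark filter or a SQL `WHERE` clause once the volume outgrows a single process.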

Posted 1 week ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


About The Role We are looking for a passionate and skilled AI/ML Developer to join our dynamic team. The ideal candidate will have strong experience with LangChain, Retrieval-Augmented Generation (RAG), and Agentic AI systems, as well as expertise in clustering, regression, deep learning, and data transformation/cleaning techniques. You will work on building intelligent, autonomous AI solutions that drive innovative business applications. Responsibilities Design and develop AI applications using LangChain and Agentic AI architectures. Implement RAG pipelines for advanced knowledge retrieval and reasoning tasks. Build machine learning models for clustering, regression, and prediction tasks. Develop and fine-tune deep learning models for text-based or tabular data. Perform thorough data cleaning, transformation, and feature engineering. Write efficient, production-quality Python code. Work with SQL databases for data querying and preparation. Research and stay updated with new techniques in LLMs, Agentic AI, and machine learning. Requirements Excellent programming skills in Python. Proficient in SQL for database querying and data manipulation. Strong understanding of machine learning algorithms (especially clustering and regression). Hands-on experience with deep learning frameworks (TensorFlow, PyTorch, or Keras). Practical experience with LangChain, RAG, and building autonomous AI agents. Expertise in data preprocessing, cleaning, and transformation techniques. Preferred Skills Experience working with LLMs (Large Language Models) and prompt engineering. Knowledge of vector databases like Pinecone, FAISS, or ChromaDB. Strong analytical thinking and problem-solving skills. Ability to work independently and in collaborative environments. Skills: data transformation, machine learning, artificial intelligence, regression, retrieval-augmented generation (RAG), Python, LangChain, deep learning, data cleaning, clustering, agentic AI systems, SQL
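The retrieval step of a RAG pipeline mentioned above can be sketched in pure Python with bag-of-words cosine similarity; the corpus and prompt template are toy assumptions, and a production system would use embeddings, a vector database, and a framework such as LangChain:

```python
import math
from collections import Counter

# Toy retrieval step of a RAG pipeline: rank documents by cosine similarity
# of bag-of-words vectors, then build a grounded prompt from the best match.
docs = [
    "k-means groups points into clusters by minimizing within-cluster distance",
    "linear regression fits a line that minimizes squared error",
    "agents call tools in a loop until the task is complete",
]

def bow(text):
    """Bag-of-words term counts for a text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    """Return the k most similar documents to the query."""
    q = bow(query)
    ranked = sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]

question = "which method groups points into clusters"
context = retrieve(question)[0]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(context)
```

In a real system the `retrieve` step would query an index such as FAISS or Pinecone, and `prompt` would be sent to an LLM.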

Posted 1 week ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity. Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. 
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience in working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing. 
Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data handling SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for data ops and practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. 
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Data Scientist – Roles And Responsibilities We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies. Key Responsibilities Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability. Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions. Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems. Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn). Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies. Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives. Automation: Develop pipelines and scripts to automate data processing and model deployment. Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques. Preferred Qualifications Experience with deep learning, natural language processing (NLP), or computer vision. Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines. Contributions to open-source projects or publications in data science. 
Technical Skills Proficiency in programming languages like Python Experience with SQL for querying and managing databases. Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus. Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly). Strong understanding of statistics, probability, and experimental design.
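The experimentation responsibility above (designing and analyzing A/B tests) can be sketched as a two-proportion z-test in plain Python; the conversion counts are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: variant B converts at 12% vs 10% for A, 5,000 users each.
z = two_proportion_z(500, 5000, 600, 5000)
print(round(z, 2))  # |z| > 1.96 -> significant at the 5% level (two-sided)
```

The same calculation underlies what a statistics library reports; in practice a data scientist would also fix the significance level and power before launching the test.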

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


We are seeking a strong SQL & Reporting Analyst with expertise in Extract, Transform, Load (ETL) processes, SQL querying, and reporting, and hands-on experience in an AWS cloud environment. Key Responsibilities ETL Development: Design, develop, and maintain scalable ETL workflows to extract data from various sources, transform it based on business requirements, and load it into data warehouses or databases. SQL Querying: Write complex SQL queries to extract, manipulate, and analyze data for reporting and ad-hoc business requests. Reporting: Create, optimize, and automate reports and dashboards using Power BI or AWS QuickSight to deliver actionable insights to stakeholders. AWS Environment: Understanding of how to leverage AWS services (e.g., S3, Redshift, Glue, Lambda) to store, process, and manage data efficiently in a cloud-based ecosystem. Data Integrity: Ensure data accuracy, consistency, and quality throughout the ETL and reporting processes by implementing validation checks and troubleshooting issues. Collaboration: Work closely with data engineers, business analysts, and stakeholders to understand data needs and deliver tailored solutions. Optimization: Monitor and optimize ETL processes and queries for performance and scalability. Documentation: Maintain clear documentation of ETL processes, data models, and reporting logic for future reference and team knowledge sharing. Soft Skills Communication: Ability to communicate effectively with non-technical stakeholders to understand requirements. Team Collaboration: Experience working in teams using Agile or Scrum methodologies. Technical Skills Experience: 3+ years of experience in ETL development, data analysis, or a similar role. SQL Skills: Advanced proficiency in writing and optimizing complex SQL queries (e.g., joins, subqueries, window functions). AWS Expertise: Hands-on experience with AWS services such as S3, Redshift, Glue, Lambda, or Athena for data storage and processing. 
Reporting Tools: Proficiency in building visualizations and dashboards using Power BI. Programming: Familiarity with scripting languages like Python or Bash for automation and data manipulation is a plus. Data Concepts: Strong understanding of data warehousing, data modeling, and ETL best practices. Problem-Solving: Ability to troubleshoot data issues and optimize processes efficiently. Communication: Excellent verbal and written communication skills to collaborate with technical and non-technical stakeholders.
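The complex-query skills listed above (joins, subqueries, aggregation) can be sketched against an in-memory SQLite database from Python; the table and column names are assumptions for the example:

```python
import sqlite3

# Illustrative "complex SQL" example: an aggregating subquery nested inside
# a HAVING clause, run against an in-memory SQLite database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'North', 100), (2, 'North', 300), (3, 'South', 150);
""")

# Regions whose total sales exceed the average regional total.
query = """
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
    HAVING total > (
        SELECT AVG(region_total) FROM (
            SELECT SUM(amount) AS region_total FROM orders GROUP BY region
        )
    );
"""
print(con.execute(query).fetchall())  # -> [('North', 400.0)]
```

The same query shape runs on Redshift or Athena; only the connection layer changes.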

Posted 1 week ago

Apply

200.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description Push the limits of what’s possible with us as an experienced member of our Software Engineering team. As an Experienced Software Engineer at JPMorgan Chase within the Global Technology team, you serve as a member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. Depending on the team that you join, you could be developing mobile features that give our customers and clients more control over how they bank with us, strategizing on how big data can make our trading systems quicker, creating the next innovation in payments for merchants, or supporting the integration of our private and public cloud platforms. Job Responsibilities Participates in the design and development of scalable and resilient systems using Java or Python to contribute to continual, iterative improvements for product teams Executes software solutions, design, development, and technical troubleshooting Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces or contributes to architecture and design artifacts for applications while ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Hands-on practical experience in system design, application development, testing and operational stability Proficient in coding in Java or Python languages Experience in developing, debugging, and maintaining code in a large 
corporate environment with one or more modern programming languages and database querying languages Overall knowledge of the Software Development Life Cycle Understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies Exposure to cloud technologies Your Role Level JPMorgan Chase is looking to hire Software Engineers at Software Engineer II and Software Engineer III levels. A determination will be made on placement for successful candidates based on the results of a skills-based assessment which applicants will be asked to complete during the hiring process, as well as the candidate interview. The assessment will evaluate ability to perform basic coding and systems design responsibilities. For the Software Engineer II level, the role requires the ability to understand advanced features of a coding language, design a viable system, and solve functional problems through basic language applications. For the Software Engineer III level, the role requires a higher level of proficiency and the ability to function independently, including the ability to use and explain advanced features of a coding language, design systems across technologies and platforms, solve functional and non-functional problems through an application of language best practices, as well as the ability to assess issues broadly, identify alternative or innovative solutions, collaborate effectively, and provide guidance to others. About Us JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. 
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Global Technology team relies on smart, driven people like you to develop applications and provide tech support for all our global functions across our network. Your efforts will touch lives all over the financial spectrum and across all our lines of business: Consumer & Community Banking, Asset & Wealth Management, Commercial Banking, Corporate Investment Banking and Corporate Sector. You’ll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


The Core AI BI & Data Platforms Team has been established to create, operate and run the Enterprise AI, BI and Data platforms that facilitate the time to market for reporting, analytics and data science teams to run experiments, train models and generate insights, as well as evolve and run the CoCounsel application and its shared capability of CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR. At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems. About The Role In this opportunity as the Software Engineer, you will: Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer. Innovate with new approaches to meet data management requirements. Make recommendations about platform adoption, including technology integrations, application servers, libraries, and AWS frameworks, documentation, and usability by stakeholders. Contribute to improving the customer experience. Participate in code reviews to maintain a high-quality codebase. Collaborate with cross-functional teams to define, design, and ship new features. Work closely with product owners, designers, and other developers to understand requirements and deliver solutions. 
Effectively communicate and liaise across the data platform & management teams Stay updated on emerging trends and technologies in cloud computing About You You're a fit for the role of Software Engineer if you meet all or most of these criteria: Bachelor's degree in Computer Science, Engineering, or a related field 3+ years of relevant experience in implementation of data lakes and data management technologies for large-scale organizations. Experience in building & maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance and high availability. Proficient in the Python programming language. Experience in AWS services and management, including Serverless, Container, Queueing and Monitoring services like Lambda, ECS, API Gateway, RDS, Dynamo DB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, SNS. Good knowledge of consuming and building APIs. Business Intelligence tools like Power BI. Fluency in querying languages such as SQL. Solid understanding of software development practices such as version control via Git, CI/CD and release management. Agile development cadence. Good critical thinking, communication, documentation, troubleshooting and collaborative skills. What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. 
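The serverless pattern referenced above (Lambda triggered by S3) can be sketched as a minimal Lambda-style handler in pure Python; the event is trimmed to the fields used, and the bucket and key names are illustrative:

```python
import json
import urllib.parse

def handler(event, context=None):
    """Lambda-style entry point: extract bucket/key from an S3 put event.

    A real function would go on to read the object (e.g. via boto3) and
    feed it into the ingestion pipeline; here we just return the location.
    """
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    # S3 event keys are URL-encoded; '+' stands for a space.
    key = urllib.parse.unquote_plus(record["object"]["key"])
    return {"statusCode": 200, "body": json.dumps({"bucket": bucket, "key": key})}

# Simulated invocation with a trimmed S3 event payload.
event = {"Records": [{"s3": {"bucket": {"name": "raw-zone"},
                             "object": {"key": "landing/2024/file+1.csv"}}}]}
print(handler(event))
```

Wiring this to an S3 trigger, an SQS queue, or a Step Functions state is configuration rather than code, which is what keeps the handler itself easy to test locally.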
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. 
Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Data Scientist – Roles And Responsibilities We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies. Key Responsibilities Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability. Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions. Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems. Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn). Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies. Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives. Automation: Develop pipelines and scripts to automate data processing and model deployment. Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques. Preferred Qualifications Experience with deep learning, natural language processing (NLP), or computer vision. Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines. Contributions to open-source projects or publications in data science. 
Technical Skills Proficiency in programming languages like Python Experience with SQL for querying and managing databases. Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus. Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly). Strong understanding of statistics, probability, and experimental design.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Supply Chain Applications Specialist Job Type: Full-Time Experience Level: Mid-Senior Level Job Summary: We are seeking an experienced Supply Chain Applications Specialist to support and enhance our ERP ecosystem, including Infor M3 and legacy platforms such as VAX/VMS and AS400. The ideal candidate should have a solid understanding of supply chain processes, ERP workflows, and integration across systems, with hands-on experience supporting and optimizing business applications in a manufacturing or distribution environment. Key Responsibilities: Provide application support for Infor M3 ERP, including configuration, troubleshooting, and functional enhancements. Maintain and support legacy systems based on VAX/VMS and AS400 platforms. Collaborate with business users to understand supply chain requirements and translate them into technical solutions. Monitor system performance and resolve issues to ensure continuous and efficient operations. Manage data migration and system integration efforts between Infor M3 and legacy systems. Document system configurations, support procedures, and user manuals. Participate in system upgrades, patch management, and testing efforts. Provide training and support to end-users across supply chain functions (procurement, inventory, logistics, etc.). Work closely with IT, vendors, and cross-functional teams for project execution and support. Required Qualifications: Bachelor’s degree in Computer Science, Information Systems, Supply Chain, or related field. 4–8 years of experience working with ERP systems (especially Infor M3) in a supply chain or manufacturing context. Strong knowledge of VAX/VMS, AS400 (iSeries) platforms. Experience in supporting supply chain modules such as procurement, inventory, warehouse, logistics, and order management. Strong analytical and problem-solving skills. Familiarity with scripting, report generation, and database querying tools (SQL, Query/400, etc.). 
Excellent communication and interpersonal skills. Preferred Skills: Experience with system migrations from legacy to modern ERP platforms. Exposure to EDI, API integrations, or middleware tools. Knowledge of manufacturing workflows, BOM, production planning, and MRP processes. Certifications in Infor M3 or related systems (a plus). Show more Show less

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Position Summary

Senior Analyst, Data-Marts & Reporting - Reporting and Analytics – Digital Data Analytics Innovation - Deloitte Support Services India Private Limited

Are you a quick learner with a willingness to work with new technologies? The Data-Marts and Reporting team offers you a unique opportunity to be an integral part of the Datamarts & Reporting – CoRe Digital | Data | Analytics | Innovation Group. The principal focus of this group is the research, development, maintenance, and documentation of customized solutions that e-enable the delivery of cutting-edge technology to the firm's business centers.

Work you will do

As a Senior Analyst, you will research and develop solutions built on varied technologies such as Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .Net. You will support a team that provides high-quality solutions to customers by following a streamlined system development methodology. In the process of acquainting yourself with various development tools, testing tools, methodologies, and processes, you will be aligned to the following role:

Role: Datamart Solution Senior Analyst

As a Datamart Solution Senior Analyst, you will be responsible for delivering technical solutions, building high-performing datamarts and reporting tools using technologies such as Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .Net. Your key responsibilities include:

  • Interact with end users to gather, document, and interpret requirements.
  • Leverage requirements to design technical solutions.
  • Develop SQL objects and scripts based on the design.
  • Analyze, debug, and optimize existing stored procedures and views.
  • Leverage indexes, performance tuning techniques, and error handling to improve the performance of SQL scripts.
  • Create and modify SSIS packages and ADF pipelines for transferring data between various systems across cloud and on-premise environments.
  • Work seamlessly with different Azure services.
  • Improve performance and find opportunities to streamline processes for greater efficiency in SQL, SSIS, and ADF.
  • Create, schedule, and monitor SQL jobs.
  • Build interactive visualizations in Tableau for leadership reporting.
  • Proactively prioritize activities, handle tasks, and deliver quality solutions on time.
  • Communicate clearly and regularly with team leadership and project teams.
  • Manage ongoing deliverable timelines and own relationships with end clients to understand whether deliverables continue to meet client needs.
  • Work collaboratively with other team members and end clients throughout the development life cycle.
  • Research, learn, implement, and share skills on new technologies.
  • Understand customer requirements well and provide status updates to the project lead (US/USI) on calls and emails efficiently.
  • Guide junior team members to get them up to speed on the domain, tools, and technologies we work with.
  • Continuously improve skills in this space by completing certifications and recommended training.
  • Obtain and maintain a thorough understanding of the MDM data model, Global Systems, and Client attributes.
  • Good understanding of MVC .Net and Sharepoint front-end solutions; knowledge of full-stack development is good to have.

The team

CoRe - Digital Data Analytics Innovation (DDAI) sits inside Deloitte's global shared services organization.

Qualifications and experience

Required:

  • Educational Qualification: B.E/B.Tech or MTech (60% or 6.5 GPA and above)
  • Proficiency in one or more of the following technologies: DBMS concepts, with exposure to querying on any relational database, preferably MS SQL Server, MS Azure SQL, SSIS, or Tableau.
  • Knowledge of a coding language such as C#.NET or VB.Net would be an added advantage.
  • Understands development methodology and lifecycle.
  • Excellent analytical skills and communication skills (written, verbal, and presentation)
  • Ability to chart one's own career and build networks within the organization
  • Ability to work both independently and as part of a team with professionals at all levels
  • Ability to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate
  • Seek information and ideas and establish relationships with customers to assess future opportunities

Total Experience: 4-6 years of overall experience, with at least 3 years of experience in database development, ETL, and reporting.

Skill set
Required: SQL Server, MS Azure SQL, Azure Data Factory, SSIS, Azure Synapse, data warehousing & BI
Preferred: Tableau, .Net
Good to have: MVC .Net, Sharepoint front-end solutions

Location: Hyderabad
Work hours: 2 p.m. – 11 p.m.

How you will grow

At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Deloitte’s culture

Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

Disclaimer: Please note that this description is subject to change based on business/project requirements and at the discretion of management. #EAG-Core

Recruiting tips

From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development

From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300780

Posted 1 week ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques
  • Data warehousing concepts
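One of the skills above, SQL optimization, frequently comes down to indexing. A minimal sketch using Python's built-in sqlite3 module (the `orders` table and `idx_orders_customer` index are hypothetical, created only for illustration) shows how adding an index changes a query plan from a full table scan to an index search:

```python
import sqlite3

# Hypothetical orders table populated with sample rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(query):
    # Ask SQLite how it intends to execute the query.
    rows = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(str(r[-1]) for r in rows)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

before = plan(query)  # typically reports a full table SCAN
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # typically reports a SEARCH using the new index

print(before)
print(after)
```

The same idea carries over to server databases such as SQL Server (via actual execution plans), though the plan output format differs by engine.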

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table. (advanced)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
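The second-highest-salary question above has a classic answer: take the maximum salary strictly below the overall maximum. A minimal runnable sketch using Python's built-in sqlite3 module (the `employees` table and its values are hypothetical, created only for illustration) shows that approach plus a DISTINCT/OFFSET alternative:

```python
import sqlite3

# Hypothetical employees table; note the duplicate top salary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("A", 50000), ("B", 70000), ("C", 70000), ("D", 60000)],
)

# Classic approach: the highest salary strictly below the maximum.
second_highest = conn.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()[0]
print(second_highest)  # 60000, not 70000 (duplicates count once)

# Alternative: deduplicate, sort descending, skip the top value.
alt = conn.execute(
    "SELECT DISTINCT salary FROM employees "
    "ORDER BY salary DESC LIMIT 1 OFFSET 1"
).fetchone()[0]
```

Both forms handle tied top salaries correctly; window functions like DENSE_RANK() are another common interview answer on engines that support them.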

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies