
1759 Querying Jobs - Page 33

JobPe aggregates job results for easy browsing, but you apply directly on the original job portal.

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position Overview: ShyftLabs is a rapidly growing data product company specializing in digital solutions for Fortune 500 companies across various industries. As a Backend Developer at ShyftLabs, you will work closely with cross-functional teams, including product managers, developers, and stakeholders, to develop and deliver high-quality backend systems that meet our clients' needs and exceed their expectations. We are looking for a talented and experienced Backend Developer to join our team and help us continue to deliver innovative solutions that accelerate business growth.

Job Responsibilities:
- Design and architect scalable, high-performance backend systems.
- Lead the development of system architecture, including high-level and low-level design (HLD/LLD).
- Ensure that system design adheres to industry best practices and principles such as microservices, distributed systems, and cloud-native architectures.
- Develop and optimize backend components, including APIs, databases, and server-side logic.
- Implement best practices for security, performance, and scalability in backend development.
- Monitor and optimize system performance, ensuring low latency and high availability.
- Implement logging, monitoring, and alerting systems to ensure system health and quick troubleshooting.
- Collaborate with cross-functional teams, including frontend developers, product managers, and DevOps, to deliver end-to-end solutions.
- Help set and maintain high coding standards within the team.

Basic Qualifications:
- Strong Java skills with a minimum of 4 years of experience in Java.
- At least 1 year of experience in Kotlin.
- Experience with GraphQL for efficient data querying and API development.
- Familiarity with relational databases such as Postgres and ORM frameworks such as Hibernate.
- Experience with MongoDB for NoSQL database management and Kafka for event-driven architecture and message streaming.
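The Kafka requirement above refers to event-driven, publish/subscribe designs. As a hedged illustration of the core pattern only (plain Python rather than the posting's Java, with no Kafka client; `EventBus` and the topic name are invented for the example):

```python
from collections import defaultdict

# Minimal in-memory sketch of the consumer side of an event-driven
# design: events arrive on named topics and registered handlers are
# dispatched per topic. In a real system a Kafka consumer group would
# replace this in-memory dispatch.
class EventBus:
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.handlers[topic]:
            handler(event)

bus = EventBus()
orders = []
bus.subscribe("order.created", lambda e: orders.append(e["id"]))
bus.publish("order.created", {"id": 1})
bus.publish("order.created", {"id": 2})
# orders is now [1, 2]
```

The decoupling shown here (publishers never reference consumers) is what makes the architecture easy to extend with new downstream services.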

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Supervising Security Analyst - Cyber Triage and Forensics

Today’s world is fueled by vast amounts of information. Data is more valuable than ever before. Protecting data and information systems is central to doing business, and everyone in EY Information Security has a critical role to play. Join a global team of almost 950 people who collaborate to support the business of EY by protecting EY and client information assets! Our Information Security professionals enable EY to work securely and deliver secure products and services, as well as detect and quickly respond to security events as they happen. Together, the efforts of our dedicated team help protect the EY brand and build client trust. Within Information Security we blend risk strategy, digital identity, cyber defense, application security and technology solutions as we consider the entire security lifecycle. You will join a team of hardworking, security-focused individuals dedicated to supporting, protecting and enabling the business through innovative, secure solutions that provide speed to market and business value.

The opportunity

The Supervising Security Analyst in Cyber Defense CTF (Cyber Triage and Forensics) plays a pivotal role in enhancing EY’s security posture by vigilantly monitoring, assessing, and managing incidents effectively. In collaboration with the team and leadership, you will ensure strong security oversight and contribute to joint security monitoring and incident response initiatives. Key duties include triage, detailed investigations, clear communication, and comprehensive reporting, all contributing to the integrity and resilience of EY’s cyber defenses.

Essential Functions of the Job:
- Perform forensic and malware analysis to detect, investigate, and resolve security incidents, including artifact classification and payload extraction
- Engage in proactive threat hunting and provide expert security assessments, utilizing EDR, SIEM, and other tools to understand and counteract the cybercrime landscape
- Communicate with IT stakeholders during incident response activities, ensuring effective containment, remediation, and accurate identification of compromise indicators
- Report on incident metrics, analyse findings, and develop reports to ensure comprehensive resolution and understanding of security events
- Act as an escalation point for incident response, serve as shift lead, mentor junior team members, and contribute to team skill enhancement
- Analyse security events, provide feedback on security controls, and drive process improvements to strengthen the organization's security posture
- Maintain and improve security incident processes, protocols, and standard operating procedures to reflect best practices in security incident response

Skills And Attributes For Success
- Proficient in digital forensics, including evidence management in line with best practices, and in using advanced tools for threat detection and incident management, including advanced querying with KQL
- Skilled in analyzing diverse data, identifying malware, and employing reverse engineering to reveal hidden threats
- Proficient in conducting detailed forensic investigations across various operating systems, with a keen eye for obfuscation and the ability to clearly communicate findings
- In-depth understanding of Active Directory security, with strong scripting abilities to automate response measures and improve operational effectiveness

To qualify for the role, you must have
- An Undergraduate or Postgraduate Degree in Computer Science, Engineering, or a related field (MCA/MTech/BTech/BCA/BSc CS or BSc IT)
- At least 7 years of overall experience with a minimum of 5 years specialized in incident response, computer forensics, and malware reverse engineering
- Proficiency in operating within a Security Monitoring/Security Operations Center (SOC) environment, including experience with CSIRT and CERT operations
- Demonstrated experience in investigating security events, threats, and vulnerabilities
- Strong understanding of electronic investigation and forensic methodologies, including log correlation, electronic data handling, investigative processes, and malware analysis
- In-depth knowledge of Windows and Unix/Linux operating systems, and experience with EDR solutions for threat detection and response

Ideally, you’ll also have
- Professional certifications like GREM, GCFE, GCFA, or GCIH, or the willingness to obtain them
- Experience with security incident response in cloud environments, including Azure
- Knowledge of legal considerations in electronic discovery and analysis
- Proficiency in scripting or programming (e.g., Shell scripting, PowerShell, C, C#, Python)
- Solid understanding of security best practices for network architecture and server configuration

What We Look For
- Integrity in a professional environment and strong ethical behavior
- Ability to work independently
- A global mindset for working with diverse cultures and backgrounds
- Knowledge of industry-standard security incident response processes, procedures, and lifecycle
- Positive attitude and excellent teaming skills
- Excellent social, communication, and writing skills
- Good presentation skills
- Excellent investigative, analytical, and problem-solving skills

Supervising Responsibilities:
- Coordinate escalations and collaborate with internal technology teams to ensure timely resolution of issues
- Provide mentoring and training to other team members as required, supporting their development and ensuring consistent team performance

Other Requirements: Should be willing to work in shifts

What We Offer

As part of this role, you will work in a highly coordinated, globally diverse team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being. Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer:
- Continuous learning: You will develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We will provide the tools and flexibility, so you can make a significant impact, your way.
- Transformative leadership: We will give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You will be accepted for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
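The triage and threat-hunting duties in this posting often reduce to matching collected log data against known indicators of compromise (IOCs). A minimal sketch of that idea follows; the log format and indicator list are invented for the example, and a real investigation would query an EDR/SIEM (e.g. with KQL) rather than scan flat files:

```python
# Illustrative triage helper: flag log lines that contain known
# indicators of compromise, returning (line number, indicator) pairs
# so an analyst can pivot into deeper investigation.
def flag_suspicious(log_lines, indicators):
    hits = []
    for lineno, line in enumerate(log_lines, start=1):
        for ioc in indicators:
            if ioc in line:
                hits.append((lineno, ioc))
    return hits

logs = [
    "2024-05-01 10:00:01 GET /index.html 200",
    "2024-05-01 10:00:02 POST /login.php 200 from 203.0.113.7",
    "2024-05-01 10:00:03 GET /favicon.ico 404",
]
iocs = ["203.0.113.7", "mimikatz"]
print(flag_suspicious(logs, iocs))  # [(2, '203.0.113.7')]
```

Production-grade hunting adds normalization, time-window correlation, and enrichment from threat-intelligence feeds on top of this basic matching step.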

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Build the future of the AI Data Cloud. Join the Snowflake team.

We are seeking a talented and motivated Analytics Engineer to join our team in Pune, India. This role will be pivotal in building and maintaining the data infrastructure that powers our cutting-edge AI applications, enabling us to deliver intelligent solutions to our customers and internal stakeholders. If you are passionate about data, AI, and working with a world-class cloud data platform, we want to hear from you.

THE ROLE

As an Analytics Engineer focused on AI applications, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines that feed our machine learning models and AI-driven features. You will collaborate closely with data scientists, AI researchers, software engineers, and product managers to understand data requirements and deliver high-quality data solutions. Your work will directly impact the performance and reliability of our AI systems, contributing to Snowflake's innovation in the AI space.

Job Description

As an Analytics Engineer supporting AI Applications, you will:
- Data Pipeline Development & Maintenance: Design, build, and maintain scalable, reliable ETL/ELT pipelines in Snowflake to support AI model training, evaluation, and deployment. Integrate data from various sources, including internal systems, Salesforce, and other external vendor platforms. Develop a willingness to learn B2B concepts and the intricacies of diverse data sources. Implement data quality frameworks and ensure data integrity for AI applications.
- System Integration & Automation: Develop and automate data processes using SQL, Python, and other relevant technologies. Work with modern data stack tools and cloud-based data platforms, with a strong emphasis on Snowflake.
- MLOps Understanding & Support: Gain an understanding of MLOps principles and contribute to the operationalization of machine learning models. Support data versioning, model monitoring, and feedback loops for AI systems.
- Release Management & Collaboration: Participate actively in frequent release and testing cycles to ensure the high-quality delivery of data features and reduce risks in production AI systems. Develop and execute QA/test strategies for data pipelines and integrations, often coordinating with cross-functional teams. Gain experience with access control systems, CI/CD pipelines, and release testing methodologies to ensure secure and efficient deployments.
- Performance Optimization & Scalability: Monitor and optimize the performance of data pipelines and queries. Ensure data solutions are scalable to handle growing data volumes and evolving AI application needs.

What You Will Need

Required Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related STEM (Science, Technology, Engineering, Mathematics) field.
- Strong proficiency in SQL for data manipulation, querying, and optimization.
- Proficiency in Python for data processing, automation, and scripting.
- Hands-on experience with Snowflake or other cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Azure Synapse).
- A proactive and collaborative mindset with a strong desire to learn new technologies and B2B concepts.

Preferred Skills:
- Experience in building and maintaining ETL/ELT pipelines for AI/ML use cases.
- Understanding of MLOps principles and tools.
- Experience with data quality frameworks and tools.
- Familiarity with data modeling techniques.
- Experience with workflow orchestration tools (e.g., Airflow, Dagster).
- Knowledge of software engineering best practices, including version control (e.g., Git), CI/CD, and testing.
- Experience coordinating QA/test strategies for cross-team integration.
- Familiarity with access control systems (e.g., Okta) and release testing.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
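The data-quality responsibility in this posting can be illustrated with a small validation gate of the kind a pipeline might run before loading rows into a model-training table. Column names and thresholds here are invented for the example:

```python
# Sketch of a data-quality gate inside an ELT pipeline: rows with a
# missing key or an out-of-range score are quarantined with a reason
# instead of silently reaching downstream AI workloads.
def validate_rows(rows):
    valid, rejected = [], []
    for row in rows:
        if row.get("customer_id") is None:
            rejected.append((row, "missing customer_id"))
        elif not (0 <= row.get("score", -1) <= 1):
            rejected.append((row, "score out of range"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"customer_id": 1, "score": 0.9},
    {"customer_id": None, "score": 0.5},
    {"customer_id": 2, "score": 1.7},
]
valid, rejected = validate_rows(rows)
# len(valid) == 1, len(rejected) == 2
```

Keeping the rejection reason alongside the row is what makes the quarantine table useful for debugging upstream sources later.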

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Sr. Business Analyst

Key Responsibilities

Requirement Elicitation & Analysis
- Conduct stakeholder interviews, workshops, and JAD sessions to gather functional and non-functional requirements.
- Perform detailed analysis to understand end-user needs and define clear, comprehensive business requirements.
- Evaluate current systems and processes and propose enhancements.

Product Specification & Documentation
- Convert requirements into User Stories, Use Cases, and Acceptance Criteria in tools like JIRA and Planner.
- Maintain Product Backlogs and contribute to Sprint Planning with the Agile team.
- Create supporting documents such as Process Flows, Wireframes, and Data Flow Diagrams.

Stakeholder Management
- Collaborate with cross-functional teams including Product Owners, Developers, QA Engineers, and UI/UX Designers.
- Act as the bridge between technical teams and non-technical stakeholders to ensure mutual understanding.

Product Lifecycle Management
- Support the entire product lifecycle from ideation to post-launch reviews.
- Participate in Product Roadmap discussions and strategic planning.
- Conduct GAP Analysis, Feasibility Studies, and Competitive Benchmarking.

Testing & Quality Assurance
- Design and execute UAT plans, and support QA teams in developing test cases.
- Validate product releases and ensure alignment with business goals and compliance standards.

Required Skills & Tools
- Strong knowledge of Agile (Scrum/Kanban) and SDLC methodologies
- Expertise in tools such as JIRA, Confluence and Trello; Figma, Balsamiq and Lucidchart (for wireframes and workflows); and SQL (for data analysis and querying)
- Excellent documentation, presentation, and stakeholder communication skills
- Ability to handle multiple projects simultaneously and work in a fast-paced environment

Qualifications
- Bachelor’s/Master’s degree in Business Administration, Computer Science, Information Technology, or a related field
- 5–8 years of experience in Business Analysis, preferably in a product-based or SaaS environment
- Professional certification is a plus: CBAP, PMI-PBA, CSPO, or Agile BA certifications

Preferred Domain Experience
- FinTech, HealthTech, EdTech, E-commerce, or SaaS platforms
- Working with B2B/B2C product lines
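The SQL-for-analysis skill listed in this posting is the kind of aggregate query a BA runs against product data. A small illustrative example using Python's built-in sqlite3 module; the ticket table and statuses are invented:

```python
import sqlite3

# Count tickets per status, the sort of quick aggregate a business
# analyst might pull to support a sprint review.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO tickets VALUES (?, ?)",
    [(1, "open"), (2, "done"), (3, "open"), (4, "done"), (5, "done")],
)
counts = dict(
    conn.execute("SELECT status, COUNT(*) FROM tickets GROUP BY status")
)
# counts == {'done': 3, 'open': 2}
```

The same GROUP BY pattern carries over unchanged to production databases; only the connection setup differs.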

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Summary

We are seeking a skilled Developer with 5 to 8 years of experience to join our team. The ideal candidate will have expertise in Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Experience in Property & Casualty Insurance is a plus. This is a hybrid role with day shifts and no travel required.

Responsibilities
- Develop and maintain data pipelines using Amazon S3 and Amazon Redshift to ensure efficient data storage and retrieval.
- Utilize Python to write clean, scalable code for data processing and analysis tasks.
- Implement Databricks SQL for querying and analyzing large datasets to support business decisions.
- Manage and optimize Databricks Delta Lake for reliable and high-performance data storage.
- Design and execute Databricks Workflows to automate data processing tasks and improve operational efficiency.
- Leverage PySpark to perform distributed data processing and enhance data transformation capabilities.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Ensure data quality and integrity by implementing robust validation and monitoring processes.
- Provide technical support and troubleshooting for data-related issues to maintain smooth operations.
- Stay updated with the latest industry trends and technologies to continuously improve data solutions.
- Contribute to the development of best practices and standards for data engineering within the team.
- Document technical specifications and processes to ensure knowledge sharing and continuity.
- Participate in code reviews and provide constructive feedback to peers for continuous improvement.

Qualifications
- Strong expertise in Amazon S3 and Amazon Redshift for data storage and management.
- Proficiency in Python for developing scalable data processing solutions.
- Hands-on experience with Databricks SQL for data querying and analysis.
- Capability in managing Databricks Delta Lake for high-performance data storage.
- Skills in designing Databricks Workflows for automating data processes.
- Use of PySpark for distributed data processing and transformation tasks.
- Experience in the Property & Casualty Insurance domain is a plus.
- Strong problem-solving skills and ability to troubleshoot data-related issues.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to stay updated with the latest industry trends and technologies.
- Strong documentation skills for maintaining technical specifications and processes.
- Experience in participating in code reviews and providing constructive feedback.
- Commitment to maintaining data quality and integrity through robust validation processes.

Certifications Required
- AWS Certified Solutions Architect
- Databricks Certified Data Engineer Associate
- Python Certification
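Delta Lake's headline operation for pipelines like those above is the MERGE (upsert): incoming records update matching keys and insert new ones. A plain-Python sketch of those semantics only; a real pipeline would express this as a Delta `MERGE INTO` statement or via the PySpark Delta APIs, and the records here are invented:

```python
# Upsert semantics on in-memory records: rows in `updates` overwrite
# target rows with the same key and are appended otherwise, mirroring
# the matched/not-matched branches of a MERGE statement.
def merge_upsert(target, updates, key="id"):
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update if key exists, else insert
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
updates = [{"id": 2, "amt": 25}, {"id": 3, "amt": 30}]
result = merge_upsert(target, updates)
# [{'id': 1, 'amt': 10}, {'id': 2, 'amt': 25}, {'id': 3, 'amt': 30}]
```

What Delta Lake adds beyond these semantics is transactional safety: the merge either commits fully or not at all, even across concurrent writers.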

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Oracle is leading the digital revolution. We are empowering nearly half a million businesses to thrive in the age of skyrocketing connections. Join us and play an instrumental role in masterminding the software that will have a truly global impact. OFSS Consulting is part of the Financial Services Global unit and provides a variety of consulting services covering new implementations, upgrades, customization, and managed services for a variety of industry products such as FLEXCUBE, DIGIX (Oracle Digital Banking Experience), Analytics, Pricing and Billing, Leasing and Lending, and Oracle Banking products covering the Retail, Corporate, Investment, Wealth and Private banking domains.

What You’ll Do

We are seeking aspirational graduates interested in a career in Consulting to join our niche Banking Domain and Practice. This gives you an opportunity to apply your technology knowledge, skills, and experience to work in the Banking Consulting team with a new generation of Oracle Banking products in next-generation architecture built using the newest technologies. Longer term you will grow, with the help of extensive training and the experience of the team around you, into a team lead and eventually a project manager or an authority in the business domain or solution architecture, with full accountability and responsibility for the delivered solution on your own projects.

The position will:
- Support Senior Consultants, Project Managers and teams of hardworking, professional business and technology consultants in the delivery of business-focused solutions for our clients using Oracle applications, tools and technology.
- Using sound product skills and experience, initially work on specific project tasks to achieve successful project outcomes and client referenceability.
- Develop skills and competence on our products in terms of functionality, design, and architecture.
- Develop extensions or customizations around our products in the context of customer asks, be it UI/UX or business functionality, interfaces and integration, or reports and visualizations.
- Assist in the testing and deployment of the product, customizations and developments.
- Prepare documentation: program specifications, unit test plans, test cases, user documentation, release notes and status reports.

What You’ll Bring

Your drive, knowledge, and commitment will help us become the number one cloud company in the world. We also look for:

Primary Skills:
- Experience in end-to-end development and/or implementation and/or support activities covering expertise areas such as design of customizations, coding and unit testing, completing test-cycle rounds including End of Days, migrations, and integrations for Oracle FLEXCUBE / core banking products
- Knowledge and skills in software programming in Oracle PL/SQL and Core Java, JavaScript and XML
- Working knowledge of release management and source control tools
- Ability to perform issue tracking for applications and follow up with collaborators for resolution
- Good client interaction skills, including presentation of solutions
- Exposure to software deployment and troubleshooting on application server software, especially Oracle WebLogic
- Exposure to analysis of Oracle Database AWR/ADDM reports and fixing of database performance issues
- Awareness of banking terminologies and concepts
- IT skills including Microsoft Office and basic SQL querying
- Good communication and documentation skills in English
- Experience in development and/or implementation and/or support of core banking applications
- A bachelor’s degree in computer science or an equivalent degree
- Willingness to work offshore as well as travel to client locations
- Ability to work in a high-pressure, fast-paced and exciting environment
- 8 to 15 years of experience as a technical consultant in (Java and PL/SQL with any core banking application) or (PL/SQL and FLEXCUBE UBS)
- Solid understanding of Core Java and PL/SQL; development/implementation experience in FLEXCUBE / core banking domain projects
- Knowledge of SQL is required
- Banking domain knowledge in new areas such as blockchain is a plus
- Knowledge of any of the following is a plus: DevOps tools, testing tools, Oracle OBIEE Reports, BIP, and middleware such as Oracle Fusion SOA Suite

Secondary Skills:
- Experience in the BFSI domain, specifically banking, is important
- Strong Oracle tech skills: PL/SQL, SQL, Java
- Experience with end-to-end implementation of a solution from inception to debut
- Strong problem-solving skills and participation in supporting critical phases of projects

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and in potential roles, to perform crucial job functions. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

Career Level - IC2

Responsibilities

As described under "What You’ll Do" above.

Qualifications

Career Level - IC2

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
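The "End of Days" test cycles mentioned in this posting refer to core banking's nightly batch runs. One typical EOD step, daily interest accrual, can be sketched as follows; the figures are illustrative, and real EOD jobs run as PL/SQL batch programs with far more bookkeeping (holidays, day-count conventions, posting entries):

```python
# Accrue one day of simple interest on each account balance,
# using an ACT/365 day-count assumption for the sketch.
def accrue_daily_interest(accounts, annual_rate, days_in_year=365):
    accruals = {}
    for acct, balance in accounts.items():
        accruals[acct] = round(balance * annual_rate / days_in_year, 2)
    return accruals

accounts = {"ACC1": 100000.0, "ACC2": 36500.0}
daily = accrue_daily_interest(accounts, 0.0365)
# {'ACC1': 10.0, 'ACC2': 3.65}
```

In a real EOD run this step would execute after cutoff, write accrual entries to the ledger, and feed the next morning's balance reports.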

Posted 2 weeks ago

Apply

35.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Description Company Overview: When it comes to IT solution providers, there are a lot of choices. But when it comes to providers with innovative and differentiating end-to-end service offerings, there’s really only one: Zones – First Choice for IT.TM Zones is a Global Solution Provider of end-to-end IT solutions with an unmatched supply chain. Positioned to be the IT partner you need, Zones, a Minority Business Enterprise (MBE) in business for over 35 years, specializes in Digital Workplace, Cloud & Data Center, Networking, Security, and Managed/Professional/Staffing services. Operating in more than 120 countries, leveraging a robust portfolio, and utilizing the highest certification levels from key partners, including Microsoft, Apple, Cisco, Lenovo, Adobe, and more, Zones has mastered the science of building digital infrastructures that change the way business does business ensuring whatever they need, they can Consider IT Done. Follow Zones, LLC on Twitter @Zones, and LinkedIn and Facebook. Position Overview Looking for strong System Engineer who would be responsible for supporting and maintaining business intelligence dashboards and reports. This role focuses on ensuring optimal performance and usability of visualizations created in Power BI and Oracle Analytics Cloud. Need someone with strong skillset to ensure production (Operation/support) related activities are delivered as per SLA. Work activities involve bug fixes, break fix, changes, co-ordinate with the development team in case of any issues, work on enhancements. What you’ll do as the System Engineer - Oracle Analytics Cloud & Power BI: The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. Hands-on experience with Power BI (report building, dashboard design, DAX, Power Query). 
Strong knowledge of Oracle Analytics Cloud, including dashboard configuration and data integration. Familiarity with SQL and data querying to support data extraction and troubleshooting. Power BI developer having Experience in Creating reports & dashboards using DAX. Excellent in using DAX functions, and M query language. Good understanding of Power BI premium services Excellent knowledge of Dataset Refresh using Gateways, configuring, and managing gateways Strong understanding of service features, governance, deployment pipelines, data gateway, schedule refreshes, XMLA end points Experienced in implementing and managing Row Level Security, workspace creation and manage workspace access. Excellent knowledge in Performance optimization and tuning of PBI reports. Weekly and monthly presentation to the business users about the reports and their changes as required. Strong working knowledge of SQL queries Worked on Power BI Visuals including Tree Map, Funnel, Line Chart Knowledge working with Map Reports and Stacked Reports Strong Knowledge on Power Query Lists, Tables and Records Created Scorecards, Bookmarks, ToolTip, conditional formatting, Drill Down Reports, Drill Through Report. What You Will Bring To The Team B.E/B. Tech/M.Tech in Computer Science or related technical degree. Knowledge BI & Oracle cloud. Knowledge on SQL is added advantage. Fast learner and good problem-solving skills Good oral and written communication skills Ability to manage multiple users and systems Zones offers a comprehensive Benefits package. While we’re committed to providing top-tier solutions, we’re just as committed to supporting our own teams. We offer a competitive compensation package where our team members are rewarded based on their performance and recognized for the value, they bring into our business. 
Our team members enjoy a variety of comprehensive benefits, including Medical Insurance Coverage, Group Term Life and Personal Accident Cover to handle the uncertainties of life, and a flexible leave policy to balance their work life. At Zones, work is more than a job – it's an exciting career immersed in an inventive, collaborative culture. If you’re interested in working on the cutting edge of IT innovation, sales, engineering, operations, administration, and more, Zones is the place for you! All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status or on the basis of disability.
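The Zones role above leans on SQL-based data extraction and troubleshooting behind BI dashboards. As a rough, non-authoritative sketch (the tables and columns are invented for illustration, not taken from the posting), a common dashboard-discrepancy check looks for fact rows that silently drop out of an inner-join visual:

```python
import sqlite3

# Hypothetical reporting tables: a fact table and a dimension table whose
# mismatches often explain "wrong totals" on a dashboard.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (id INTEGER, region_id INTEGER, amount REAL);
    CREATE TABLE regions (region_id INTEGER, name TEXT);
    INSERT INTO regions VALUES (1, 'North'), (2, 'South');
    INSERT INTO sales VALUES (1, 1, 100.0), (2, 2, 250.0), (3, 99, 75.0);
""")

# Troubleshooting query: sales rows whose region_id has no match in the
# dimension table -- these rows vanish from any inner-join-based visual.
orphans = conn.execute("""
    SELECT s.id, s.region_id, s.amount
    FROM sales s
    LEFT JOIN regions r ON r.region_id = s.region_id
    WHERE r.region_id IS NULL
""").fetchall()
print(orphans)  # -> [(3, 99, 75.0)]
```

The same anti-join pattern works unchanged against the warehouse behind a Power BI or Oracle Analytics Cloud dataset.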

Posted 2 weeks ago


5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Role

The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology/MCA
- 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines.
  Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.
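The ETL requirement above (hands-on data pipelines) can be illustrated with a deliberately minimal extract-transform-load sketch in plain Python. Real pipelines in Ab Initio or Spark are far richer; the inline CSV and table layout here are invented for the example:

```python
import csv
import io
import sqlite3

# Extract: read raw records (an inline CSV string stands in for a source file).
raw = io.StringIO("order_id,amount\n1,100\n2,\n3,250\n")
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records and cast fields to proper types.
clean = [(int(r["order_id"]), float(r["amount"]))
         for r in rows if r["amount"]]

# Load: write the cleaned records into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # -> (2, 350.0)
```

The shape is the same at scale: extract from a source system, validate and transform in the middle, load into a governed target, then reconcile counts at the end.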
Job Family Group: Technology

Job Family: Digital Software Engineering

Time Type:

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi”) invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.

Posted 2 weeks ago


3.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Role

The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology/MCA
- 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines.
  Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica
- Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.
Job Family Group: Technology

Job Family: Digital Software Engineering

Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi”) invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.

Posted 2 weeks ago


2.0 - 5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Ingram Micro is a leading technology company for the global information technology ecosystem. With the ability to reach nearly 90% of the global population, we play a vital role in the worldwide IT sales channel, bringing products and services from technology manufacturers and cloud providers to business-to-business technology experts. Our market reach, diverse solutions and services portfolio, and digital platform Ingram Micro Xvantage™ set us apart.

About The Company

Ingram Micro is an integral part of technology ecosystems, helping our partners grow and thrive through the creation and delivery of Information Technology, Cloud solutions and Lifecycle services. With more than $54 billion in revenue and the ability to reach 90% of the global population, we are one of the world’s largest technology distributors, serving our partners through operations in 61 countries with 29,000 associates. Ingram Micro is the business behind the world’s brands, providing more ways to realize the promise of technology. We are on a path to transform Ingram Micro into a Digital Platform Business, based on experience and outcomes. Our strategy is intently focused on the three main users (Customer, Associate, Vendor) of our Digital Platform, which will be connected via Data and Intelligence. Together, these platforms are Ingram Micro Xvantage™.

Job Description:

We are seeking a motivated and detail-oriented Workday Analyst to join our team. The ideal candidate will have 2-5 years of experience in providing support for the Workday Time Tracking, Absence and Scheduling modules. This role will involve handling both L1 (first-line) and L2 (second-line) support tasks, ensuring smooth operations and resolving user issues efficiently.

Key Responsibilities:

L1 Support:
- User Support: Provide first-line support to end users for issues related to Workday Time Tracking, Absence Management and Scheduling.
- Issue Logging: Record, categorize, and prioritize incoming support tickets using the company’s ticketing system.
- Basic Troubleshooting: Diagnose and resolve basic system issues, including login problems, navigation assistance, and data entry errors.
- Guidance and Training: Assist users by providing guidance on Workday features, best practices, and procedures.
- Escalation: Escalate complex issues to Workday Administrators when necessary, ensuring detailed documentation of the problem.

L2 Support:
- Advanced Troubleshooting: Address and resolve more complex issues that have been escalated, including configuration problems, data discrepancies, and integration errors.
- System Configuration: Assist with the configuration and maintenance of Workday Time Tracking, Absence Management and Scheduling settings according to business requirements.
- Data Analysis: Perform data analysis and validation to identify and correct inconsistencies or errors in timesheets, schedules, or system-generated reports.
- Testing and Documentation: Participate in system testing during updates or new feature rollouts, ensuring thorough documentation of test results and issue resolutions.
- Collaboration: Work closely with the HRIS team to resolve system issues and improve overall system performance.
- Knowledge Sharing: Develop and update support documentation, FAQs, and training materials for both L1 support staff and end users.

Qualifications:
- Education: Bachelor’s degree in Information Technology, Human Resources, Business Administration, or a related field preferred.
- Experience: 2-5 years of experience supporting the Workday Time Tracking and Scheduling modules.
- Technical Skills: Familiarity with Workday configuration and administration. Basic understanding of HR processes and time management systems. Proficiency in ticketing systems and ITIL-based support workflows.
- Soft Skills: Strong analytical and problem-solving abilities. Excellent communication skills, both written and verbal.
  Ability to work independently and in a team environment. Detail-oriented with a focus on delivering high-quality support.

Preferred Qualifications:
- Workday Pro Certification/Knowledge in Time Tracking, Absence Management or Scheduling.
- Experience in supporting other Workday modules such as Payroll.
- Knowledge of SQL or other data querying languages for data analysis.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional development and Workday certification.
- A collaborative work environment with a focus on innovation and continuous improvement.
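The data-validation responsibility in the Workday listing above (finding inconsistencies in timesheets) can be sketched in a few lines of Python. The field layout is a hypothetical simplification, not Workday's actual data model; the idea is simply pairwise interval-overlap detection:

```python
from datetime import datetime

# Hypothetical time-tracking entries for one worker: (start, end) strings.
entries = [
    ("2024-05-01 09:00", "2024-05-01 12:00"),
    ("2024-05-01 11:30", "2024-05-01 15:00"),  # overlaps the first entry
    ("2024-05-01 16:00", "2024-05-01 18:00"),
]

def find_overlaps(entries):
    """Return index pairs of entries whose time ranges overlap."""
    parsed = [(datetime.strptime(s, "%Y-%m-%d %H:%M"),
               datetime.strptime(e, "%Y-%m-%d %H:%M")) for s, e in entries]
    overlaps = []
    for i in range(len(parsed)):
        for j in range(i + 1, len(parsed)):
            # Two ranges overlap when each starts before the other ends.
            if parsed[i][0] < parsed[j][1] and parsed[j][0] < parsed[i][1]:
                overlaps.append((i, j))
    return overlaps

print(find_overlaps(entries))  # -> [(0, 1)]
```

In practice the same check would run over a report extract (or via SQL) and feed the flagged pairs back to the L2 queue for correction.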

Posted 2 weeks ago


3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Netradyne harnesses the power of Computer Vision and Edge Computing to revolutionize the modern-day transportation ecosystem. We are a leader in fleet safety solutions. With growth exceeding 4x year over year, our solution is quickly being recognized as a significant disruptive technology. Our team is growing, and we need forward-thinking, uncompromising, competitive team members to continue to facilitate our growth.

Job Title: Technical Support Engineer - Automation
Department: Customer Success
Location: Bangalore
Employment Type: Full-Time

Job Summary:

We are seeking a Data Analyst to join our Support team. The ideal candidate will be responsible for analyzing support data, developing automation solutions, and optimizing workflows to enhance the efficiency of our technical support operations. This role requires a combination of data analytics, automation, and problem-solving skills to improve customer support processes and drive informed decision-making.

Key Responsibilities:

Automation & Process Optimization:
- Design, develop, and maintain automation tools, scripts, and workflows to streamline repetitive support tasks.
- Automate processes such as data collection, ticket categorization, issue resolution, and reporting.
- Propose and implement automation strategies to improve response times and support efficiency.

Data Analysis & Reporting:
- Collect, clean, and analyze support-related data to identify trends, bottlenecks, and areas for improvement.
- Generate insights and reports for stakeholders to support business decisions.
- Handle ad-hoc data requests, including retrieval, analysis, and custom reporting.

Technical Support & Integration:
- Develop dashboards and monitoring tools to provide real-time visibility into support operations.
- Troubleshoot and resolve complex technical issues using automation and data-driven insights.
- Work with APIs (e.g., REST) to integrate data across systems such as Salesforce, Jira, and internal tools.
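A minimal sketch of the ticket-categorization automation described in the responsibilities above. The keyword rules and category names are invented for illustration; a production version would pull tickets from, and write categories back to, the actual ticketing system (e.g., Jira) via its API:

```python
# Keyword-based rules mapping ticket text to a support category.
# The categories and keywords here are illustrative assumptions only.
RULES = {
    "login": "Authentication",
    "password": "Authentication",
    "dashboard": "Reporting",
    "api": "Integration",
}

def categorize(ticket_text):
    """Return the first matching category, else a triage bucket."""
    text = ticket_text.lower()
    for keyword, category in RULES.items():
        if keyword in text:
            return category
    return "Needs triage"

tickets = [
    "Cannot login to the portal",
    "Dashboard totals look wrong",
    "Where do I download the invoice?",
]
print([categorize(t) for t in tickets])
# -> ['Authentication', 'Reporting', 'Needs triage']
```

Even a rule table this simple removes a manual triage step; the natural next iteration is tracking rule hit-rates in the reporting layer to see which rules earn their keep.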
Documentation & Collaboration:
- Maintain comprehensive documentation for automation workflows, data models, and dashboards.
- Collaborate with support engineers, product teams, and customer success managers to align on process improvements.

Technical Skills:
- Strong SQL skills for database querying and management.
- Proficiency in Python and Bash scripting for automation.
- Hands-on experience with APIs (REST) and data integration techniques.
- Knowledge of UNIX/Linux commands and shell scripting.
- Experience with Jira for issue tracking and reporting.
- Familiarity with Salesforce for CRM and support case management.
- Exposure to DevOps tools like Git for version control.
- Experience with cloud platforms (AWS preferred) is a plus.

Soft Skills:
- Strong problem-solving and analytical skills to identify and resolve complex issues effectively.
- Excellent communication and collaboration abilities to work seamlessly with cross-functional teams and stakeholders.
- Ability to thrive in a fast-paced, dynamic environment and perform under high-pressure situations.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- 3+ years of experience in software development, automation engineering, or a similar role.
- Proven experience in automating support processes and workflows.

We are committed to an inclusive and diverse team. Netradyne is an equal-opportunity employer. We do not discriminate based on race, color, ethnicity, ancestry, national origin, religion, sex, gender, gender identity, gender expression, sexual orientation, age, disability, veteran status, genetic information, marital status, or any legally protected status. If there is a match between your experiences/skills and the Company's needs, we will contact you directly. Applicants only - recruiting agencies, please do not contact.

Recruitment Fraud Alert! There has been an increase in fraud that targets job seekers.
Scammers may present themselves to job seekers as Netradyne employees or recruiters. Please be aware that Netradyne does not request sensitive personal data from applicants via text/instant message or any unsecured method, does not promise any advance payment for work-equipment set-up, and does not use recruitment or job-sourcing agencies that charge candidates an advance fee of any kind. Official communication about your application will only come from emails ending in ‘@netradyne.com’ or ‘@us-greenhouse-mail.io’. Please review and apply to our available job openings at Netradyne.com/company/careers. For more information on avoiding and reporting scams, please visit the Federal Trade Commission's job scams website.

Posted 2 weeks ago


4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description

Our Marketing and Loyalty team is the strategic force behind United’s industry-leading brand and experience, supporting revenue growth by turning customers into lifelong United flyers. Our marketing communications, market research and brand teams drive travelers’ familiarity and engagement with the United brand. Our product, design and service teams bring the signature United brand to life in our clubs and onboard our aircraft, ensuring a seamless, premier experience. And when customers choose United again and again, that’s because the loyalty team has been hard at work crafting an award-winning program. Our loyalty team manages United MileagePlus®, building travel and lifestyle partnerships that customers can engage with every day, and works with our Star Alliance partners to ensure United can take you anywhere you want to go.

Job Overview And Responsibilities

United Airlines reaches out to customers and potential travelers via digital campaigns with new information, travel inspiration, personalized offers, promos, etc. The Digital Marketing & Personalized Offers team at IKC supports all such digital acquisition initiatives with insights to help strategize campaigns and analytics to help measure performance. We work closely with stakeholders in the US to bring these campaigns to life and continuously improve performance with learnings and actionable insights.
- Ensure alignment and prioritization with business objectives and initiatives – help teams make faster, smarter decisions
- Conduct exploratory analysis, identify opportunities, and proactively suggest initiatives to meet marketing objectives
- Assist in campaign planning, targeting and audience identification; measure campaign results and performance using data analysis
- Create content for and deliver presentations to United leadership and external stakeholders
- Own workstreams to deliver results, while leading other team members
- Ensure seamless stakeholder management and keep lines of communication open with all stakeholders
- Create, modify and automate reports and dashboards - take ownership of reporting structure and metrics; clearly and effectively communicate relevant information to decision makers using data visualization tools

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications

What’s needed to succeed (Minimum Qualifications):
- Bachelor's degree or 4 years of relevant work experience
- 4+ years of experience in Analytics and working with analytical tools
- Proven comfort and an intellectual curiosity for working with very large data sets
- Experience in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
- Proficiency in using database querying tools and writing complex queries and procedures using Teradata SQL and/or Microsoft SQL
- Familiarity with one or more reporting tools – Spotfire/Tableau
- Advanced-level comfort with Microsoft Office, especially Excel and PowerPoint
- Ability to communicate analysis in a clear and precise manner
- High sense of ownership of work, and ability to lead a team
- Ability to work under time constraints
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's degree
- Bachelor’s degree in a quantitative field like Math, Statistics, Analytics and/or Business
- SQL/Python/R
- Visualization tools – Tableau/Spotfire
- Understanding of digital acquisition channels
- Strong knowledge of either Python or R

GGN00002080
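The United listing above asks for complex SQL over campaign data. As a hedged sketch, with sqlite3 standing in for Teradata/MS SQL and an invented send/click schema, a per-campaign click-through-rate query combines a join, aggregation, and a HAVING filter:

```python
import sqlite3

# Invented campaign tables: who was sent each campaign, and who clicked.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sends  (campaign TEXT, customer_id INTEGER);
    CREATE TABLE clicks (campaign TEXT, customer_id INTEGER);
    INSERT INTO sends  VALUES ('spring', 1), ('spring', 2), ('spring', 3),
                              ('summer', 1), ('summer', 2);
    INSERT INTO clicks VALUES ('spring', 1), ('summer', 1), ('summer', 2);
""")

# Click-through rate per campaign, keeping only campaigns with CTR >= 0.5.
rows = conn.execute("""
    SELECT s.campaign,
           COUNT(DISTINCT c.customer_id) * 1.0
               / COUNT(DISTINCT s.customer_id) AS ctr
    FROM sends s
    LEFT JOIN clicks c
      ON c.campaign = s.campaign AND c.customer_id = s.customer_id
    GROUP BY s.campaign
    HAVING COUNT(DISTINCT c.customer_id) * 1.0
               / COUNT(DISTINCT s.customer_id) >= 0.5
""").fetchall()
print(rows)  # -> [('summer', 1.0)]
```

The LEFT JOIN keeps customers who never clicked in the denominator, which is exactly the subtlety that distinguishes a correct CTR from an inflated one.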

Posted 2 weeks ago


0 years

0 Lacs

Hyderabad, Telangana, India

On-site


DESCRIPTION

TOC (Transportation Operation Center) is the central command and control center for ‘Transportation Execution’ across the Amazon Supply Chain network, supporting multiple geographies like NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FC) and from Amazon FCs to carrier hubs. In case of any exceptions, TOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving, TOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, TOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon’s ability to serve its customers on time.

Purview of a Trans Ops Specialist

A Trans Ops Specialist at TOC facilitates the flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at TOC works across two verticals – Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that freight is picked up on time and is delivered at the FC as per the given appointment; the Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle from pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as per promise; the Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises.
A Trans Ops Specialist provides timely resolution to the issue at hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements, analyze data, notice trends, and drive customer experience without compromising on time. The candidate should have a basic understanding of logistics and should be able to communicate clearly in written and oral form. A Trans Ops Specialist should be able to ideate process improvements and should have the zeal to drive them to conclusion.

Responsibilities include, but are not limited to:
- Communication with external customers (Carriers, Vendors/Suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers)
- Ability to pull data from numerous databases (using Excel, Access, SQL and/or other data management systems) and to perform ad hoc reporting and analysis as needed is a plus
- Develop and/or understand performance metrics to assist with driving business results
- Ability to scope out business and functional requirements for the Amazon technology teams who create and enhance the software systems and tools used by TOC
- Must be able to quickly understand the business impact of trends and make decisions that make sense based on available data
- Must be able to systematically escalate problems or variances in information and data to the relevant owners and teams, and follow through on resolutions to ensure they are delivered
- Work within various time constraints to meet critical business needs, while measuring and identifying activities performed
- Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and variances to goals, and present these findings in a review forum
- Providing real-time customer experience by working in a 24*7 operating environment
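A toy sketch of the ad hoc reporting described above, counting network exceptions by carrier with Python's standard library. The records are invented for the example; in the role, the data would be pulled from internal tools via Excel, Access, or SQL:

```python
from collections import Counter

# Hypothetical exception records pulled from a tracking tool.
exceptions = [
    {"carrier": "CarrierA", "issue": "late pickup"},
    {"carrier": "CarrierB", "issue": "missed appointment"},
    {"carrier": "CarrierA", "issue": "late pickup"},
    {"carrier": "CarrierA", "issue": "damaged freight"},
]

# Ad hoc report: exception volume per carrier, highest first.
by_carrier = Counter(e["carrier"] for e in exceptions)
for carrier, count in by_carrier.most_common():
    print(f"{carrier}: {count}")
# -> CarrierA: 3
#    CarrierB: 1
```

This is the kind of quick aggregation that turns a pile of tickets into a trend worth escalating to the carrier-management team.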
BASIC QUALIFICATIONS
- Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics
- Experience with Excel

PREFERRED QUALIFICATIONS
- Experience in SQL

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ASSPL - Telangana
Job ID: A2880279

Posted 2 weeks ago


2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…

When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role

The Business Intelligence Analyst will support the ongoing design and development of dashboards, reports, and other analytics studies or needs. To be successful in the role you’ll need to be intellectually curious, detail-oriented, open to new ideas, and possess data skills and a strong aptitude for quantitative methods. The role requires strong SQL skills and wide experience using BI visualization tools like Tableau and Power BI.

Your Role Accountabilities
- With the support of other analysis and technical teams, collect and analyze stakeholders’ requirements.
- Responsible for developing interactive and user-friendly dashboards and reports, partnering with UI/UX designers.
- Be experienced in BI tools like Power BI, Tableau, Looker, MicroStrategy and Business Objects, and be capable of and eager to learn new tools
- Be able to quickly shape data into reporting and analytics solutions
- Work with the data and visualization platform team on reporting-tool updates, understanding how new features can benefit our stakeholders in the future, and adapting existing dashboards and reports
- Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more

Qualifications & Experiences
- 2+ years of experience working with BI tools or any data-specific role, with a sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP)
- Skills in BI tools and BI systems, such as Power BI, SAP BO, Tableau, Looker, MicroStrategy, etc.: creating data-rich dashboards, implementing Row-Level Security (RLS) in Power BI, writing DAX expressions, and developing custom BI products with scripting and programming languages such as R, Python, etc.
- In-depth understanding of and experience with BI stacks
- The ability to drill down on data and visualize it in the best possible way through charts, reports, or dashboards
- Self-motivated and eager to learn
- Ability to communicate with business as well as technical teams
- Strong client management skills
- Ability to learn and quickly respond to a rapidly changing business environment
- An analytical and problem-solving mindset and approach

Not Required But Preferred Experience
- BA/BS or MA/MS in a design-related field, or equivalent experience (relevant degree subjects include computer science, digital design, graphic design, web design, web technology)
- Understanding of software development architecture and technical aspects

How We Get Things Done…

This last bit is probably the most important!
Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you, and we look forward to discussing them during your interview.

Championing Inclusion at WBD

Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
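The BI analyst listing above centers on quickly shaping data into reporting solutions. As a bare-bones, non-authoritative illustration (the records are invented), the core of many dashboard views is a small pivot, here region by quarter in plain Python; any real implementation would use Power BI or Tableau over far larger data:

```python
# Invented sales records to pivot: rows = region, columns = quarter.
records = [
    ("North", "Q1", 100), ("North", "Q2", 150),
    ("South", "Q1", 80),  ("South", "Q2", 120),
    ("North", "Q1", 50),
]

# Build a nested dict: pivot[region][quarter] = summed amount.
pivot = {}
for region, quarter, amount in records:
    pivot.setdefault(region, {}).setdefault(quarter, 0)
    pivot[region][quarter] += amount

print(pivot)
# -> {'North': {'Q1': 150, 'Q2': 150}, 'South': {'Q1': 80, 'Q2': 120}}
```

The same reshape is what a matrix visual or an OLAP cube performs under the hood: group on two dimensions, aggregate a measure.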

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the continuous improvement and optimisation of the managed services process, tools and services.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Job Description
Job Title: .NET + Azure Developer (MVC, .NET Core, .NET Core Web API, Blazor, React)
Location: Hyderabad/Bangalore
Position Type: [Full-time/Contract]

Job Summary
We are seeking a skilled and versatile .NET + Azure Developer with expertise in .NET MVC, .NET Core, .NET Core Web API, and modern web technologies such as Blazor, React, and JavaScript. The ideal candidate should be proficient in C#, with a strong understanding of cloud technologies, particularly Azure PaaS services such as Azure Functions and Azure Service Bus. Familiarity with incident management processes and ServiceNow is also essential. You will be responsible for developing, maintaining, and optimizing applications and services while working closely with cross-functional teams to meet business objectives.

Key Responsibilities
Develop and maintain applications using .NET MVC, .NET Core, .NET Core Web API, and Blazor.
Implement front-end functionality using React, JavaScript, and other relevant technologies.
Design and optimize relational databases using SQL and develop stored procedures.
Utilize Azure PaaS services, including Azure Functions and Azure Service Bus, to build scalable cloud-based solutions.
Develop RESTful APIs using .NET Core Web API and ensure integration with front-end applications.
Write clean, maintainable, and efficient C# code.
Collaborate with product managers, designers, and other developers to ensure software meets business requirements.
Troubleshoot and resolve issues in production environments.
Provide solutions for incident management, ensuring prompt resolution of incidents.
Contribute to continuous improvement by identifying and suggesting enhancements to processes, tools, and systems.
Adhere to best practices for coding, testing, and deployment.
Participate in regular team meetings, reviews, and knowledge-sharing sessions.

Required Qualifications
Proficiency in C# and experience with .NET MVC, .NET Core, and .NET Core Web API.
Strong knowledge of Blazor, React, and JavaScript.
Experience in designing and querying SQL-based databases.
Familiarity with Azure PaaS services, specifically Azure Functions and Azure Service Bus.
Experience developing and consuming RESTful APIs.
Experience with incident management processes, including familiarity with ServiceNow.
Strong problem-solving skills and ability to work independently.
Strong communication skills, with the ability to gather requirements and interact effectively with clients on a day-to-day basis.
Ability to thrive in a fast-paced, team-oriented managed services environment.

Preferred Qualifications
Familiarity with CI/CD pipelines and DevOps tools.
Knowledge of Azure DevOps or similar platforms.
Experience with containerization technologies like Docker.
Knowledge of Agile methodologies.

Education And Experience
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
4+ years of hands-on experience as a .NET Developer or in a similar role.

Why Join Us?
PwC offers a dynamic, fast-paced work environment where innovation thrives.
Opportunity to work with cutting-edge technologies in cloud computing and web development.
Collaborative culture with opportunities for professional growth and career advancement.
Competitive compensation package and benefits.
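This role builds RESTful APIs with .NET Core Web API, but the contract it describes (a route returning a JSON body with a status code) is language-agnostic. A rough stdlib-Python sketch of that shape, with a hypothetical /health endpoint; the real implementation would of course be a C# controller:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical /health endpoint illustrating the REST contract:
# route -> status code + JSON body.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consume the API the way a front-end integration would.
url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
server.shutdown()

print(payload)  # {'status': 'ok'}
```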

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Job Title: Full Stack Engineer
Experience Level: 5–9 years
Employment Type: [Full-time]

About The Role
We’re looking for experienced Full Stack Engineers to join our growing team. You’ll play a key role in building highly interactive, AI-driven applications, from chat interfaces and operator-style workflows to dynamic dashboards and web/mobile apps. If you thrive in fast-paced environments, enjoy working across the stack, and are passionate about creating seamless user experiences, we’d love to hear from you.
Key Responsibilities
Design and develop end-to-end solutions across web and mobile platforms
Create intelligent, AI-driven interfaces such as chat and dashboard experiences
Build and maintain scalable and maintainable front-end and back-end architecture
Collaborate with cross-functional teams on design, architecture, and product decisions
Optimize applications for performance and scalability
Write clean, efficient, and well-documented code

Core Skillset

Frontend
React (vanilla with Vite, or frameworks like Next.js)
TypeScript
Component libraries (e.g., shadcn/ui)
Tailwind CSS
React Server Components & SSR
State management: Zustand, TanStack Query
Schema validation: Zod
Forms & data viz: React Hook Form, Recharts, Nivo

Backend
Node.js
ORMs: Prisma, Drizzle
Data querying: Server Actions, TanStack Query
Databases: PostgreSQL, MongoDB

DevOps & Cloud
Docker & Kubernetes
CI/CD with GitHub Actions
Experience with cloud platforms: AWS, Azure, GCP

Bonus
Python scripting or backend experience
Building chat interfaces or operator-style workflows
Experience with real-time systems or complex dashboards

Qualification: BE/BTech or MCA
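Zod, listed under schema validation, checks incoming data against a declared schema at runtime in TypeScript. A loose Python analogue of the same validate-at-the-boundary idea, treating a dataclass as the schema; the User type and field names are invented for illustration:

```python
from dataclasses import dataclass, fields

# Hypothetical 'schema': field names and annotated types play the role
# Zod's object schema plays in TypeScript.
@dataclass
class User:
    name: str
    age: int

def parse(cls, data: dict):
    """Validate a raw dict against a dataclass schema; raise on mismatch."""
    kwargs = {}
    for f in fields(cls):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        value = data[f.name]
        if not isinstance(value, f.type):
            raise TypeError(f"{f.name}: expected {f.type.__name__}")
        kwargs[f.name] = value
    return cls(**kwargs)

user = parse(User, {"name": "Ada", "age": 36})
print(user)  # User(name='Ada', age=36)
```

As with Zod, the payoff is that malformed input fails loudly at the edge of the system instead of corrupting state deeper in.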

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Primary Job Function
This role will focus on implementing tools and strategies to analyze large amounts of data, identify trends, and convert information into business insights. The role will set up information formats and customized views for stakeholders across the company in various leadership, marketing and sales roles.

Core Job Responsibilities
Lead as a data-product owner, translating business needs into data projects and data projects into business implications
Partner with internal stakeholders (SFE, CRM, Marketing, Ethical & Trade Sales, MI, Finance) to identify opportunities to implement data solutions to business problems
Actively contribute to the business intelligence plan, BI environment and tools
Build a strategic roadmap for Data & Analytics, including Data Science, as part of ANI India’s overall customer, channel, sales force engagement and upstream-downstream strategy
Build reports and models for forecasting, trending and predictive analytics; manage and execute ad-hoc reporting, dashboarding and analytics requirements
Drive required data mining and present key strategic solutions and interpretations to the business for real-time decision-making, leveraging both traditional (e.g. data lake) and advanced (data science and AI) technologies and methodologies
Promote data-based storytelling by summarizing and highlighting the points of analysis with effective visualization techniques through BI delivery platforms

Work Experience
6+ years of experience
Prior experience in Pharma / FMCG / FMHG will be an added advantage
Strong knowledge of tools: querying languages (SQL, SAS, etc.), visualization (Tableau, Raw, etc.), and analytics (MS Excel, Power BI, Adobe Analytics, etc.)
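Reports for forecasting and trending of the kind this role builds often start from simple window statistics. A toy sketch of a 3-month moving average in plain Python; the monthly sales figures are made up:

```python
# Minimal trending sketch: a rolling mean smooths month-to-month noise so a
# dashboard line shows the underlying direction of the series.
def moving_average(series, window=3):
    return [
        round(sum(series[i - window + 1 : i + 1]) / window, 2)
        for i in range(window - 1, len(series))
    ]

monthly_sales = [100, 120, 90, 130, 150, 170]
trend = moving_average(monthly_sales)
print(trend)  # [103.33, 113.33, 123.33, 150.0]
```

In practice the same rolling computation would come from SQL window functions or a BI tool's quick-calc, but the arithmetic is identical.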

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: Assistant Manager - Data Engineer
Location: Andheri (Mumbai)
Job Type: Full-Time
Department: IT

Position Overview:
The Assistant Manager - Data Engineer will play a pivotal role in the design, development, and maintenance of data pipelines that ensure the efficiency, scalability, and reliability of our data infrastructure. This role will involve optimizing and automating ETL/ELT processes, as well as developing and refining databases, data warehouses, and data lakes. As an Assistant Manager, you will also mentor junior engineers and collaborate closely with cross-functional teams to support business goals and drive data excellence.

Key Responsibilities:
Data Pipeline Development: Design, build, and maintain efficient, scalable, and reliable data pipelines to support data analytics, reporting, and business intelligence initiatives.
Database and Data Warehouse Management: Develop, optimize, and manage databases, data warehouses, and data lakes to enhance data accessibility and business decision-making.
ETL/ELT Optimization: Automate and optimize data extraction, transformation, and loading (ETL/ELT) processes, ensuring efficient data flow and improved system performance.
Data Modeling & Architecture: Develop and maintain data models to enable structured data storage, analysis, and reporting in alignment with business needs.
Workflow Management Systems: Implement, optimize, and maintain workflow management tools (e.g., Apache Airflow, Talend) to streamline data engineering tasks and improve operational efficiency.
Team Leadership & Mentorship: Guide, mentor, and support junior data engineers to enhance their skills and contribute effectively to projects.
Collaboration with Cross-Functional Teams: Work closely with data scientists, analysts, business stakeholders, and IT teams to understand requirements and deliver solutions that align with business objectives.
Performance Optimization: Continuously monitor and optimize data pipelines and storage solutions to ensure maximum performance and cost efficiency.
Documentation & Process Improvement: Create and maintain documentation for data models, workflows, and systems. Contribute to the continuous improvement of data engineering practices.

Qualifications:
Educational Background: B.E., B.Tech., or MCA
Professional Experience: At least 5 to 7 years of experience in a data engineering or similar role, with hands-on experience in building and optimizing data pipelines, ETL processes, and database management.
Technical Skills:
Proficiency in Python and SQL for data processing, transformation, and querying.
Experience with modern data warehousing solutions (e.g., Amazon Redshift, Snowflake, Google BigQuery, Azure Data Lake).
Strong background in data modeling (dimensional, relational, star/snowflake schema).
Hands-on experience with ETL tools (e.g., Apache Airflow, Talend, Informatica) and workflow management systems.
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and distributed data processing frameworks (e.g., Apache Spark).
Data Visualization & Exploration: Familiarity with data visualization tools (e.g., Tableau, Power BI) for analysis and reporting.
Leadership Skills: Demonstrated ability to manage and mentor a team of junior data engineers while fostering a collaborative and innovative work environment.
Problem-Solving & Analytical Skills: Strong analytical and troubleshooting skills with the ability to optimize complex data systems for performance and scalability.
Experience in Pharma/Healthcare (preferred but not required): Knowledge of the pharmaceutical industry and experience with data in regulated environments.

Desired Skills:
Familiarity with industry-specific data standards and regulations.
Experience working with machine learning models or data science pipelines is a plus.
Strong communication skills with the ability to present technical data to non-technical stakeholders.

Why Join Us:
Impactful Work: Contribute to the pharmaceutical industry by improving data-driven decisions that impact public health.
Career Growth: Opportunities to develop professionally in a fast-growing industry and company.
Collaborative Environment: Work with a dynamic and talented team of engineers, data scientists, and business stakeholders.
Competitive Benefits: Competitive salary, health benefits and more.
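The ETL/ELT pipeline work at the heart of this role follows the extract-transform-load shape sketched below. The standard-library sqlite3 module stands in for a real warehouse (Redshift, Snowflake, etc.), and the source rows and cleaning rules are invented:

```python
import sqlite3

def extract():
    # Raw source rows; the blanks and casing problems are deliberate.
    return [("  Mumbai ", "42"), ("pune", "17"), ("", "9")]

def transform(rows):
    cleaned = []
    for city, qty in rows:
        city = city.strip().title()
        if not city:            # basic data-quality rule: drop blank keys
            continue
        cleaned.append((city, int(qty)))
    return cleaned

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (city TEXT, qty INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
result = conn.execute("SELECT city, qty FROM orders ORDER BY city").fetchall()
print(result)  # [('Mumbai', 42), ('Pune', 17)]
```

Tools like Airflow or Talend schedule and monitor exactly this kind of step sequence; the per-step logic stays this simple at its core.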

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Description
Opportunity: Do you want to be a part of the team that ensures Amazon keeps its “best price” promise across millions of products worldwide? Does the challenge of driving decisions in a dynamic environment excite you? Do you love solving complex business problems using technology? Are you seeking an environment where you can drive innovation? Are you a passionate self-starter? If the answer to the above questions is a resounding “YES”, read on!

Job Description
The Manager - Operations will lead and manage a team of high-performing individuals responsible for end-to-end pricing operations management of retail categories in various Amazon international marketplaces. The job involves organizing, planning, prioritizing and scheduling work assignments, in addition to owning the production and quality KRAs for this team. You will manage performance measured on operations/business goals and SLA metrics. The role involves influencing peers and stakeholders in other functions to achieve the operational and business goals assigned. You will be the first point of contact to the retail teams of Amazon worldwide and must therefore be comfortable and confident liaising with remote teams. You will interface and drive the agenda with the automation and technology teams to build scalable solutions. In addition, you will be responsible for ensuring the safety, security and integrity of Amazon’s systems and data. The successful candidate should have the ability to work at all levels of detail to accomplish team/organization goals. You will identify individual strengths of team members and contribute to talent advancement along with succession planning opportunities within the company. The ideal candidate actively seeks to understand Amazon’s core business values and initiatives, and translates those into everyday practices.
Some of the key result areas include, but are not limited to:
Responsibility for meeting operational and business goals
Driving appropriate data-oriented analysis, adoption of technology solutions and process improvement projects to achieve operational and business goals
Managing stakeholder communication across multiple lines of business on operational milestones, process changes, escalations, etc.
Ensuring high quality standards for interviewing and hiring employees at all levels of the organization
Executing specific people programs on coaching and development and team engagement

The role requires working in rotational shifts. Candidates applying for this role should be willing to work in day/night shifts.

Basic Qualifications
2+ years of program or project management experience
2+ years of team management experience
Knowledge of Microsoft Office products and applications, especially MS Excel, Word & PowerPoint, at an advanced level
Experience leading process improvements
Bachelor's degree or equivalent

Preferred Qualifications
Knowledge of databases (querying and analyzing) such as SQL and MySQL, ETL, and working with large data sets
Knowledge of Lean, Kaizen and Six Sigma concepts
Experience in managing critical operational processes, with SLA responsibility

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2943533
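SLA metrics of the kind this team owns are typically percentile summaries over handling times. A minimal standard-library sketch; the times and the 15-minute threshold are purely illustrative, not Amazon's actual targets:

```python
import statistics

# Handling times for a batch of tasks, in minutes (made-up numbers).
handle_times = [12, 7, 9, 30, 11, 8, 14, 10, 9, 13]

# p90: the value 90% of tasks finish under (statistics.quantiles with n=10
# returns nine cut points; the last one is the 90th percentile).
p90 = statistics.quantiles(handle_times, n=10)[-1]

# Share of tasks completed within a hypothetical 15-minute SLA.
within_sla = sum(t <= 15 for t in handle_times) / len(handle_times)

print(p90, within_sla)
```

Percentiles are preferred to averages for SLA reporting because a single slow task (the 30 above) moves the tail metric without being hidden by many fast ones.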

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chandigarh, India

On-site


Company Profile
Oceaneering is a global provider of engineered services and products, primarily to the offshore energy industry. We develop products and services for use throughout the lifecycle of an offshore oilfield, from drilling to decommissioning. We operate the world's premier fleet of work class ROVs. Additionally, we are a leader in offshore oilfield maintenance services, umbilicals, subsea hardware, and tooling. We also use applied technology expertise to serve the defense, entertainment, material handling, aerospace, science, and renewable energy industries.

Since 2003, Oceaneering’s India Center has been an integral part of operations for Oceaneering’s robust product and service offerings across the globe. This center caters to diverse business needs, from oil and gas field infrastructure and subsea robotics to automated material handling and logistics. Our multidisciplinary team offers a wide spectrum of solutions, encompassing Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV operations, Field Network Management, Graphics Design & Animation, and more.

In addition to these technical functions, Oceaneering India Center plays host to several crucial business functions, including Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE). Our world-class infrastructure in India includes modern offices, industry-leading tools and software, equipped labs, and beautiful campuses aligned with the future way of work. Oceaneering in India, as well as globally, has a great work culture that is flexible, transparent, and collaborative, with great team synergy.

At Oceaneering India Center, we take pride in “Solving the Unsolvable” by leveraging the diverse expertise within our team. Join us in shaping the future of technology and engineering solutions on a global scale.
Position Summary and Location
Assist with building, maintaining, and optimizing data pipelines, ensuring data flows efficiently across systems. You will work closely with senior data engineers and data analysts to support data integration, ETL (Extract, Transform, Load) processes, and overall data infrastructure.

Duties And Responsibilities
Assist in designing, building, and maintaining scalable data pipelines to move data from various sources to the data warehouse or data lake.
Help integrate data from various internal and external sources, including databases, APIs, and flat files, into centralized systems.
Assist in data migration projects by writing scripts to move data between systems while ensuring data quality and integrity.
Collaborate with the data quality team to ensure that data is accurate, consistent, and reliable.
Implement basic data validation rules and participate in data quality checks to identify and fix data anomalies or errors.
Assist in the management of databases, including tasks such as creating tables, writing SQL queries, and optimizing database performance.
Support efforts to ensure efficient data storage, indexing, and retrieval for analytics and reporting purposes.
Work closely with data analysts, business intelligence teams, and other stakeholders to understand data requirements and support their data needs.
Provide data extracts, reports, and documentation as requested by business users and analysts.
Assist in creating technical documentation for data models, pipelines, and integration processes.

Supervisory Responsibilities
This position has/does not have direct supervisory responsibilities.

Reporting Relationship
Sr. Manager, Data Estate – Business Intelligence

Qualifications
Bachelor’s degree in computer science, Information Systems, Engineering, Mathematics, or a related field.
Relevant coursework or projects involving data management, databases, or data engineering is highly desirable.
Knowledge, Skills, Abilities, And Other Characteristics
Basic understanding of data structures, algorithms, and database management systems (SQL and NoSQL).
Familiarity with SQL for querying databases and manipulating data.
Some experience with scripting languages like Python, Java, or Scala for data processing tasks.
Knowledge of data warehousing concepts and ETL processes is a plus.
Exposure to cloud platforms (AWS, Azure, Google Cloud) or data tools (e.g., Apache Spark, Hadoop) is an advantage but not required.
Strong problem-solving skills and the ability to troubleshoot data-related issues.
Detail-oriented with a focus on data accuracy and quality.

Preferred Qualifications
Internship or hands-on project experience in data engineering or a related field is a plus.
Experience working with data integration tools, cloud platforms, or big data technologies will be an added advantage.
Familiarity with version control tools such as Git is beneficial.

Closing Statement
In addition, we make a priority of providing learning and development opportunities to enable employees to achieve their potential and take charge of their future. As well as developing employees in a specific role, we are committed to lifelong learning and ongoing education, including developing people skills and identifying future supervisors and managers. Every month, hundreds of employees are provided training, including HSE awareness, apprenticeships, entry and advanced level technical courses, management development seminars, and leadership and supervisory training. We have a strong ethos of internal promotion. We can offer long-term employment and career advancement across countries and continents. Working at Oceaneering means that if you have the ability, drive, and ambition to take charge of your future, you will be supported to do so, and the possibilities are endless.

Equal Opportunity/Inclusion
Oceaneering’s policy is to provide equal employment opportunity to all applicants.
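The "basic data validation rules" in the duties above can be as simple as a table of per-field predicates applied to each incoming row. A sketch with invented rules and sensor data (the field names and ranges are illustrative, not Oceaneering's actual schema):

```python
# Each rule maps a field name to a predicate the value must satisfy.
RULES = {
    "depth_m": lambda v: v is not None and 0 <= v <= 4000,
    "sensor_id": lambda v: isinstance(v, str) and v.startswith("ROV-"),
}

def validate(row):
    """Return the names of all fields in a row that fail their rule."""
    return [name for name, rule in RULES.items() if not rule(row.get(name))]

rows = [
    {"sensor_id": "ROV-7", "depth_m": 1250},   # clean row
    {"sensor_id": "X-1", "depth_m": -5},       # two anomalies
]
errors = [validate(r) for r in rows]
print(errors)  # [[], ['depth_m', 'sensor_id']]
```

Flagging rather than silently dropping bad rows keeps the anomalies visible to the data quality team mentioned in the listing.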

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

India

Remote


🏢 Company: Natlov Technologies Pvt Ltd
🕒 Experience Required: 1–2 Years
🌐 Location: Remote (India-based candidates preferred)

🧠 About the Role:
We are seeking passionate Data Engineers with hands-on experience in building scalable, distributed data systems and high-volume transaction applications. Join us to work with modern Big Data technologies and cloud platforms to architect, stream, and analyze data efficiently.

🛠️ What We’re Looking For (Experience: 1–2 Years):
🔹 Strong hands-on programming experience in Scala, Python, and other object-oriented languages
🔹 Experience in building distributed/scalable systems and high-volume transaction applications
🔹 Solid understanding of Big Data technologies:
• Apache Spark (Structured & Real-Time Streaming)
• Apache Kafka
• Delta Lake
🔹 Experience with ETL workflows using MapReduce, Spark, and Hadoop
🔹 Proficiency in SQL querying and SQL Server Management Studio (SSMS)
🔹 Experience with Snowflake or Databricks
🔹 Dashboarding and reporting using Power BI
🔹 Familiarity with Kafka, ZooKeeper, and YARN for ingestion and orchestration
🔹 Strong analytical and problem-solving skills
🔹 Energetic, motivated, and eager to learn and grow in a collaborative team environment

📍 Work Mode: Remote
📩 How to Apply: Send your resume to techhr@natlov.com

Be a part of a passionate and forward-thinking team at Natlov Technologies Pvt Ltd, where we're redefining how data is architected, streamed, analyzed, and delivered. Let’s build the future of data together! 💼

#DataEngineer #BigData #ApacheSpark #Kafka #DeltaLake #SQL #PowerBI #Databricks #Snowflake #ETL #Python #Scala #SSMS #HiringNow #NatlovTechnologies #1to2YearsExperience #TechJobs #CareerOpportunity #RemoteJobs
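Spark Structured Streaming and Kafka, listed above, need running clusters, but the core idea behind a real-time windowed aggregation can be shown in plain Python. A toy sliding-window event counter with invented events; production jobs would express the same logic as a Spark windowed groupBy over a Kafka topic:

```python
from collections import deque

class WindowedCounter:
    """Count events per key over a sliding time window (toy illustration)."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key), oldest first

    def add(self, ts, key):
        self.events.append((ts, key))
        # Evict events that have fallen out of the window.
        while self.events and self.events[0][0] < ts - self.window:
            self.events.popleft()

    def counts(self):
        out = {}
        for _, key in self.events:
            out[key] = out.get(key, 0) + 1
        return out

w = WindowedCounter(window_seconds=60)
for ts, key in [(0, "click"), (10, "buy"), (30, "click"), (90, "click")]:
    w.add(ts, key)

# At t=90 only events from t>=30 remain in the 60s window.
print(w.counts())  # {'click': 2}
```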

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description
The Amazon Finance Operations Global Data Analytics (GDA) science team seeks a Sr. Data Scientist with the technical expertise and business intuition to invent the future of Accounts Payable at Amazon. As a key member of the science team, the Data Scientist will own high-visibility analyses, methodology, and algorithms in the Procure-to-Pay lifecycle to drive free cash flow improvements for Amazon Finance Operations. This is a unique opportunity in a growing data science and economics team with a charter to optimize operations and planning with complex trade-offs between customer experience, cash flow, and operational efficiencies in our payment processes.

Key job responsibilities
The Sr. Data Scientist's responsibilities include, but are not limited to, the following:
Manage relationships with business and operational stakeholders and product managers to innovate on behalf of customers, develop novel applications of data science methodologies, and partner with engineers and scientists to design, develop, and scale machine learning models.
Define the vision for data science in the accounts payable space in partnership with process and technology leaders.
Extract and analyze large amounts of data related to suppliers and associated business functions.
Adapt statistical and machine learning methodologies for Finance Operations by developing and testing models, running computational experiments, and fine-tuning model parameters.
Review and recommend improvements to science models and architecture as they relate to accounts payable processes and tools.
Use computational methods to identify relationships between data and business outcomes, define outliers and anomalies, and justify those outcomes to business customers.
Communicate verbally and in writing to business customers with various levels of technical knowledge, educate stakeholders on our research, data science, and ML practice, and deliver actionable insights and recommendations.
Serve as a point of contact for questions from business and operations leaders.
Develop code to analyze data (SQL, PySpark, Scala, etc.) and build statistical and machine learning models and algorithms (Python, R, Scala, etc.).

A day in the life
As a successful data scientist in GDA’s science team, you will dive deep on data from Amazon's payment practices and customer support functions, extract new assets, drive investigations and algorithm development, and interface with technical and non-technical customers. You will leverage your data science expertise and communication skills to pivot between delivering science solutions, translating knowledge of finance and operational processes into models, and communicating insights and recommendations to audiences of varying levels of technical sophistication in support of specific business questions, root cause analysis, planning, and innovation for the future. The role operates in a genuinely global environment across various functional teams, with daily interaction across North America and Europe.

About The Team
Global Data Analytics (GDA) supports decisions in AR and AP. In close cooperation with our stakeholders, we agree and build uniform metrics; use data from a ‘single source of truth’; provide automated, self-service, standard reporting; and build predictive analytics. Our topmost ambition is to actively contribute to the improvement of Amazon's Free Cash Flow through value-adding analytics. Our success is built on users' trust in our data and the reliability of our analytics tools. GDA’s data scientists and economists further that mission with rigorous statistical, econometric, and ML models to complement reporting and analysis developed by GDA’s analytical, BI, and Finance professionals.

Basic Qualifications
5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python) or statistical/mathematical software (e.g. R, SAS, Matlab)
4+ years of data scientist experience
5+ years of experience in a data scientist or similar role involving data extraction, analysis, statistical modeling and communication

Preferred Qualifications
3+ years of experience with data visualization using AWS QuickSight, Tableau, R Shiny, etc.
Experience managing data pipelines
Experience as a leader and mentor on a data science team
Master's degree in a quantitative field such as statistics, mathematics, data science, business analytics, economics, finance, engineering, or computer science

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A2942802
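"Define outliers and anomalies", from the responsibilities above, often starts with a z-score screen. A minimal standard-library sketch; the invoice amounts and the 2-sigma cutoff are illustrative only (small samples inflate the standard deviation, so real work would typically use a robust method such as median absolute deviation):

```python
import statistics

def outliers(values, threshold=2.0):
    """Flag values more than `threshold` sample standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Hypothetical invoice amounts: six near 100, one anomalous.
amounts = [100, 102, 98, 101, 99, 100, 500]
print(outliers(amounts))  # [500]
```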

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site


We are currently seeking a skilled Full Stack Developer with over 5 years of relevant experience to manage the data exchange between our servers and users. In this role, your primary focus will be developing server-side logic, ensuring high performance and responsiveness to front-end requests.

We encourage every individual to cultivate leadership qualities and provide the autonomy to make impactful decisions that align with our organizational objectives. We operate in small, decentralized teams that foster independence and creative problem-solving. This approach allows individuals to hone their skills and contribute to crafting purpose-built solutions for our clients.

**Responsibilities:**

* Work on complex, custom-designed, scalable, multi-tiered software development projects.
* Design and implement low-latency, high-availability, and performance-oriented applications.
* Integrate user-facing elements developed by front-end developers with server-side logic.
* Implement robust security measures and ensure data protection.
* Take ownership of software quality and resolve any issues related to the solutions developed.
* Engage in critical thinking to solve challenging problems, both technical and otherwise, collaborating with the team to implement effective solutions and continuously learn.

**Requirements:**

* Expertise in Python, with a strong understanding of Python web frameworks such as Django or Flask (depending on the specific technology stack).
* Excellent command of programming languages, with a preference for PHP or Python, alongside proficiency in JavaScript, jQuery, and AngularJS.
* Strong proficiency in database querying, particularly with MySQL or similar systems.
* Familiarity with Object Relational Mapper (ORM) libraries.
* Ability to integrate multiple data sources and databases into a unified system.
* Solid understanding of Python's threading limitations and multi-process architecture.
* Basic knowledge of front-end technologies, including JavaScript, HTML5, and CSS3.
* Understanding of accessibility and security compliance standards.
* Knowledge of user authentication and authorization processes across multiple systems, servers, and environments.
* Understanding of the fundamental design principles behind building scalable applications.
* Awareness of the differences between various delivery platforms (e.g., mobile vs. desktop) and the ability to optimize output accordingly.
* Capability to create database schemas that effectively represent and support business processes.
* Strong unit testing and debugging skills.
* Good communication skills and excellent problem-solving abilities.

Posted 2 weeks ago

Apply

3.0 years

5 - 8 Lacs

Hyderābād

On-site

GlassDoor logo

We are seeking an experienced and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable data solutions to support critical business needs. You will work with distributed data platforms, cloud infrastructure, and modern data engineering tools to enable efficient data processing, storage, and analytics. The role includes participation in an on-call rotation to ensure the reliability and availability of our systems and pipelines.

Key Responsibilities

* Data Platform Development: Design, develop, and maintain data pipelines and workflows on distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, or Teradata.
* Cloud Integration: Build and optimize cloud-based solutions using AWS or GCP to process and store large-scale datasets.
* Workflow Orchestration: Design and manage workflows and data pipelines using Apache Airflow to ensure scalability, reliability, and maintainability.
* Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes for efficient scalability and resource management.
* Event Streaming: Work with Kafka to implement reliable and scalable event streaming systems for real-time data processing.
* Programming and Automation: Write clean, efficient, and maintainable code in Python and SQL to automate data processing, transformation, and analytics tasks.
* Database Management: Design and optimize relational and non-relational databases to support high-performance querying and analytics.
* System Monitoring & Troubleshooting: Participate in the on-call rotation to monitor systems, address incidents, and ensure the reliability of production environments.
* Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and product managers, to understand data requirements and deliver solutions that meet business objectives. Participate in code reviews, technical discussions, and team collaboration to deliver high-quality software solutions.

This role includes participation in an on-call rotation to ensure the reliability and performance of production systems:

* Rotation Schedule: Weekly rotation beginning Tuesday at 9:00 PM PST through Monday at 9:00 AM PST.
* Responsibilities During On-Call: Monitor system health and respond to alerts promptly. Troubleshoot and resolve incidents to minimize downtime. Escalate issues as needed and document resolutions for future reference.

Requirements

* Primary technologies: BigQuery or another distributed data platform such as Hadoop/EMR/DataProc, Snowflake, Teradata, or Netezza; AWS; GCP; Kubernetes; Kafka; Python; SQL.
* Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
* 3+ years of experience in data engineering or related roles.
* Hands-on experience with distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, Snowflake, or Teradata.
* Proficiency in Apache Airflow for building and orchestrating workflows and data pipelines.
* Proficiency in Python and SQL for data processing and analysis.
* Experience with cloud platforms like AWS or GCP, including building scalable solutions.
* Familiarity with Kubernetes for container orchestration.
* Knowledge of Kafka for event streaming and real-time data pipelines.
* Strong problem-solving skills and the ability to troubleshoot complex systems.
* Excellent communication and collaboration skills to work effectively in a team environment.

Preferred

* Familiarity with CI/CD pipelines for automated deployments.
* Knowledge of data governance, security, and compliance best practices.
* Experience with DevOps practices and tools.

We have a global team of amazing individuals working on highly innovative enterprise projects and products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintechs, and multiple Silicon Valley startups. What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM), and ISO 14001:2015 (EMS) certified. We have a vibrant culture of learning through collaboration and making the workplace fun. People who work with us use cutting-edge technologies while contributing success to the company as well as to themselves. To learn more about Confiz Limited, visit https://www.linkedin.com/company/confiz/

Posted 2 weeks ago

Apply

60.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Who We Are

BCG pioneered strategy consulting more than 60 years ago, and we continue to innovate and redefine the industry. We offer multiple career paths for the world's best talent to have a real impact on business and society. As part of our team, you will benefit from the breadth and diversity of what we are doing today and where we are headed next. We count on your authenticity, exceptional work, and strong integrity. In return, we are committed to supporting you in discovering the most fulfilling career journey possible and unlocking your potential to advance the world.

What You'll Do

As a People Analyst, you'll support evidence-based decision making. You'll provide analytical input to support the Global People team and work with stakeholders across BCG. You'll add a quantitative perspective to discussions on new and existing HR processes and procedures. You will apply an analytics mindset and empower internal clients with dashboards, data, and reports to help improve processes and solve people-related challenges, and provide thought leadership on the complete cycle of talent analytics, from sourcing candidates to managing attrition.

Key Responsibilities

* Develop, design, and manage advanced Tableau and Power BI dashboards that integrate data from diverse sources
* Use SQL to query databases and retrieve relevant data for analysis and reporting
* Perform data extraction, transformation, and loading (ETL) to create efficient and scalable data models
* Act as a thought partner to internal stakeholders on various people-related challenges by developing domain expertise
* Ensure data accuracy and consistency through rigorous testing and quality checks
* Collaborate with cross-functional teams to gather requirements and understand data sources

What You'll Bring

* Undergraduate degree, preferably in engineering or another technology-related field, with high academic achievement required; advanced degree preferred
* Must have: 3-5 years of full-time experience in Tableau and Power BI dashboard development, data modeling, and SQL
* Advanced Tableau experience, including Tableau Server management, level-of-detail calculations, building custom charts, Hyper data sources, and the JavaScript API
* Advanced Power BI skills, with a focus on dashboard development including DAX calculations, custom visualizations, and Power Query
* Strong understanding of UX/UI principles for creating intuitive and visually impactful Tableau dashboards
* Knowledge of SQL for querying databases, optimizing data retrieval, and supporting data-driven decision-making
* Basic knowledge of Microsoft Excel, with skills in data manipulation, including sorting, filtering, and using formulas to analyze and organize complex data sets
* Background in HR data analysis and HR domain knowledge is preferred, though not mandatory
* Deep interest and aptitude in data, metrics, and analysis

Who You'll Work With

As part of the People Analytics team, you will modernize HR platforms, capabilities, and engagement; automate and digitize core HR processes and operations; and enable greater efficiency. You will collaborate with the global people teams and colleagues across BCG to manage the life cycle of all BCG employees. The People Management Team (PMT) comprises several centers of expertise, including HR Operations, People Analytics, Career Development, Learning & Development, Talent Acquisition & Branding, Compensation, and Mobility. Our centers of expertise work together to build out new teams and capabilities by sourcing, acquiring, and retaining the best, diverse talent for BCG. We develop talent and capabilities while enhancing managers' effectiveness and building affiliation and engagement in our global offices. The PMT also harmonizes process efficiencies, automation, and global standardization. Through analytics and digitalization, we are always looking to expand our PMT capabilities and coverage.

Additional Info

You are good at:

* Providing analytical support in metrics, reporting, and dashboard development
* Leading the technical aspects of a large project with minimal supervision
* Generating insights from large and complex datasets, and understanding the nuances and inconsistencies in data
* Multi-tasking and operating effectively in a fast-paced, customer-oriented environment; managing multiple stakeholders in a matrix organization
* Communicating and presenting technical details to non-technical stakeholders
* Demonstrating strong interpersonal skills, credibility, and the ability to excel in a collaborative setting

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.

Posted 2 weeks ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques
  • Data warehousing concepts

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table. (advanced)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
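Several of the questions above can be answered concretely with a runnable example. The sketch below uses Python's built-in sqlite3 module against a hypothetical employees table (the table name, column names, and salary figures are made up for illustration) to show the second-highest-salary subquery and the difference between WHERE, which filters rows before grouping, and HAVING, which filters groups after aggregation.

```python
import sqlite3

# Build a small in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Asha", "Sales", 50000), ("Ravi", "Sales", 70000),
     ("Meena", "HR", 60000), ("Karan", "HR", 75000)],
)

# Second-highest salary: the maximum among salaries strictly
# below the overall maximum (a scalar subquery).
second = conn.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()[0]
print(second)  # 70000

# WHERE runs per row before GROUP BY; HAVING runs per group after
# aggregation. Rows earning <= 55000 are dropped first, then only
# departments whose remaining average reaches 70000 survive.
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM employees "
    "WHERE salary > 55000 GROUP BY dept HAVING AVG(salary) >= 70000"
).fetchall()
print(rows)  # [('Sales', 70000.0)]
```

Note how HR (average 67,500 after the WHERE filter) is removed only at the HAVING stage; moving the aggregate condition into WHERE would be a syntax error, since row-level filters cannot reference aggregates.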

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies