
1697 Querying Jobs - Page 14

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview

Job Title: AFC Transaction Monitoring - Business Functional Analyst
Corporate Title: AVP
Location: Pune, India

Role Description

You will be joining the Anti-Financial Crime (AFC) Technology team and will work as part of a multi-skilled agile squad, specializing in understanding, enhancing and expanding the datasets required in Transaction Monitoring to identify Money Laundering or Terrorism Financing. You will have the opportunity to work on challenging problems, analyze large complex datasets and develop a deep understanding of the Transaction Monitoring functions and dataflows. As a key member of our team, you will play a crucial role in ensuring the integrity, accuracy, and completeness of the data required to run our transaction monitoring systems. Your expertise in data analysis, management, and technology will be instrumental in understanding and leveraging large datasets, ensuring compliance with regulatory requirements, and improving the quality of Transaction Monitoring alerts.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.

You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What We’ll Offer You

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for employees aged 35 and above

Your Key Responsibilities

As AVP, your role will include responsibilities such as:
Collaborate with stakeholders to gather, analyze, and document requirements, ensuring that the requirements are clear, comprehensive, and aligned with business objectives.
Work closely with developers and architects to design and implement solutions that meet business needs while ensuring that solutions are scalable, supportable and sustainable.
Think analytically, with a systematic and logical approach to solving complex problems and high attention to detail.
Create and maintain comprehensive documentation, including requirements, process flows, and user guides. Ensure that documentation is accurate, up-to-date, and accessible to relevant stakeholders.
Be the voice of the customer when interacting with the development teams to ensure delivery is aligned to business requirements and expectations.
Lead and collaborate across teams to run and deliver multiple projects simultaneously.
Employ data querying and analytical techniques to support the understanding of data and the creation of reports and actionable intelligence.

Your Skills And Experience

Advanced analytical and problem-solving experience, and the ability to independently identify issues, grasp new concepts, provide insights and solutions, and oversee their delivery.
Advanced knowledge of methods and tooling for business functional analysis.
Proficiency in data analysis tools and programming languages (e.g., Python, SQL, R), ideally in a Cloud or Big Data environment.
Understanding of the payments industry, payments systems, data and protocols, as well as SWIFT messaging.
Excellent communication skills, written and oral, and strong experience authoring documents that will support development work.
Able to demonstrate the ability to perform business analysis in consultation with the business and product management, and produce BRDs and technical specifications.
Hands-on project experience in handling complex business requirements (e.g., Data Mapping, Data Modelling, Data Migration, and System Integration) through to system-level functional specifications.
Strong planning skills and highly organized, with the ability to prioritize key deliverables across several projects/workstreams.
Self-motivated, with the flexibility to work autonomously coupled with the ability to work in virtual teams and matrix/global organizations.
Ability to influence and motivate other team members and stakeholders through strong dialogue, facilitation and persuasiveness.
Able to act as a point of escalation on business knowledge/value/requirements for team members.

How We’ll Support You

Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About Us And Our Teams

Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 week ago

Apply

30.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Beghou Consulting

For over 30 years, Beghou Consulting has been a trusted adviser to life science firms. We combine our strategic consulting services with proprietary technology to develop custom, data-driven solutions that allow life sciences companies to take their commercial operations to new heights. We are dedicated to client service and offer a full suite of consulting and technology services, all rooted in advanced analytics, to enhance commercial operations and boost sales performance.

Beghou Consulting Careers 2025 Details

Company Name: Beghou Consulting
Job Role: Software Developer
Job Type: Full Time
Job Location: Hyderabad
Education: BE/B.Tech
Career Level: 0 – 1 Years
Salary: Not Mentioned
Company Website: www.beghouconsulting.com

Job Description

As a Software Developer in the Delivery team, you will work on implementing and customizing Mainsail™ modules to meet client-specific business needs. You will contribute to UI configuration, functional testing, and client data integration tasks. You’ll collaborate with consultants and senior developers to translate requirements into reliable, scalable delivery components.
We’ll Trust You To

Develop and configure front-end components using HTML, CSS, and templating standards
Assist with integrating dynamic elements and adjusting UI behaviour using JavaScript
Write and execute SQL scripts for data preparation and validation tasks
Participate in internal requirement reviews and implementation walkthroughs
Troubleshoot issues during testing and support defect resolution
Maintain structured documentation for delivery tasks and technical references

You’ll Need To Have

Working knowledge of HTML and CSS (capable of independent modifications)
Basic JavaScript proficiency (able to follow and adjust existing logic)
SQL for querying and transforming data
Familiarity with .NET-based API structures and request/response patterns is a plus
Attention to detail and structured problem-solving
Collaboration with multi-functional delivery teams
Ability to work with technical documentation and templates

Application Process

Apply online through the application link. We wish you the best of luck in your Beghou Consulting Careers 2025 application. May your talents shine, and may you find the perfect opportunity that not only meets your professional goals but also brings joy to your everyday work.
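The "SQL scripts for data preparation and validation" duty above can be sketched in miniature. This is purely illustrative: the `accounts` table, its columns, and the email rule are invented for the example, and Python's built-in sqlite3 stands in for the client database.

```python
import sqlite3

def run_validation(conn):
    """Return rows that fail a basic data-quality rule.

    The `accounts` table and the email check are hypothetical examples
    of the kind of validation script a delivery task might require.
    """
    cur = conn.execute(
        """
        SELECT id, email
        FROM accounts
        WHERE email IS NULL OR email NOT LIKE '%@%'
        ORDER BY id
        """
    )
    return cur.fetchall()

# Throwaway in-memory database to demonstrate the check.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany(
    "INSERT INTO accounts (id, email) VALUES (?, ?)",
    [(1, "a@example.com"), (2, None), (3, "not-an-email")],
)
bad_rows = run_validation(conn)  # rows 2 and 3 fail the rule
```

In practice the offending rows would feed a defect report rather than a Python list, but the query shape is the same.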

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities

Design, optimize, secure, and administer databases
Develop and maintain data warehousing solutions
Implement ETL processes for data integration and provisioning
Manage data stores within Platform-as-a-Service (PaaS) and cloud solutions
Provide sizing and configuration assistance for data storage systems
Analyze business processes and identify opportunities for data optimization
Manage relationships with software and hardware vendors
Design schemas and write SQL scripts for data analytics and applications
Ensure data quality and governance standards are met
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications

Technical Skills

SQL: Proficiency in SQL for database management and querying
Python: Solid skills in Python for data manipulation and scripting
Big Data Technologies: Experience with Hadoop and Spark
ETL Tools: Proficiency in ETL tools like SSIS
Cloud Platforms: Knowledge of AWS, Azure, and Google Cloud
Data Warehousing: Experience with data warehousing solutions
Data Visualization: Skills in Power BI and Tableau
DevOps Tools: Familiarity with Jenkins, GitHub, and Azure DevOps
Unix/Linux: Proficiency in Unix/Linux operating systems
Snowflake: Experience with Snowflake for data warehousing

Preferred Qualifications

Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
Certifications in data engineering or related fields
Proven experience in data engineering roles with enterprise-scale impact
Experience managing data engineering projects end-to-end
Solid analytical and problem-solving skills
Excellent communication skills for technical and non-technical audiences
Knowledge of data governance and quality standards

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

#Gen
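The "ETL processes for data integration" responsibility above can be sketched end to end in a few lines. Everything here is hypothetical: the CSV feed, the `claims` table, and the column names are invented, and sqlite3 stands in for a real warehouse target.

```python
import csv
import io
import sqlite3

# Hypothetical inbound feed; a real pipeline would read from a file or API.
RAW_CSV = "member_id,claim_amount\n101,250.00\n102,\n103,75.50\n"

def extract(text):
    """Extract: parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts, convert types."""
    return [
        (int(r["member_id"]), float(r["claim_amount"]))
        for r in rows
        if r["claim_amount"]
    ]

def load(conn, records):
    """Load: insert the cleaned records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS claims (member_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO claims VALUES (?, ?)", records)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount) FROM claims").fetchone()[0]
```

The same extract/transform/load split is what tools like SSIS implement at scale; the sketch only shows the shape of the work.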

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: Staff Business Intelligence Analyst (Tableau, SQL)
Location: Bangalore or Mumbai

We are seeking a highly skilled Senior Business Intelligence (BI) Analyst to drive data-driven decision-making across our organization. This role requires a strategic thinker with deep expertise in BI tools, data visualization, and advanced analytics. The ideal candidate will work closely with stakeholders to transform raw data into actionable insights that enhance business performance and operational efficiency.

Key Responsibilities:

Develop, maintain, and optimize BI dashboards and reports to support business decision-making.
Extract, analyze, and interpret complex datasets from multiple sources to identify trends and opportunities.
Collaborate with cross-functional teams to define business intelligence requirements and deliver insightful solutions.
Present key findings and recommendations to senior leadership and stakeholders clearly and compellingly.
Ensure data accuracy, consistency, and governance by implementing best practices in data management.
Conduct advanced analytics to drive strategic initiatives.
Mentor and support junior BI analysts to enhance overall team capability.

Required Qualifications:

Bachelor’s degree in Business Analytics, Computer Science, or a related field.
7+ years of experience in business intelligence, data analysis, or a similar role.
Proficiency in BI tools (e.g., Tableau, Power BI) and SQL for querying and data manipulation.
Experience with Tableau is preferred, but candidates with expertise in other BI tools will also be considered.
An understanding of ETL processes, data warehousing, and database management.
Experience with Google BigQuery is a bonus.
Excellent problem-solving skills and ability to translate data into actionable business strategies.
Strong communication skills, with the ability to present insights to both technical and non-technical stakeholders.
Experience working with cloud platforms (e.g., AWS, Azure) is a plus.
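The "SQL for querying and data manipulation" requirement above amounts to aggregations like the one below: rolling a fact table up into the trend a dashboard would plot. The `sales` table and its figures are invented, and sqlite3 stands in for a production warehouse such as BigQuery.

```python
import sqlite3

# In-memory database stands in for a warehouse connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("2024-01", "North", 100.0),
        ("2024-01", "South", 80.0),
        ("2024-02", "North", 120.0),
        ("2024-02", "South", 90.0),
    ],
)

# Monthly revenue trend: the aggregate a BI dashboard would visualize.
trend = conn.execute(
    "SELECT month, SUM(revenue) FROM sales GROUP BY month ORDER BY month"
).fetchall()
```

A BI tool would point the dashboard's data source at a query of this shape rather than pulling rows into Python.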

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Country: India
Location: Building No 12D, Floor 5, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India
Role: Business Analyst
Location: Hyderabad, India
Full/Part-time: Full-time

Build a career with confidence

Carrier is a leading provider of heating, ventilating, air conditioning and refrigeration systems, building controls and automation, and fire and security systems leading to safer, smarter, sustainable, and high-performance buildings. Carrier is on a mission to make modern life possible by delivering groundbreaking systems and services that help homes, buildings and shipping become safer, smarter, and more sustainable. Our teams exceed the expectations of our customers by anticipating industry trends and working tirelessly to master and revolutionize them.

About The Role

Experienced General Finance Management professional who implements financial plans, analyzes financial processes and standards, and establishes financial indicators to forecast performance measures. Develops relationships with external financial consultants and advisors and provides technical advice to functional managers on financial matters.

Key Responsibilities

If you thrive in a fast-paced environment and are looking for an opportunity to develop your Analytics career in Shared Services, then we have a great opportunity for you. We are seeking a motivated Business Analyst to support the Global Business Services organization.

Specific Responsibilities For This Position Include

Manage end-to-end deployment of reporting structures, including data collection, transformation, visualization, and distribution, ensuring alignment with business needs.
Manage implementations of business intelligence dashboards using BI tools, ensuring that data is presented in a meaningful and visually appealing manner.
Collaborate with Global Process Owners from the Finance team to gather requirements, design KPI visualizations, and ensure data accuracy and quality.
Deploy integrated reporting solutions through MS tools such as Power Query and Power Automate workflows to streamline data collection, processing, and dissemination.
Collaborate with IT teams to establish new database connections, optimize SQL queries, and ensure smooth data integration from various sources.
Conduct thorough data analysis, including forecasts and projections, to identify trends, anomalies, and areas for process improvement.
Provide recommendations to team leaders based on data insights, enabling informed decision-making and driving operational efficiencies.
Support Continuous Improvement initiatives, including Kaizen events, by setting up performance measurement structures and tracking progress.
Stay updated with emerging trends in business intelligence, data visualization, and project management to continually enhance reporting and analytical capabilities.

Education / Certifications

Bachelor’s degree in finance or accounting required

Requirements

7+ years of experience in Finance processes, preferably in a Shared Service environment
Proven experience in an analytical position, proficiently using finance concepts to deliver business findings to stakeholders.
Proven track record of successfully managing projects related to KPI definition, measurement, and deployment.
Experience in designing and developing BI dashboards using tools like Power BI, Tableau, or similar platforms.
Strong background in data integration, database management, and SQL querying for efficient data retrieval and analysis.
Proficiency in process improvement methodologies, such as Lean or Six Sigma, and the ability to drive continuous improvement initiatives.
Proven analytical and quantitative skills, with the ability to use data and metrics to set up and identify data trends.

Benefits

We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme
Drive forward your career through professional development opportunities
Achieve your personal goals with our Employee Assistance Programme

Our commitment to you

Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now!

#cbsfinance

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice.

Posted 1 week ago

Apply

0 years

0 Lacs

Sonipat, Haryana, India

On-site


ABOUT US:

Newton School and Rishihood University have formed a powerful partnership to drive transformation in the world of technology and education. Newton School, dedicated to bridging the employability gap, has partnered with Rishihood University, India's first impact university. Together, we will be revolutionizing education, empowering students, and shaping the future of technology. With a team of experienced professionals and renowned investors, we are united in our mission to solve the employability challenge and make a lasting impact on society.

Job Summary

Are you passionate about computer science? Join us as a Subject Matter Expert in the Computer Science department at Sonipat, Delhi NCR. We are seeking an experienced professional to deliver high-quality lectures, design course content, mentor students, and take lab classes, ensuring their success in the tech field.

Key Responsibilities

Develop course materials and curriculum.
Collaborate with team members to improve the learning experience.
Provide guidance and support to students in understanding the subjects or the programming languages.
Take ownership of labs and guide students in creating and developing projects.

Qualifications

M.Tech in Computer Science or a related field.
Experience in teaching or mentoring students is preferred.
Excellent communication and presentation skills.
Experience with industry-standard tools and technologies.
Experience with DSA, MERN, or DBMS; any one or more is acceptable.

Requirements

Strong expertise in topics related to Database Management Systems - Relational Database Management Systems (RDBMS), Querying in SQL, Normalization, Indexing, Transactions, Query Optimization, Data Modeling, Database Design, ACID properties, NoSQL Databases.
Preferred frontend technologies: HTML, CSS, JavaScript, React.js.
Strong expertise in topics related to Advanced Data Structures and Algorithms - Arrays, Linked Lists, Stacks, Queues, Trees, Graphs, Sorting Algorithms, Searching Algorithms, Dynamic Programming, Algorithm Analysis, Recursion. Any of the above-mentioned technologies is accepted.

Key Responsibility Areas

Course Development and Planning
Owning and running labs
Cross-Functional Team Collaboration
Owning up to Academic Success
Tutoring and Student Support
Stakeholder Management
Willingness to work in Sonipat, Delhi NCR.

Perks And Benefits

Market-competitive salaries.
Research opportunities and industry collaborations.
Inculcate research and innovation in students, and help Rishihood University do cutting-edge work in the computer science department.
State-of-the-art facilities in labs and classrooms.
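To illustrate one of the curriculum topics listed above - Dynamic Programming combined with Recursion - here is a classic classroom example. The snippet is ours, not part of the posting's curriculum materials.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Fibonacci via memoized recursion: top-down dynamic programming.

    Without the cache this recursion is exponential in n; memoizing
    each subproblem makes it linear, the core DP idea.
    """
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

The same problem also makes a good algorithm-analysis exercise: students can compare the naive recursion, this memoized version, and a bottom-up loop.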

Posted 1 week ago

Apply

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At Lilly, we unite caring with discovery to make life better for people around the world. We are a global healthcare leader headquartered in Indianapolis, Indiana. Our employees around the world work to discover and bring life-changing medicines to those who need them, improve the understanding and management of disease, and give back to our communities through philanthropy and volunteerism. We give our best effort to our work, and we put people first. We’re looking for people who are determined to make life better for people around the world.

Position Description

The purpose of this role is to serve as a trusted partner with Business Unit Medical Affairs and internal Global Medical Affairs Organization (GMAO) teams to lead creation of high-quality data reporting and visualization support that can drive better customer experience and business impact. This role will champion our self-service reporting strategy and play a key role in helping us automate and create scalable frameworks for our reporting and analytics. We are looking for a hands-on person who can help expand our analytics and reporting capabilities and drive business-critical initiatives.

Key Objectives/Deliverables

Know the Lilly TA business and our internal business partners.
Build and exhibit deep expertise on available data sets, and support data-enabled decision-making by developing data lakes, insights, reporting and visualization.
Execute and monitor operations tasks to ensure timely availability of data in a reporting/dashboard structure to the business.
Perform thorough data validations to ensure data quality.
Respond to queries from internal stakeholders.
Consistently meet operations SLAs.
Perform incident resolution and root cause analysis to support data and reporting operations.
Consistently deliver high-quality, timely and insightful reports to enable stakeholders and senior leadership to take key decisions.
Develop and regularly publish different execution dashboards as per the business roadmap and requirements.
Provide descriptive analytics and visualization to give data-based insights on planning, execution and outcomes.
Demonstrate deep understanding of information and material flows, processes, procedures, systems, and methods.
Demonstrate understanding of internal business partners’ people, processes, and technology.
Partner and collaborate with other site-level teams to identify synergies and implement best practices.

Technical Skills

Expertise in writing and debugging efficient SQL queries.
Strong experience in data visualization tools - Power BI or Tableau (Power BI preferred); should be able to independently design and develop dashboards as per business requirements.
Advanced MS Office skills (MS Excel and MS PowerPoint).
Coding: SQL mandatory; one of R or Python would be good to have.

Analytical Skills

Experience in business analytics.
Data cleaning and preparation skills (database querying, descriptive statistics).
Problem-solving skills, lateral thinking ability and an eye for detail.

Educational Requirements

Bachelor’s or Master’s degree in sciences or a quantitative discipline, e.g.
Finance, Econometrics, Statistics, Engineering or Computer Sciences.

Additional Preferences

At least 7-9 years of evolving experience in data management, pharma market intelligence, performance reporting/visualization and/or descriptive analytics for leadership, with demonstrated results in understanding, structuring, and making sense of unfamiliar and messy datasets.
Experience with project management software (e.g., Wrike, JIRA, Adobe Workfront) and proficiency in a variety of PC applications and multifunctional diagramming tools, including Microsoft Project, Visio, Lucid Chart, etc.
Strong work ethic and personal motivation.
Interpersonal and communication skills, with the ability to work across time zones.
Strong stakeholder management skills.
Ability to operate effectively in an international matrix environment.
Strong team player who is dynamic and results-oriented.
Proven planning and organizational skills.
Proven ability to manage multiple projects at a time, with the flexibility to adjust quickly and effectively to frequent change and altered priorities.
Product launch experience.
Demonstrated enthusiasm and the ability to work under pressure to meet deadlines.

Lilly is dedicated to helping individuals with disabilities actively engage in the workforce, ensuring equal opportunities when vying for positions. If you require accommodation to submit a resume for a position at Lilly, please complete the accommodation request form (https://careers.lilly.com/us/en/workplace-accommodation) for further assistance. Please note this is for individuals to request an accommodation as part of the application process; any other correspondence will not receive a response. Lilly does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status.

#WeAreLilly
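The "descriptive statistics" skill listed under Analytical Skills can be sketched with Python's standard library alone. The alert-volume numbers below are invented for illustration; a real report would pull them from a SQL query.

```python
import statistics

def describe(values):
    """Return the basic descriptive statistics a data-quality or
    performance report typically leads with."""
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }

# Hypothetical monthly alert volumes pulled from a reporting query.
alert_counts = [120, 135, 128, 142, 150]
summary = describe(alert_counts)
```

A dashboard tool would compute the same aggregates internally; knowing how they are defined is what lets an analyst spot when a visual is misleading.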

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site


This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Description

NOC (Network Operation Center) is the central command and control center for 'Transportation Execution' across the Amazon Supply Chain network, supporting multiple geographies like NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FCs) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all the stakeholders informed on the proceedings. Along with this tactical problem solving, NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon's ability to serve its customers on time.

Purview Of a Trans Ops Representative

A Trans Ops Representative at NOC facilitates the flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at NOC works across two verticals - Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that freight is picked up on time and is delivered at the FC as per the given appointment. A Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle of pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as per promise.
A Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching customer premises. A Trans Ops Representative provides timely resolution to the issue at hand by researching and querying internal tools and by taking real-time decisions. An ideal candidate should be able to understand the requirements, analyze data and notice trends, and drive Customer Experience without compromising on time. The candidate should have a basic understanding of Logistics and should be able to communicate clearly in written and oral form.

Key job responsibilities

A Trans Ops Representative should be able to ideate process improvements and should have the zeal to drive them to conclusion.

Responsibilities Include, But Are Not Limited To

Communication with external customers (Carriers, Vendors/Suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers)
Must be able to systematically escalate problems or variance in the information and data to the relevant owners and teams, and follow through on the resolutions to ensure they are delivered.
Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and the variances to goals, and present these findings in a review forum.
Providing real-time customer experience by working in a 24*7 operating environment.

Basic Qualifications

1) Bachelor's degree
2) 12-24 months of work experience
3) Good communication skills - a Trans Ops Representative will be facilitating flow of information between external and internal customers
4) Proficiency in Excel (pivot tables, vlookups)
5) Demonstrated ability to work in a team in a very dynamic environment

Preferred Qualifications

Graduate with a Bachelor's degree
Good logical skills
Good communication skills - a Trans Ops Representative will be facilitating flow of information between different teams

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
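The Excel skills in item 4) - pivot tables and vlookups - map onto simple dictionary operations, which is one way candidates can reason about what those worksheet functions actually compute. The shipment records below are invented for illustration.

```python
def vlookup(key, table, default=None):
    """Dictionary-based equivalent of Excel's VLOOKUP (exact match)."""
    return table.get(key, default)

def pivot_count(rows, key_fn):
    """Count rows per key, like a one-field Excel pivot table."""
    counts = {}
    for row in rows:
        k = key_fn(row)
        counts[k] = counts.get(k, 0) + 1
    return counts

# Hypothetical shipment records: (carrier, status).
shipments = [
    ("FastFreight", "delayed"),
    ("FastFreight", "on_time"),
    ("RoadRunner", "on_time"),
]
by_carrier = pivot_count(shipments, lambda r: r[0])
fastfreight_total = vlookup("FastFreight", by_carrier)
```

In the actual role the equivalent work happens inside Excel; the point of the sketch is only the underlying group-and-count and key-lookup operations.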

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site


This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Description

Job Description for Transportation Representative - NOC

NOC Overview
NOC (Network Operation Center) is the central command and control center for transportation execution across the Amazon Supply Chain network, supporting multiple geographies such as NA, India and EU. It ensures hassle-free, timely pick-up and delivery of freight from vendors to Amazon fulfillment centers (FCs) and from Amazon FCs to carrier hubs. In case of any exceptions, NOC steps in to resolve the issue and keeps all stakeholders informed on the proceedings. Along with this tactical problem solving, NOC is also charged with understanding trends in network exceptions and then automating processes or proposing process changes to streamline operations. This second aspect involves network monitoring and significant analysis of network data. Overall, NOC plays a critical role in ensuring the smooth functioning of Amazon transportation and thereby has a direct impact on Amazon's ability to serve its customers on time.

Purview of a Trans Ops Representative
A Trans Ops Representative at NOC facilitates the flow of information between different stakeholders (Trans Carriers/Hubs/Warehouses) and resolves any potential issues that impact customer experience and business continuity. A Trans Ops Specialist at NOC works across two verticals - Inbound and Outbound operations. Inbound Operations deals with the Vendor/Carrier/FC relationship, ensuring that freight is picked up on time and delivered at the FC as per the given appointment; the Trans Ops Specialist on Inbound addresses any potential issues occurring during the lifecycle from pick-up to delivery. Outbound Operations deals with the FC/Carrier/Carrier Hub relationship, ensuring that the truck leaves the FC in order to deliver customer orders as promised; the Trans Ops Specialist on Outbound addresses any potential issues occurring during the lifecycle of freight leaving the FC and reaching the customer premises.

A Trans Ops Representative provides timely resolution to the issue at hand by researching and querying internal tools and by making real-time decisions. An ideal candidate should be able to understand requirements, analyze data and notice trends, and drive customer experience without compromising on time. The candidate should have a basic understanding of logistics and should be able to communicate clearly in written and oral form.

Key job responsibilities
A Trans Ops Representative should be able to ideate process improvements and should have the zeal to drive them to conclusion.

Responsibilities include, but are not limited to:
- Communication with external customers (Carriers, Vendors/Suppliers) and internal customers (Retail, Finance, Software Support, Fulfillment Centers)
- Systematically escalating problems or variances in information and data to the relevant owners and teams, and following through on resolutions to ensure they are delivered
- Excellent communication, both verbal and written, as one may be required to create a narrative outlining weekly findings and variances to goals, and present these findings in a review forum
- Providing real-time customer experience by working in a 24/7 operating environment

Basic Qualifications
- Bachelor's degree
- 12-24 months of work experience
- Good communication skills - a Trans Ops Representative will be facilitating the flow of information between external and internal teams
- Proficiency in Excel (pivot tables, vlookups)
- Demonstrated ability to work in a team in a very dynamic environment

Preferred Qualifications
- Graduate with Bachelor's degree
- Good logical skills
- Good communication skills - a Trans Ops Representative will be facilitating the flow of information between different teams

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote


For an international project in Chennai, we are urgently looking for a fully remote (Senior) Databricks Developer with 5+ years of experience. We are looking for a motivated contractor. Candidates need to be fluent in English.

Tasks and responsibilities:
- Collaborate with data architects and analysts to design robust data pipelines on the Databricks platform
- Develop scalable and efficient ETL processes to ingest, transform, and store large volumes of data
- Ensure data quality and integrity through the implementation of validation and cleansing processes
- Optimize data pipelines for performance, scalability, and cost-effectiveness
- Monitor and troubleshoot data pipeline issues to ensure seamless data flow and processing
- Implement best practices for data storage, retrieval, and processing to enhance system performance
- Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs
- Document data pipeline designs, processes, and configurations for future reference and knowledge sharing
- Provide technical guidance and support to team members and stakeholders on Databricks-related features

Profile:
- Bachelor's or Master's degree
- 5+ years of experience in Data Science roles
- Azure Databricks for developing, managing, and optimizing big data solutions on the Azure platform
- Programming skills in Python for writing data processing scripts and working with machine learning models
- Advanced SQL skills for querying and manipulating data within Databricks and integrating with other Azure services
- Azure Data Lake Storage (ADLS) for storing and accessing large volumes of structured and unstructured data and ensuring data reliability and consistency in Databricks
- Power BI integration for creating interactive data visualizations and dashboards
- PowerApps integration for building custom business applications that leverage big data insights
- Data engineering, including ETL processes and data pipeline development
- Azure DevOps for implementing CI/CD pipelines and managing code repositories
- Machine Learning concepts and tools within Databricks for developing predictive models
- Azure Synapse Analytics for integrating big data and data warehousing solutions
- Azure Functions for creating serverless computing solutions that integrate with Databricks
- Databricks REST API for automating tasks and integrating with other systems
- Azure Active Directory for managing user access and security within Azure Databricks
- Azure Blob Storage for storing and retrieving large amounts of unstructured data
- Azure Monitor for tracking and analyzing the performance of Databricks applications
- Familiarity with data governance practices for ensuring compliance and data quality in big data projects
- Fluent in English
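The ingest, transform, and store duties listed above follow a common ETL shape. The sketch below illustrates that shape using only the Python standard library on toy data; the table name, column names, and cleansing rule are invented for the example, standing in for what would be a PySpark job on Databricks in practice.

```python
import csv
import io
import sqlite3

# Hypothetical source feed; a real pipeline would read from ADLS or a vendor drop.
raw = io.StringIO("id,amount\n1,10.5\n2,\n3,7.25\n")

def extract(fh):
    """Ingest: parse the raw feed into dict rows."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Cleanse and cast: drop rows with a missing amount, type the rest."""
    return [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(rows, conn):
    """Store: write the cleaned rows to a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_amount (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_amount VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_amount").fetchone()
print(total)  # (2, 17.75) -- one malformed row was dropped by the cleansing rule
```

The same extract/transform/load separation carries over to Spark: each function becomes a DataFrame stage, and the cleansing rule becomes a filter plus cast.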

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Data Analyst - Product & Marketing
Remote (India based) | 📅 6-month Contract (with potential extension up to 12 months)

We are seeking a skilled and analytical Data Analyst to support our Product and Marketing teams with data-driven insights and performance analysis. This role is ideal for someone with a strong understanding of digital marketing metrics, user behaviour, and product analytics.

Key Responsibilities:
- Develop and maintain dashboards in Looker Studio and Google Analytics 4 (GA4)
- Use BigQuery to extract, transform, and load (ETL) data for analytical reporting
- Analyse attribution models and user journeys across various marketing channels
- Provide insights to optimise campaign performance and conversion rates
- Collaborate with product, design, and development teams to inform UX improvements and feature prioritisation
- Identify opportunities to improve user engagement and retention through data analysis
- Write and maintain SQL queries to support insights and dashboard creation
- Work closely with data engineers to ensure data integrity and pipeline efficiency

Requirements:
- Proven experience with Looker Studio, GA4, and BigQuery
- Strong understanding of marketing funnels, attribution models, and user behaviour analytics
- Ability to communicate insights effectively to non-technical stakeholders
- Proficiency in SQL for data querying and transformation
- Bonus: Experience with A/B testing or product analytics tools such as Mixpanel or Amplitude

This is a contract position for 6 months, with a possibility of extension up to 12 months, offering the opportunity to work remotely with a collaborative, insight-driven team. If you have a passion for data and its role in shaping both marketing strategies and product experiences, we'd like to hear from you. Hit apply!
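The funnel and conversion-rate analysis this role describes reduces to stage-over-stage ratios. A minimal illustrative sketch follows; the stage names and counts are invented, and a real analysis would pull these figures from GA4 event data in BigQuery.

```python
# Hypothetical funnel: (stage name, number of users reaching it).
funnel = [
    ("ad_click",       10_000),
    ("landing_page",    4_000),
    ("signup_started",  1_200),
    ("signup_done",       900),
]

def conversion_rates(stages):
    """Compute the stage-to-stage conversion rate for each adjacent pair."""
    rates = {}
    for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
        rates[f"{prev_name}->{name}"] = round(n / prev_n, 3)
    return rates

rates = conversion_rates(funnel)
print(rates)
# {'ad_click->landing_page': 0.4, 'landing_page->signup_started': 0.3,
#  'signup_started->signup_done': 0.75}
```

The biggest drop-off (here, ad click to landing page) is where optimisation effort would typically be focused.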

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


It's fun to work at a company where people truly believe in what they are doing!

Job Description:

Summary: This position involves developing backend applications using C# and SQL Server, as well as developing and migrating PowerApps model-driven and canvas-driven apps for high-volume, mission-critical environments. The role will primarily focus on hands-on coding with these technologies. The ideal candidate should demonstrate strong organizational skills, effectively manage priorities, and meet deadlines. Additionally, the role requires collaboration within a team and acting as a resource for other software engineers. The candidate will also serve as a liaison between the department manager, analysts, and the business team.

Roles & Responsibilities:
- Take ownership and assume end-to-end responsibility, including design and unit-testing code
- Able to specify, plan and deliver new or updated application functionality
- Seasoned in understanding applications, data interfaces and hosting environments
- As part of the Applications maintenance team, support the change management process on existing systems and implement new solutions in line with project implementations

Experience & Exposure Requirements:
- Strong experience in developing .NET Core Web API (MUST)
- Strong experience in Core Entity Framework (MUST)
- Strong experience in C# coding, class libraries and web services (OData) (MUST)
- Experience in implementing design patterns (MUST)
- Experience in implementing microservices with a messaging event queue (MUST)
- Knowledge of Azure App Services, Functions and Managed Identities (Good to Have)
- Experience in developing/migrating PowerApps model- and canvas-driven apps (Good to Have)
- Good implementation knowledge of JavaScript/jQuery (Good to Have)
- Automating business processes with Microsoft Power Automate (Good to Have)
- Experience with API or REST services integrations with PowerApps (Good to Have)
- Good knowledge of designing low-code portals for external users using Microsoft Power Apps (Good to Have)
- Experience with MS SQL Server (2012 and above) database development, querying, analysis, and writing and reviewing stored procedures, functions and views using Transact-SQL (intermediate level) (MUST)
- Ability to understand the database logical and physical schema and extend the existing schema; advanced SQL knowledge required to troubleshoot issues (MUST)
- Experience with code refactoring
- Should have worked in Agile Scrum teams and be aware of Agile Scrum delivery principles
- Should have excellent communication, presentation and interpretation skills
- Should be good at analysing issues/situations and reasoning solution proposals

Qualification Requirements: B-Tech/MCA

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

It is Epiq’s policy to comply with all applicable equal employment opportunity laws by making all employment decisions without unlawful regard or consideration of any individual’s race, religion, ethnicity, color, sex, sexual orientation, gender identity or expressions, transgender status, sexual and other reproductive health decisions, marital status, age, national origin, genetic information, ancestry, citizenship, physical or mental disability, veteran or family status or any other basis protected by applicable national, federal, state, provincial or local law. Epiq’s policy prohibits unlawful discrimination based on any of these impermissible bases, as well as any bases or grounds protected by applicable law in each jurisdiction. In addition, Epiq will take affirmative action for minorities, women, covered veterans and individuals with disabilities. If you need assistance or an accommodation during the application process because of a disability, it is available upon request. Epiq is pleased to provide such assistance, and no applicant will be penalized as a result of such a request. Pursuant to relevant law, where applicable, Epiq will consider for employment qualified applicants with arrest and conviction records.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


Data Analyst III

Who We Are
Brightly, a Siemens company, is the global leader in intelligent asset management solutions. Brightly enables organizations to transform the performance of their assets with a sophisticated cloud-based platform that leverages several years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly’s complete suite of intuitive software – including CMMS, EAM, Strategic Asset Management, Sustainability and Community Engagement. Paired with award-winning training, support and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable.

About The Job
The Business Intelligence (BI) Analyst and Report Development professional for Brightly is a lead specialist in our Analytics and BI Services team, responsible for building, testing, and maintaining software product embedded reports, charts and dashboards in Power BI and/or QLIK. This position will also partner with and guide other product report writers and end users in the development of their own reports. By providing best-in-class enterprise reporting, the report writer directly contributes towards Brightly’s objective to differentiate with data.

What You’ll Be Doing
- Address reporting needs of applications by modernizing and building new embedded reports using Power BI or, in some cases, QLIK Cloud
- Develop appropriate semantic models and business views, generate calculated fields based on application-specific business logic, and implement row-level security (RLS) in the application reports or dashboards
- Support the end-user community in the use of business intelligence tools and the creation of ad-hoc reports
- Produce ongoing technical documentation for Brightly BI Services sustainability and scale, including data sources, logic, processes, and limitations
- Work closely with multiple stakeholders such as Product Management, Analytics, Design, and Data Cloud teams
- Follow and influence reporting and data quality change control processes for proper configuration and application change management that will impact reports

What You Need
- A Bachelor's degree in Business, Programming, Business Intelligence, Computer Science or a related field
- Minimum 6 years of experience developing reports in Power BI (some may be using similar tools) and familiarity with embedding reports in applications
- Proficiency in SQL, with experience in querying and joining tabular data structures, database management, and creating new variables required for reports
- Expertise in building intuitive, interactive dashboards and pixel-perfect reports/Power BI paginated reporting
- Advanced level of knowledge in Power BI Desktop reporting (including all sub-components such as Power Query, semantic data modelling, DAX and visualizations)
- Strong experience and knowledge of Power BI Services (e.g., Gateway, B2B applications, workspaces)
- Willingness to learn general international data security issues and follow data governance
- Ability to communicate and collaborate in a remote team setting, including reading, writing, and speaking English
- Ability to manage multiple priorities and adjust quickly to changing requirements and priorities
- Performs other related duties as assigned

The Brightly Culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live.

Together We Are Brightly

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Key Responsibilities:
- Test Strategy & Planning: Develop and implement robust test strategies, detailed test plans, and comprehensive test cases for ETL processes, data migrations, data warehouse solutions, and data lake implementations.
- Ab Initio ETL Testing: Execute functional, integration, regression, and performance tests for ETL jobs developed using Ab Initio Graphical Development Environment (GDE), Co>Operating System, and plans deployed via Control Center. Validate data transformations, aggregations, and data quality rules implemented within Ab Initio graphs.
- Spark Data Pipeline Testing: Perform hands-on testing of data pipelines and transformations built using Apache Spark (PySpark/Scala Spark) for large-scale data processing in batch and potentially streaming modes. Verify data correctness, consistency, and performance of Spark jobs from source to target.
- Advanced Data Validation & Reconciliation: Perform extensive data validation and reconciliation activities between source, staging, and target systems using complex SQL queries. Conduct row counts, sum checks, data type validations, primary key/foreign key integrity checks, and business rule validations.
- Data Quality Assurance: Identify, analyze, document, and track data quality issues, anomalies, and discrepancies across the data landscape. Collaborate closely with ETL/Spark developers, data architects, and business analysts to understand data quality requirements, identify root causes, and ensure timely resolution of defects.
- Documentation & Reporting: Create and maintain detailed test documentation, including test cases, test results, defect reports, and data quality metrics dashboards. Provide clear and concise communication on test progress, defect status, and overall data quality posture to stakeholders.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 3+ years of dedicated experience in ETL/Data Warehouse testing.
- Strong hands-on experience testing ETL processes developed using Ab Initio (GDE, Co>Operating System).
- Hands-on experience in testing data pipelines built with Apache Spark (PySpark or Scala Spark).
- Advanced SQL skills for data querying, validation, complex joins, and comparison across heterogeneous databases (e.g., Oracle, DB2, SQL Server, Hive).
- Solid understanding of ETL methodologies, data warehousing concepts (Star Schema, Snowflake Schema), and data modeling principles.
- Experience with test management and defect tracking tools (e.g., JIRA, Azure DevOps, HP ALM).
- Excellent analytical, problem-solving, and communication skills, with a keen eye for detail.
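The row-count and sum-check reconciliation described under "Advanced Data Validation & Reconciliation" can be illustrated with a tiny in-memory example. The table names, columns, and data below are invented; a real test would run the same metric queries against the actual source and target databases.

```python
import sqlite3

# Set up matching hypothetical source and target tables in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO src_orders VALUES (1, 100.0), (2, 250.0), (3, 75.5);
INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.0), (3, 75.5);
""")

def reconcile(conn, src, tgt):
    """Run each reconciliation metric on both tables and compare the results."""
    checks = {}
    for metric in ("COUNT(*)", "SUM(amount)"):
        s = conn.execute(f"SELECT {metric} FROM {src}").fetchone()[0]
        t = conn.execute(f"SELECT {metric} FROM {tgt}").fetchone()[0]
        checks[metric] = (s, t, s == t)
    return checks

result = reconcile(conn, "src_orders", "tgt_orders")
print(result)
# {'COUNT(*)': (3, 3, True), 'SUM(amount)': (425.5, 425.5, True)}
```

In practice the metric list would grow to include null counts, min/max checks, and key-integrity queries, and any (source, target, False) entry would be raised as a defect.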

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Description

Amazon’s Transportation team is seeking a highly skilled and motivated person to help develop and implement a world-class security program for our transportation network, which will ensure that our customers receive the items they purchase on time and at the best possible cost. Amazon is one of the most recognizable brand names in the world, and we distribute millions of products each year to our loyal customers. The SLP Manager – TBA will be responsible for partnering with Sort Center & Delivery Center Operations teams spread across various cities within a region to execute company security policies and provide security services and asset (lives, inventory in transit and within sort centers, buildings, equipment, data, and intellectual property) protection within the assigned location and the surrounding geography. The Manager is a key member of the Transportation organization, working with the Regional team as well as cross-functional teams throughout the organization.

Responsibilities:
- Perform risk assessment of site and operation model and frame mitigation measures
- Possess a thorough understanding of central/state security issues and demonstrate excellence in the ability to implement and ensure site compliance with company security policies and any industry or merchant requirements
- Complete and/or coordinate the final Test and Acceptance of site security systems that leverage our access control system
- Establish and implement effective, predictable, measurable procedures/processes and prevention programs impacting losses, pilferage and accident trends, and conduct job hazard and job safety analyses
- Perform frequent site security audits to identify all non-compliant equipment and/or processes at the site; implement solutions to eliminate exposure to these risks and prevent injury
- Ensure guarding vendor(s) have a clear understanding of expectations and hold them accountable to deliver on them and meet or surpass service level agreement requirements; in addition, work with the guarding vendor’s management to ensure that they recruit, hire, and retain candidates who raise the performance bar of the security services organization
- Build and deploy a security training program
- Serve as the department’s liaison and security subject matter expert
- Effectively address safety and security incidents, including potential and actual workplace violence incidents, per policy, as well as conducting testing of the incident response plans
- Enhance, track, and report on metrics which are key performance indicators
- Coordinate with various support teams such as the Worldwide Operations Security Team, IT Security, and Network Engineering as needed
- Utilize Kaizen, Lean and Six Sigma methods to drive process improvements and increase efficiency

Basic Qualifications
- Minimum graduate with 5+ years of experience in the investigative or loss prevention field, preferably in a multinational environment, or minimum 10 years of armed forces or law enforcement service experience with at least 2 years of corporate Security/Loss Prevention experience
- Strong analytical and problem solving skills
- Advanced level of computer literacy, including proficiency in the MS Office package
- Strong communication skills and fluent knowledge of verbal and written English/vernacular language
- Demonstrated ability to deal with business tools and understand business metrics
- Demonstrated ability to perform in a pressure environment with adherence to timelines
- Critical thinking and attention to detail of a narrative
- Strong interpersonal skills and proven experience in managing stakeholders and vendors
- Strong business ethics, discretion

Preferred Qualifications
- Proven ability to work with and effectively persuade facility site leaders and other key departments within the organization
- Analytical leader experienced in performance-based, action- and results-oriented management; strong project manager and effective problem-solver
- Strong familiarity with databases (querying and analyzing) such as SQL, MySQL, Access, Exception Based Reporting, etc. is considered a plus
- Experience with delivery stations or cargo handling stations and transportation network security is preferred
- Must have strong oral and written communication skills in English
- Security certification such as CPP, PCI, CFE etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ASSPL - Maharashtra - C32
Job ID: A2967219

Posted 1 week ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
- Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
- Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues.
- Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills and attributes for success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
- Solid understanding of ETL/ELT design and implementation principles
- Strong SQL and PySpark skills for data transformation and validation
- Exposure to Python for automation and scripting
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
- Experience in working with Power BI or Tableau for data visualization and reporting support
- Strong problem-solving skills, attention to detail, and commitment to data quality
- Excellent communication and documentation skills to interface with technical and business teams
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
- 4–6 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools

Must haves
- Azure Databricks: Experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: Experience working with hierarchical data storage in Data Lake.
- Azure Synapse: Familiarity with distributed data querying and data warehousing.
- Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines.
- ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support
- Monitoring/logging using Azure Monitor or Log Analytics
- Azure DevOps and Git for CI/CD and version control
- Python and/or PySpark for scripting and data handling
- Informatica Cloud Data Integration (CDI) or similar ETL tools
- Shell scripting or command-line data handling
- SQL (across distributed and relational databases)

What We Look For
- Enthusiastic learners with a passion for data ops and practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
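The data profiling and validation step described in the DataOps role above (null counts, distinct counts per column, and similar checks) can be sketched with the standard library. In practice this would run as PySpark or SQL on the Azure stack; the records and column names below are illustrative only.

```python
# Hypothetical securities records, one dict per row; None marks a missing value.
records = [
    {"isin": "US0378331005", "price": 172.5},
    {"isin": "US0378331005", "price": None},
    {"isin": "GB0002634946", "price": 101.0},
]

def profile(rows):
    """Profile each column: count nulls and distinct non-null values."""
    report = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

report = profile(records)
print(report)
```

A pipeline would compare such a report against expected thresholds (for example, zero nulls in a key column) and fail the run or raise a data quality alert on violation.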

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Data Engineer- ETL Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained industrious advantage. Our Chief Data Office also known as our Innovation, Data Intelligence & Analytics team (IDA) is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of the data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner. What You’ll Be Doing What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns, architecture (granular level), partner with Architects to make sure optimal design of data layers. Apply best practices in Data architecture. For example, balance between materialization and virtualization, optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, performance tuning. Leading and hands-on execution of research into new technologies. Formulating frameworks for assessment of new technology vs business benefit, implications for data consumers. 
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in the data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution delivery model.
- Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables; use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure code is not vulnerable.

You will report to the Application Manager.

What You Will Bring

We’re looking for someone who has these abilities and skills:

Required Skills And Abilities

- Effective communication skills.
- Bachelor’s degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing, and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break down complex data requirements and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
- Experience using Harness.
- Technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities

- Experience in big data migration projects.
- Experience in performance tuning at both the database and big data platform levels.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Strong fundamentals in Parquet and Delta file formats.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software - Power BI is a plus.
- Familiarity with DBT is a plus.
- Passion for data and experience working within a data-driven organization.
- You care about what you do, and what we do.

Who We Are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We Offer

Inclusion

AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential.
It’s about helping one another — and our business — to move forward and succeed.

- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 Chapters around the globe
- Robust support for Flexible Working Arrangements
- Enhanced family-friendly leave benefits
- Named to the Diversity Best Practices Index
- Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards

AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability

At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars

Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.

Addressing climate change: The effects of a changing climate are far reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption.
We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.

Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.

AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
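The Delta-table ETL work this listing describes typically centers on incremental upserts (MERGE). As a hedged illustration of the semantics only — a real Databricks job would use PySpark's `DeltaTable.merge` API, and the table and key names below are hypothetical — the merge behaviour can be sketched in plain Python:

```python
# Plain-Python sketch of the Delta-style upsert (MERGE) semantics used by
# incremental ETL jobs. In Databricks this would be
# DeltaTable.merge(...).whenMatchedUpdateAll().whenNotMatchedInsertAll();
# here a dict keyed by the merge key stands in for the target table.

def upsert(target_rows, incoming_rows, key="policy_id"):
    """Merge an incoming batch into the target: update matches, insert the rest."""
    merged = {row[key]: row for row in target_rows}  # current table state, keyed
    for row in incoming_rows:
        # matched key -> update existing row; unmatched key -> insert new row
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

# Hypothetical insurance-flavoured data for illustration only.
target = [{"policy_id": 1, "premium": 100}, {"policy_id": 2, "premium": 250}]
batch = [{"policy_id": 2, "premium": 300}, {"policy_id": 3, "premium": 50}]
result = upsert(target, batch)
```

The same idempotent merge-on-key behaviour is what makes Delta pipelines safely re-runnable after a failed batch.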

Posted 1 week ago

Apply

30.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


For over 30 years, Beghou Consulting has been a trusted adviser to life science firms. We combine our strategic consulting services with proprietary technology to develop custom, data-driven solutions that allow life sciences companies to take their commercial operations to new heights. We are dedicated to client service and offer a full suite of consulting and technology services, all rooted in advanced analytics, to enhance commercial operations and boost sales performance.

Purpose of Job

As a Software Developer in the Delivery team, you will work on implementing and customizing Mainsail™ modules to meet client-specific business needs. You will contribute to UI configuration, functional testing, and client data integration tasks. You’ll collaborate with consultants and senior developers to translate requirements into reliable, scalable delivery components.

We'll trust you to:

- Develop and configure front-end components using HTML, CSS, and templating standards
- Assist with integrating dynamic elements and adjusting UI behaviour using JavaScript
- Write and execute SQL scripts for data preparation and validation tasks
- Participate in internal requirement reviews and implementation walkthroughs
- Troubleshoot issues during testing and support defect resolution
- Maintain structured documentation for delivery tasks and technical references

You'll need to have:

- Working knowledge of HTML and CSS (capable of independent modifications)
- Basic JavaScript proficiency (able to follow and adjust existing logic)
- SQL for querying and transforming data
- Familiarity with .NET-based API structures and request/response patterns is a plus
- Attention to detail and structured problem-solving
- Collaboration with multi-functional delivery teams
- Ability to work with technical documentation and templates

What you should know:

- We treat our employees with respect and appreciation, not only for what they do but who they are.
- We value the many talents and abilities of our employees and promote a supportive, collaborative, and dynamic work environment that encourages both professional and personal growth.
- You will have the opportunity to work with and learn from all levels in the organization, allowing everyone to work together to develop, achieve, and succeed with every project.
- We have had steady growth throughout our history because the people we hire are committed not only to delivering quality results for our clients but also to becoming leaders in sales and marketing analytics.
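The "SQL scripts for data preparation and validation" duties above can be sketched with a small, self-contained example. The table and column names are invented for illustration, and the standard-library sqlite3 module stands in for whatever client database the role actually uses:

```python
import sqlite3

# Hypothetical validation pass: load a staging table in-memory and count
# rows that fail basic completeness checks before delivery.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staging_accounts (account_id INTEGER, hcp_name TEXT, segment TEXT)"
)
conn.executemany(
    "INSERT INTO staging_accounts VALUES (?, ?, ?)",
    [(1, "Dr. Rao", "A"), (2, None, "B"), (3, "Dr. Mehta", None)],
)

# Validation query: rows with missing required fields should be flagged,
# not silently passed downstream.
bad = conn.execute(
    "SELECT COUNT(*) FROM staging_accounts "
    "WHERE hcp_name IS NULL OR segment IS NULL"
).fetchone()[0]
```

Here `bad` counts the two incomplete rows; a delivery script would typically fail or quarantine the batch when this count is non-zero.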

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Note: Only candidates with up to 30 days official notice period will be considered. If shortlisted, we will reach out via WhatsApp and email – please respond promptly.

Work Type: Full-time | On-site
Compensation (Yearly): INR(₹) 1,200,000 to 2,400,000
Working Hours: Standard Business Hours
Location: Bengaluru / Gurugram / Nagpur
Notice Period: Max 30 days

About The Client

A technology-driven product engineering company focused on embedded systems, connected devices, and Android platform development. Known for working with top-tier OEMs on innovative, mission-critical projects.

About The Role

We are hiring a skilled Data Engineer (FME) to develop, automate, and support data transformation pipelines that handle complex spatial and non-spatial datasets. This role requires hands-on expertise in FME workflows, spatial data validation, PostGIS, and Python scripting, with the ability to support dashboards and collaborate across tech and ops teams.

Must-Have Qualifications

- Bachelor’s degree in Engineering (B.E./B.Tech.)
- 4–8 years of experience in data integration or ETL development
- Proficient in building FME workflows for data transformation
- Strong skills in PostgreSQL/PostGIS and spatial data querying
- Ability to write validation and transformation logic in Python or SQL
- Experience handling formats like GML, Shapefile, GeoJSON, and GPKG
- Familiarity with coordinate systems and geometry validation (e.g., EPSG:27700)
- Working knowledge of cron jobs, logging, and scheduling automation

Preferred Tools & Technologies

- ETL/Integration: FME, Python, Talend (optional)
- Spatial DB: PostGIS, Oracle Spatial
- GIS Tools: QGIS, ArcGIS
- Scripting: Python, SQL
- Formats: CSV, JSON, GPKG, XML, Shapefiles
- Workflow Tools: Jira, Git, Confluence

Key Responsibilities

The role involves designing and automating ETL pipelines using FME, applying custom transformers, and scripting in Python for data validation and transformation.
It requires working with spatial data in PostGIS, fixing geometry issues, and ensuring alignment with required coordinate systems. The engineer will also support dashboard integrations by creating SQL views and tracking processing metadata. Additional responsibilities include implementing automation through FME Server, cron jobs, and CI/CD pipelines, as well as collaborating with analysts and operations teams to translate business rules, interpret validation reports, and ensure compliance with LA and HMLR specifications.
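To make the "geometry validation (e.g., EPSG:27700)" requirement concrete, here is a minimal stdlib sketch of two checks such a pipeline applies. A production workflow would use PostGIS `ST_IsValid`/`ST_MakeValid` or FME's GeometryValidator; the EPSG:27700 (British National Grid) bounds below are approximate and the function names are invented:

```python
# Illustrative geometry checks of the kind an FME/PostGIS pipeline applies.
# Real pipelines use PostGIS ST_IsValid/ST_MakeValid or FME transformers;
# the EPSG:27700 coordinate ranges here are approximate.

BNG_EASTING = (0, 700_000)     # approximate EPSG:27700 easting range, metres
BNG_NORTHING = (0, 1_300_000)  # approximate EPSG:27700 northing range, metres

def in_bng_bounds(x, y):
    """Coarse sanity check that a coordinate plausibly lies on the grid."""
    return (BNG_EASTING[0] <= x <= BNG_EASTING[1]
            and BNG_NORTHING[0] <= y <= BNG_NORTHING[1])

def ring_is_closed(ring):
    """A polygon ring must have >= 4 points and repeat its first as its last."""
    return len(ring) >= 4 and ring[0] == ring[-1]

# A closed triangular ring in (hypothetical) BNG coordinates.
ring = [(530000, 180000), (530100, 180000), (530100, 180100), (530000, 180000)]
```

Coordinates falling outside the grid bounds usually indicate a wrong or missing coordinate-system declaration on ingest, which is why this check sits before any reprojection step.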

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Summary:

We are seeking a highly skilled and experienced Senior Software .NET Engineer to join our innovative IOLX team. You will play a pivotal role in designing, developing, and scaling the next generation of our travel technology platform. Leveraging your deep expertise in C#, ASP.NET 6, Azure, and particularly Elastic Search, you will architect and build robust, high-performance web applications and backend services. The ideal candidate is a proactive, detail-oriented engineer with a strong architectural mindset, a passion for solving complex technical challenges, and a drive for delivering high-quality, scalable software solutions in the dynamic travel tech space.

Key Responsibilities:

• Architect & Design: Lead the design and architecture of complex, scalable, and resilient software systems using C#, ASP.NET 8 & 9, and Azure cloud services.
• Develop & Implement: Build, test, and deploy high-quality, maintainable, and efficient code for web applications, APIs, and backend services.
• Elastic Search Mastery: Design, implement, manage, and optimize sophisticated Elastic Search solutions, including index strategies, complex querying, performance tuning, and cluster management, ensuring optimal performance and relevance for our platform's search capabilities.
• Azure Cloud Expertise: Architect, implement, and manage secure, scalable, and reliable solutions utilizing a range of Azure services (e.g., Azure App Service, Azure SQL Database, Azure Storage, Key Vault, Service Bus, Azure Functions, AKS).
• API Development: Design, develop, and maintain robust RESTful APIs for internal services and external partner integrations.
• Technical Leadership: Contribute significantly to technical discussions, influence architectural decisions, and mentor junior engineers.
• Collaboration: Work closely with product managers, designers, and other engineers in an Agile environment to understand requirements, define technical solutions, and deliver features.
• Optimization & Scalability: Proactively identify and address performance bottlenecks, scalability issues, and potential areas for architectural improvement.
• Best Practices: Champion and adhere to software development best practices, including clean code principles, SOLID design, TDD/BDD, CI/CD, and secure coding standards.
• Stay Current: Keep abreast of the latest industry trends, technologies, and best practices related to .NET, Azure, Elastic Search, and software architecture.

Required Qualifications & Skills:

• Bachelor's degree in Computer Science, Software Engineering, or a related technical field (or equivalent practical experience).
• Extensive Experience (5+ years): Proven track record designing and developing complex applications using C# and the .NET framework, with strong, recent experience in ASP.NET 8 & 9 / .NET 6+.
• Deep Elastic Search Expertise (Minimum 5 years): Demonstrable, in-depth, hands-on experience architecting, implementing, managing, and optimizing Elastic Search clusters in production environments. Expertise must include index design, query optimization (DSL), performance tuning, monitoring, and integration strategies.
• Strong Architectural Skills: Proven ability to design scalable, resilient, maintainable, and high-performance distributed systems and microservices architectures.
• Azure Cloud Proficiency: Significant hands-on experience designing, deploying, and managing applications on Microsoft Azure, utilizing services like App Services, Azure SQL, Storage, Key Vault, Service Bus, and Azure DevOps.
• Database Knowledge: Solid experience with relational databases (PostgreSQL strongly preferred) and familiarity with NoSQL concepts.
• API & Integration Expertise: Strong understanding and practical experience with RESTful API design, development, and integration with third-party services.
• Modern Development Practices: Experience with version control systems (Git), CI/CD pipelines (Azure DevOps preferred), and Agile methodologies.
• Problem-Solving: Excellent analytical and critical thinking skills with a proven ability to tackle complex technical challenges.
• Communication & Teamwork: Strong verbal and written communication skills, with the ability to collaborate effectively within a cross-functional team.
• Adaptability: Ability to thrive in a fast-paced, collaborative, and evolving environment.

Preferred Qualifications & Skills:

• Experience specifically within the travel, hospitality, or fintech industries.
• Experience with containerization technologies (Docker, Kubernetes/AKS).
• Familiarity with event-driven architectures and messaging systems (e.g., Kafka, RabbitMQ, Azure Event Grid/Hub).
• Experience with front-end technologies (e.g., React, Angular, Vue.js) is a plus, but not required.
• Knowledge of other search technologies or data stores.
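The "complex querying (DSL)" expertise this posting asks for refers to Elasticsearch's JSON query language. As a hedged sketch, here is the kind of bool query body a travel-search service might build — the index fields (`description`, `city.keyword`, `price`) and the helper are hypothetical; the resulting dict is what a client such as elasticsearch-py would send to the `_search` endpoint:

```python
# Sketch of an Elasticsearch bool query combining a scored full-text clause
# with non-scoring exact-match filters. Field names are illustrative.

def build_hotel_search(city, text, max_price):
    return {
        "query": {
            "bool": {
                # "must" clauses contribute to relevance scoring
                "must": [{"match": {"description": text}}],
                # "filter" clauses are cached and do not affect the score
                "filter": [
                    {"term": {"city.keyword": city}},
                    {"range": {"price": {"lte": max_price}}},
                ],
            }
        },
        "size": 20,
    }

body = build_hotel_search("Lisbon", "sea view balcony", 180)
```

Keeping exact-match conditions in `filter` rather than `must` is a standard tuning choice: filters are cacheable and skip scoring, which matters at the cluster sizes this role describes.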

Posted 1 week ago

Apply


10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Detailed Job Description for Solution Architect (PAN India)

Architectural Assessment & Roadmapping

- Conduct a comprehensive assessment of the current R&D Data Lake architecture.
- Propose and design the architecture for the next-generation self-service R&D Data Lake based on defined product specifications.
- Contribute to defining a detailed architectural roadmap that incorporates the latest enterprise patterns and strategic recommendations for the engineering team.

Data Ingestion & Processing Enhancements

- Design and prototype updated data ingestion mechanisms that meet GxP validation requirements and improve data flow efficiency.
- Architect advanced data and metadata processing techniques to enhance data quality and accessibility.

Storage Patterns Optimization

- Evaluate optimized storage patterns to ensure scalability, performance, and cost-effectiveness.
- Design updated storage solutions aligned with technical roadmap objectives and compliance standards.

Data Handling & Governance

- Define and document standardized data handling procedures that adhere to GxP and data governance policies.
- Collaborate with governance teams to ensure procedures align with regulatory standards and best practices.
- Assess current security measures and implement robust access controls to protect sensitive R&D data.
- Ensure that all security enhancements adhere to enterprise security frameworks and regulatory requirements.
- Design and implement comprehensive data cataloguing procedures to improve data discoverability and usability.
- Integrate cataloguing processes with existing data governance frameworks to maintain continuity and compliance.
- Recommend and oversee the implementation of new tools and technologies related to ingestion, storage, processing, handling, security, and cataloguing.
- Design and plan to ensure seamless integration and minimal disruption during technology updates.
- Collaborate on the ongoing maintenance of, and provide technical support for, legacy data ingestion pipelines throughout the uplift project.
- Ensure legacy systems remain stable, reliable, and efficient during the transition period.
- Work closely with the R&D IT team, data governance groups, and other stakeholders for coordinated and effective implementation of architectural updates.
- Collaborate in knowledge transfer sessions to equip internal teams to manage and maintain the new architecture post-project.

Required Skills

- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent hands-on experience.
- Minimum 10 years of experience in solution architecture, with a strong background in data architecture and enterprise data management.
- Strong understanding of cloud-native platforms, with a preference for AWS.
- Knowledgeable in distributed data architectures, including services like S3, Glue, and Lake Formation.
- Proven experience in programming languages and tools relevant to data engineering (e.g., Python, Scala).
- Experienced with Big Data technologies such as Hadoop, Cassandra, Spark, Hive, and Kafka.
- Skilled in using querying tools such as Redshift, Spark SQL, Hive, and Presto.
- Demonstrated experience in data modeling, data pipeline development, and data warehousing.

Infrastructure And Deployment

- Familiar with Infrastructure-as-Code tools, including Terraform and CloudFormation.
- Experienced in building systems around the CI/CD concept.
- Hands-on experience with AWS services and other cloud platforms.
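The storage-pattern work this role describes (S3, Glue, Lake Formation) commonly relies on Hive-style partition layouts, where partition keys are encoded in the object path so query engines can prune whole prefixes. A small sketch, with a hypothetical bucket and table name:

```python
from datetime import date

# Hive-style partition layout (key=value path segments) that AWS Glue,
# Athena, and Spark all recognize for partition pruning. Bucket and table
# names are hypothetical.

def partition_prefix(bucket: str, table: str, d: date) -> str:
    """Build the S3 prefix for one day's partition of a date-partitioned table."""
    return f"s3://{bucket}/{table}/year={d.year}/month={d.month:02d}/day={d.day:02d}/"

prefix = partition_prefix("rnd-data-lake", "assay_results", date(2024, 6, 3))
```

A query filtered on `year`, `month`, and `day` then only lists objects under the matching prefix, which is the main lever the posting's "partitioning strategies" and cost-effectiveness goals refer to.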

Posted 1 week ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Deloitte

Position Summary

Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as “Deloitte Global”) does not provide services to clients. In the United States, Deloitte refers to one or more of the US member firms of DTTL, their related entities that operate using the “Deloitte” name in the United States and their respective affiliates. Certain services may not be available to attest clients under the rules and regulations of public accounting. Please see www.deloitte.com/about to learn more about our global network of member firms. Copyright © 2017 Deloitte Development LLC. All rights reserved.

Assistant Manager, Data-Marts & Analytics - Reporting and Analytics – Digital Data Analytics Innovation - Deloitte Support Services India Private Limited

Are you a quick learner with a willingness to work with new technologies? The Data-Marts and Reporting team offers you a particular opportunity to be an integral part of the Datamarts & Reporting – CoRe Digital | Data | Analytics | Innovation Group. The principal focus of this group is the research, development, maintenance, and documentation of customized solutions that e-enable the delivery of cutting-edge technology to the firm's business centers.

Work you will do

As an Assistant Manager, you will be involved in leading people, project management, customer interactions, and researching and developing solutions built on varied technologies like MS SQL Server, MS Azure SQL, MSBI Suite, Azure Data Factory, Azure Cloud Services, Python, Tableau, and .NET. You will support a team which provides high-quality solutions to the customers by following a streamlined system development methodology.
In the process of acquainting yourself with various development tools, testing tools, methodologies, and processes, you will be aligned to the following role:

Role: Datamart Solution – Assistant Manager

As a Datamart Solution Assistant Manager, you will be responsible for leading people, project management, customer interactions, and researching and developing solutions built on varied technologies like MS SQL Server, MS Azure SQL, MSBI Suite, Azure Data Factory, Azure Cloud Services, Python, Tableau, and .NET.

Your key responsibilities include:

- Interact with end users to gather, document, and interpret requirements.
- Lead multiple projects from a project management and team management perspective.
- Coordinate with USI/US Managers on project prioritization, team leverage and utilization, and performance measurement.
- Work closely with project teams to ensure smooth and timely delivery, including communication and escalations as required.
- Bring an innovative mindset to exploring new tools, technologies, and capabilities.
- Do hands-on coding and troubleshooting for complex projects.
- Handle people management for the team, including mentoring and counseling, and presenting counselees in performance reviews.
- Be proficient in project management, work planning, and allocation of resources.
- Review designs for project delivery and facilitate client interaction and effort estimation.
- Maintain a knowledge base of the functional capabilities and act as a Subject Matter Expert for the projects assigned.
- Nurture, develop, and retain talent by coaching, counseling, and mentoring team members.
- Develop SQL objects and scripts based on design.
- Analyze, debug, and optimize existing stored procedures and views.
- Leverage indexes, performance tuning techniques, and error handling to improve the performance of SQL scripts.
- Create, schedule, and monitor SQL jobs.
- Build pipelines proficiently in SSIS and ADF.
- Have relevant experience working with different sources and destinations.
- Have knowledge of sourcing data through API calls.
- Have good knowledge of building applications and automating processes using Python or .NET.
- Be capable of building solutions by leveraging different Azure services in a cost-effective and efficient way.
- Proactively prioritize activities, handle tasks, and deliver quality solutions on time.
- Communicate clearly and regularly with team leadership and project teams.
- Assist the Manager in developing and improving the capabilities of the team.
- Provide appropriate feedback to Managers/leaders on team members whenever required.
- Participate in solution design sessions with clients/stakeholders, provide them with the best solution, and deliver tasks.
- Effectively track the time and bandwidth of team members, creating a proper pipeline for them.
- Take an active part in firm-wide initiatives and encourage team members to do so.
- Manage ongoing deliverable timelines and own relationships with end clients to understand whether deliverables continue to meet the client’s needs.
- Work collaboratively with other team members and end clients throughout the development life cycle.
- Research, learn, implement, and share skills on new technologies.
- Understand the customer requirements well and provide status updates to the project lead (US/USI) on calls and emails efficiently.
- Proactively prioritize activities, handle tasks, and deliver quality solutions on time.
- Continuously improve skills in this space by completing certifications and recommended training.
- Good understanding of MVC .NET, SharePoint, Azure, and MSBI.

The team

CoRe - Digital Data Analytics Innovation (DDAI) sits inside Deloitte’s global shared services organization and serves Deloitte’s 10 largest member firms worldwide. The Reporting and Analytics team creates user experiences that deliver relevant content and connections, enables the discovery of impactful knowledge, and delights users by delivering innovative applications that use emerging technologies.

Qualifications and experience

Required:

- Educational Qualification: B.E/B.Tech or M.Tech (60% or 6.5 GPA and above)
- Proficient understanding of one or more of the following technologies: extensive knowledge of DBMS concepts and exposure to querying on any relational database, preferably MS SQL Server, MS Azure SQL, SSIS, and Tableau.
- Good experience in automating and generating insights using Python.
- Good knowledge of Azure services.
- Knowledge of any coding language like C#.NET or VB.NET would be an added advantage.
- Good experience in leading people.
- Good experience in project management.
- Understands development methodology and lifecycle.
- Excellent analytical skills and communication skills (written, verbal, and presentation).
- Ability to chart one’s own career and build networks within the organization.
- Ability to work both independently and as part of a team with professionals at all levels.
- Ability to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate.
- Seek information and ideas, and establish relationships with customers, to assess any future opportunities.

Total Experience: 6 to 9 years

Skill set:

Required: SQL Server, MS Azure SQL, SSIS, Azure Data Factory, Python, Data warehouse and BI
Preferred: Tableau, Azure Services
Good to have: MVC.Net (full stack suite)

Location: Hyderabad

Work hours: 2 p.m. – 11 p.m.

How you will grow

At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning.
Deloitte University (DU): The Leadership Center in India
Our state-of-the-art, world-class learning center in the Hyderabad office is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

Disclaimer: Please note that this description is subject to change based on business/project requirements and at the discretion of management.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 304415

Posted 1 week ago

Apply

2.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Job Description & Summary: A career within PWC Responsibilities Snowflake,SQL, Python must have a strong background in data warehousing concepts, ETL development, data modeling, metadata management, and data quality. Implementing ETL pipelines within a data lake using Python and Snowflakes Snow SQL, Snow Pipe . Experience in writing and debugging complex SQL stored procedures . Experience in administering Snowflake, Creation of user accounts and security groups, data loading from datalake/S3 . Querying Snowflake using SQL Development of scripts using Unix, Python, etc. for loading, extracting, and transforming data. Assist with production issues in Enterprise Data Lake like reloading data, transformations, and translations Develop a Database Design and Reporting Design based on Business Intelligence and Reporting requirements Mandatory Skill Sets Snowflake, SQL, Python Preferred Skill Sets Snowflake, SQL, Python Years Of Experience Required 2-4 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Snowflake Schema Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 7 more} Desired Languages (If blank, desired languages not specified) Travel 
Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date Show more Show less

Posted 1 week ago

Apply

3.0 years

5 - 9 Lacs

Hyderābād

On-site


Job Summary
We are seeking a highly skilled Data Scientist with business analysis experience and strong hands-on skills in Python, SQL, EDA, and data visualization, with 3 to 12+ years of experience.

Key Responsibilities:
  • Perform exploratory data analysis (EDA) to uncover trends, patterns, and insights.
  • Develop and maintain dashboards and reports to visualize key business metrics.
  • Collaborate with cross-functional teams to gather requirements and deliver data-driven solutions.
  • Build and evaluate statistical and machine learning models to support predictive analytics.
  • Translate business problems into analytical frameworks and communicate findings effectively.

Must-Have Skills:
  • Proficiency in Python for data manipulation and analysis.
  • Strong knowledge of SQL for querying and managing data.
  • Experience in data analysis and business analytics.
  • Familiarity with machine learning and statistical modelling.
  • Ability to interpret and communicate complex data insights to non-technical stakeholders.

Good-to-Have Skills:
  • Experience with data visualization tools.
  • Exposure to Generative AI frameworks and applications.
  • Understanding of cloud platforms (AWS, GCP, or Azure) is a plus.

Posted 1 week ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques
  • Data warehousing concepts
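One of the skills listed above, SQL optimization, often comes down to indexing. As a minimal sketch (using Python's built-in sqlite3 module; the table and index names here are made up for illustration), you can compare query plans before and after adding an index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

# Without an index, filtering on customer_id forces a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()
print(plan)  # the detail column mentions a SCAN of the table

# Adding an index lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()
print(plan)  # the detail column now mentions SEARCH ... USING INDEX idx_orders_customer
```

The same idea carries over to production databases such as MS SQL Server or Snowflake, though the plan-inspection syntax differs.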

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table. (advanced)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
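A couple of the questions above can be sketched with runnable SQL. The snippet below (using Python's built-in sqlite3 module; the employees table and its data are hypothetical) shows one common answer to the second-highest-salary question and illustrates the WHERE-versus-HAVING distinction:

```python
import sqlite3

# In-memory database with a hypothetical employees table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Asha", "Eng", 90000), ("Ravi", "Eng", 120000),
     ("Meera", "Sales", 70000), ("John", "Sales", 80000)],
)

# Second-highest salary: a subquery excludes the maximum,
# then MAX over the remaining rows gives the runner-up.
second_highest = conn.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()[0]
print(second_highest)  # 90000

# HAVING vs WHERE: WHERE filters individual rows before grouping,
# HAVING filters the aggregated groups afterwards.
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM employees "
    "WHERE salary > 60000 "
    "GROUP BY dept "
    "HAVING AVG(salary) > 100000"
).fetchall()
print(rows)  # [('Eng', 105000.0)]
```

Being able to walk through small examples like this, rather than reciting definitions, tends to make a stronger impression in interviews.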

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies