
8411 Query Jobs - Page 45

Note: JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Position Details
Total Years of Experience: 2-9 years
Primary Technologies: SQL, Power BI, Excel, Python; Additional: Azure Synapse, Databricks, Spark, warehouse architecture & development

The Business Intelligence (BI) Engineer assists the Human Resources team in the continuous management of all relevant analytics. The position collects and analyzes data to measure the impact of initiatives and support strategic business decision-making, and works with developers to provide the business at all levels with relevant, intuitive, insight-driven information that is directly actionable. The BI Engineer becomes closely integrated with the business, builds strong relationships with business leaders, works with multinational teams in an Agile framework, and designs and implements actionable reports and dashboards, assisting in designing the broader information landscape available to the business.

Primary Job Functions
- Collaborate directly with business teams to understand performance drivers and trends in their area, provide insights, make recommendations, and interpret new data and results.
- Design reports and dashboards for consumption by the business; oversee their development for production.
- Perform pro forma modeling and ad hoc analyses.
- Keep up to date on best visualization practices and dashboard designs; maintain standardized templates for reports and dashboards; ensure standardization and consistency of reporting.
- Perform deep-dive analyses into specific issues as needed.
- Define data needs and sources; evaluate data quality and work with the data services team to extract, transform, and load data for analytic discovery projects.
- Ensure BI tools are fully leveraged to provide the insights needed to drive performance.
- Interface closely with technology partners to manage the analytical environment and acquire data sets.
- Utilize statistical and data visualization packages to develop innovative approaches to complex business problems.
- Analyze and communicate the effectiveness of new initiatives; draw insights and make performance-improvement recommendations based on the data sources.
- Use quantitative and qualitative methodologies to draw insights and support the continuous improvement of the business.
- Analyze initiatives and events using transaction-level data.
- Ensure that appropriate data-driven reports and customer-behavior insights flow continuously to management to help improve quality, reduce cost, enhance the guest experience, and deliver continued growth.

Required Qualifications
- Proficient in Microsoft Azure services and/or other cloud computing environments.
- Experience with Database Management Systems (DBMS), specifically SQL and NoSQL.
- Knowledge of an enterprise data visualization platform, such as Power BI or BigQuery.
- Advanced analytical and problem-solving skills; strong applied algebraic skills.
- Working knowledge of business statistics and econometrics.
- Project management skills.
- Ability to digest business problems and translate needs into a data-centric context.
- Ability to synthesize and analyze large sets of data to yield actionable findings.
- Strong attention to detail; excellent verbal and written communication skills.
- Ability to handle multiple projects simultaneously within established time constraints and to perform under strong demands in a fast-paced environment.
- Work professionally with customers and co-workers to efficiently serve our customers, treating both with enthusiasm and respect.

If you feel you have the necessary skill sets and are passionate about the job, please send your profile to vthulasiram@ashleyfurnitureindia.com

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Title: Data Engineer
Location: Baner, Pune (Hybrid)
Duration: 6 to 12 months contract

Responsibilities:
- Design, develop, and execute robust, scalable data pipelines to extract, transform, and load data from on-premises SQL Server databases to GCP Cloud SQL for PostgreSQL.
- Analyze existing SQL Server schemas, data types, and stored procedures, and plan their conversion and optimization for the PostgreSQL environment.
- Implement and support data migration strategies from on-premises or legacy systems to cloud environments, primarily GCP.
- Implement rigorous data validation and quality checks before, during, and after migration to ensure data integrity and consistency.
- Collaborate closely with database administrators, application developers, and business analysts to understand source data structures and target requirements.
- Develop and maintain scripts (primarily Python or Java) for automating migration tasks, data validation, and post-migration data reconciliation (see the sketch after this listing).
- Identify and resolve data discrepancies, performance bottlenecks, and technical challenges encountered during the migration process.
- Document migration strategies, data mapping, transformation rules, and post-migration validation procedures.
- Support cutover activities and ensure minimal downtime during the transition phase.
- Apply data governance, security, and privacy standards across data assets in the cloud.
- Refactor SQL Server stored procedures and business logic for implementation in PostgreSQL or the application layer where applicable.
- Leverage schema conversion tools (e.g., pgLoader, custom scripts) to automate and validate schema translation from SQL Server to PostgreSQL.
- Develop automated data validation and reconciliation scripts to ensure row-level parity and business-logic integrity post-migration.
- Implement robust monitoring, logging, and alerting mechanisms to ensure pipeline reliability and quick failure resolution using GCP-native tools (e.g., Cloud Monitoring, formerly Stackdriver).

Must-Have Skills:
- Expert-level SQL proficiency across T-SQL (SQL Server) and PostgreSQL, with strong hands-on experience in data transformation, query optimization, and relational database design.
- Solid understanding of and hands-on experience with relational databases.
- Strong experience in data engineering, with hands-on cloud work, preferably on GCP.
- Experience with data migration techniques and strategies between different relational database platforms.
- Hands-on experience with cloud data and monitoring services (relational database, data pipeline, logging, and monitoring services) on one of GCP, AWS, or Azure.
- Experience with Python or Java for building and managing data pipelines, with proficiency in data manipulation, scripting, and automation of data processes.
- Familiarity with ETL/ELT processes and orchestration tools like Cloud Composer (Airflow).
- Understanding of data modeling and schema design.
- Strong analytical and problem-solving skills, with a keen eye for data quality and integrity.
- Experience with version control systems like Git.

Good-to-Have Skills:
- Exposure to database migration tools or services (e.g., AWS DMS, GCP Database Migration Service, or similar).
- Experience with real-time data processing using Pub/Sub.
- Experience with shell scripting.
- Exposure to CI/CD pipelines for deploying and maintaining data workflows.
- Familiarity with NoSQL databases and other GCP data services (e.g., Firestore, Bigtable).
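The reconciliation duties above lend themselves to a short illustration. Below is a minimal, hedged sketch of a post-migration row-count parity check between SQL Server and Cloud SQL PostgreSQL in Python; the pyodbc/psycopg2 drivers, connection strings, and table names are all assumptions, not part of the posting.

```python
# Hypothetical row-count reconciliation between source SQL Server and target
# Cloud SQL PostgreSQL; connection details and table names are placeholders.
import pyodbc      # SQL Server driver (assumed available)
import psycopg2    # PostgreSQL driver (assumed available)

TABLES = ["customers", "orders", "order_items"]  # hypothetical tables

def count_rows(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

src_conn = pyodbc.connect("DSN=LegacySqlServer")            # assumed DSN
dst_conn = psycopg2.connect("dbname=target host=10.0.0.5")  # assumed host

src, dst = src_conn.cursor(), dst_conn.cursor()
for table in TABLES:
    src_n, dst_n = count_rows(src, table), count_rows(dst, table)
    status = "OK" if src_n == dst_n else "MISMATCH"
    print(f"{table}: source={src_n} target={dst_n} [{status}]")

src_conn.close()
dst_conn.close()
```

A real reconciliation would go beyond counts (checksums per column, sampled row diffs), but the batch-compare loop above is the core shape of such scripts.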

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Report Specialist

About Randstad: Randstad is the world's largest talent company and a partner of choice to clients. We are committed to providing equitable opportunities to people from all backgrounds and helping them remain relevant in the rapidly changing world of work. We have a deep understanding of the labor market and help our clients create the high-quality, diverse, and agile workforces they need to succeed. Our 46,000 employees around the world make a positive impact on society by helping people realize their true potential throughout their working life.

About Randstad Global Capability Center: Randstad Global Capability Center, located in Hyderabad, India, is responsible for strategic delivery for Randstad markets and businesses globally. Through our centers of excellence in talent services, human resources, financial knowledge services, IT, and marketing, the Global Capability Center is a high-growth hub.

Randstad is seeking a highly analytical and detail-oriented Reporting Specialist to join our dynamic team. If you have a passion for data, a knack for uncovering insights, and a desire to make a real impact on a global scale, this role is for you!

Responsibilities
- Create market reports: develop comprehensive market reports based on given templates for the programs Randstad runs in different countries, reporting performance, KPI achievement, and trends.
- Conduct global team and leadership reports: collect data on the performance and engagement of Randstad's global programs to feed the global team with relevant data, identifying key trends, areas for improvement, and opportunities for growth; gather the data to support the preparation of insightful reports for Randstad's leadership team.
- Data visualization and storytelling: present data in a clear and compelling way through presentations.
- Data collection and analysis: gather and analyze data from sources such as Google Analytics, HubSpot, BigQuery, and other relevant platforms (see the sketch after this listing).
- Collaboration and communication: collaborate effectively with teams across Randstad's global organization.

Qualifications
- Bachelor's degree in Business Administration, Economics, Statistics, or a related field.
- Proven experience as a reporting specialist, data analyst, or in a similar role; minimum 3 years of proven experience.
- Strong analytical and critical thinking skills; meticulous attention to detail and accuracy.
- Advanced proficiency in data analysis tools and techniques; experience with Google Analytics, HubSpot, BigQuery, and other data extraction tools.
- Excellent communication and presentation skills; ability to translate complex data into actionable insights.
- Ability to work independently and as part of a team; preference for experience working in a multinational environment.

Benefits
- Competitive salary and benefits package.
- Be part of a multicultural and collaborative team, with a healthy work-life balance.
- Chance to make a real impact at a global leader in the HR services industry, with a company committed to making a positive impact on society.

Shift Hours: 11 AM to 8 PM IST
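Since the role pulls report data from BigQuery among other sources, here is a minimal sketch using the official google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical.

```python
# Hypothetical KPI pull from BigQuery for a market report.
from google.cloud import bigquery

client = bigquery.Client(project="randstad-reporting")  # hypothetical project id
sql = """
    SELECT country, COUNT(*) AS placements
    FROM `randstad-reporting.kpi.placements`            -- hypothetical table
    WHERE placed_at >= '2024-01-01'
    GROUP BY country
    ORDER BY placements DESC
"""
for row in client.query(sql).result():   # runs the job and waits for rows
    print(row.country, row.placements)
```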

Posted 4 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Dear Aspirants,

We at ValueLabs have an opening for a PL/SQL Lead role. Below is the JD:

Role: PL/SQL Lead
Experience: 6+ years
Location: Hyderabad
Preferably immediate joiners only

Responsibilities
- Minimum 6+ years of experience in PL/SQL programming and database development.
- Design, develop, and test complex PL/SQL procedures, functions, packages, triggers, views, and materialized views.
- Write complex SQL queries using joins, analytical functions, and aggregate functions.
- Optimize query performance using indexing, partitioning, and other techniques.
- Design and implement efficient collections-based solutions using PL/SQL; use BULK COLLECT and bulk binding to improve performance and reduce memory usage (see the sketch after this listing).
- Debug and troubleshoot complex PL/SQL code using tools like SQL Developer, TOAD, or SQL*Plus; identify and resolve issues related to performance, syntax, and logical errors.
- Identify and address performance bottlenecks using hints, indexes, and other techniques.
- Use Oracle's Automatic Workload Repository (AWR) and Active Session History (ASH) to monitor and optimize database performance.
- Team leadership: lead a team of developers to design, develop, and implement complex database applications using PL/SQL.
- Technical expertise: provide technical leadership and guidance to the team on PL/SQL programming, database design, and development methodologies.
- Knowledge sharing and mentorship: share expertise with team members and mentor junior developers to grow their skills in PL/SQL programming and database development.
- Process improvement: identify areas for process improvement and implement changes to improve the team's efficiency and productivity.
- Communication: communicate effectively with stakeholders, including business leaders, project managers, and team members, to ensure that project requirements and expectations are met.

Interested candidates, please share your profile with imranmohammed.1@valuelabs.com
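The emphasis on BULK COLLECT and bulk binding is fundamentally a set-based batching idea. Since the examples on this page use Python, here is a hedged analogue with the python-oracledb driver: batch reads via arraysize/fetchmany and bulk inserts via executemany. This illustrates the concept rather than PL/SQL itself; connection details and table names are hypothetical.

```python
# Python analogue of PL/SQL "BULK COLLECT ... LIMIT" + FORALL: batched reads
# and batched inserts instead of row-by-row round trips. Names are placeholders.
import oracledb

conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")  # assumed
read_cur = conn.cursor()
read_cur.arraysize = 1000            # batch size, akin to BULK COLLECT LIMIT 1000
write_cur = conn.cursor()

read_cur.execute("SELECT id, amount FROM staging_orders")   # hypothetical table
while True:
    batch = read_cur.fetchmany()     # fetches up to arraysize rows per round trip
    if not batch:
        break
    # executemany is the bulk-bind analogue: one round trip per batch
    write_cur.executemany("INSERT INTO orders (id, amount) VALUES (:1, :2)", batch)

conn.commit()
conn.close()
```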

Posted 4 days ago

Apply

3.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Experience: 3 to 10 years

Required Qualifications: Data Engineering Skills
- 3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
- Capable of building and maintaining ELT pipelines using dbt and Snowflake, with guidance on architecture and best practices.
- Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/dimensional).
- Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
- Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
- Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
- Experience using Git for version control and exposure to CI/CD workflows in team environments.
- Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
- Working knowledge of Python for basic automation and data manipulation tasks.
- Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices like GDPR.

Key Responsibilities
- Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files (see the sketch after this listing).
- Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
- Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.
- Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
- Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
- Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
- Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
- Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Skills: workflow orchestration, Git, Airflow, SQL, GCS, ELT pipelines, Azure Data Lake, data modeling, CI/CD, dbt, cloud storage, Snowflake, data security, Python, SAP Data Services, data engineering, AWS S3
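The role schedules dbt runs from orchestration tools. As a hedged sketch, dbt-core (1.5+) exposes a programmatic dbtRunner entry point that a scheduler task could call; the layer selector here is hypothetical.

```python
# Hypothetical orchestration step: build and test the staging layer
# programmatically via dbt-core's dbtRunner (available from dbt-core 1.5).
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()
for args in (["run", "--select", "staging"], ["test", "--select", "staging"]):
    res: dbtRunnerResult = runner.invoke(args)  # same as `dbt run/test --select staging`
    if not res.success:
        raise SystemExit(f"dbt step failed: {' '.join(args)}")
print("staging layer built and tested")
```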

Posted 4 days ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: Manager
Department: Finance
Sub-Department: CMA Ships - Finance
Reporting: Finance Controller / CFO

Role Summary: Supervision and reporting of agency accounting, corporate/trade vendor payments, and internal finance control.

Core Responsibilities:
- Monthly management reporting of profit & loss with variance explanations, and other MIS reporting per management requirements, within deadlines.
- Controlling and monitoring opex costs.
- Completion of statutory, group, and HO internal audits within timelines; coordinate with statutory auditors for query resolution and with other departments for smooth completion of the HO internal audit.
- Responsible for Internal Finance Control (IFC) activities for HO and statutory requirements.
- Monitor and coordinate monthly reporting to HO; coordinate with GBS and HO to ensure expected deliverables.
- Responsible for corporate/trade vendor invoice booking and payments, petty cash accounting, and GL activities, including timely completion of bank reconciliations, accounting of fixed-asset transactions and maintenance of the fixed-assets register, calculation of depreciation, and calculation and accounting of agency remuneration in coordination with the GBS team.
- Preparation of the yearly PAN-India agency budget (CAPEX & OPEX) in coordination with all stakeholders and concerned departments.
- Periodic ledger scrutiny of P&L and BS accounts and query resolution.
- Active involvement in new HO projects related to agency accounting, in finance activities for other group entities, and in HO projects generally.

Key Performance Indicators:
- Submission of various MIS reports within deadlines.
- Monitoring internal finance controls.
- Timely finalization of statutory/group/internal audits.

Qualifications and Skill Sets:
- Chartered Accountant with good academic knowledge of accounts and taxation.
- 5-7 years of experience in finalization of accounts; related industry experience.
- Good communication, analytical, interpersonal, and managerial skills.
- Good IT skills; knowledge of MS Office and SAP-based accounting software.
- Experience handling a team of at least 5-6 members.

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Summary
Skill Name: Power BI with GCP Developer
Experience: 7-10 years
Mandatory Skills: Power BI + GCP (BigQuery)

Required Skills & Qualifications:
- Power BI expertise: strong hands-on experience in Power BI development, including report/dashboard creation, DAX, Power Query, and custom visualizations.
- Semantic model knowledge: proficiency in building and managing semantic models within Power BI to ensure consistency and user-friendly data exploration.
- GCP tools: practical experience with Google Cloud Platform tools, particularly BigQuery, Dataflow, and Cloud Storage, for managing large datasets and data integration.
- ETL processes: experience in designing and managing ETL (Extract, Transform, Load) processes using GCP services.
- SQL & data modeling: solid skills in SQL and data modeling, particularly for BI solutions and creating relationships between different data sources.
- Cloud data integration: familiarity with integrating cloud-based data sources into Power BI, including best practices for handling cloud storage and data pipelines.
- Data analysis & troubleshooting: strong problem-solving abilities, including diagnosing and resolving issues in data models, reports, or data integration pipelines.
- Communication & collaboration: excellent communication skills to work effectively with cross-functional teams.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Dear Candidates,

TCS is looking for an S/4HANA BW Datasphere consultant.

Experience: 7-10 years
Location: Gurgaon

Role:
- Strong technical skills in SAP BW/4HANA, SAP Datasphere, and related technologies like SAP Analytics Cloud (SAC).
- Ability to design, implement, and support data warehousing and reporting solutions, including data integration, modeling, and visualization.

Key Skills and Responsibilities
- Data modeling and design: designing and implementing data models, data flows, and data integration processes within SAP Datasphere, including creating CDS views and query views.
- Data integration: implementing data extraction, transformation, and loading (ETL) processes, handling data from various sources (SAP and non-SAP), and ensuring data quality and consistency.
- Datasphere expertise: developing and maintaining Spaces, Local Tables, Views, Data Flows, Replication Flows, Transformation Flows, DAC, Task Chains, and Intelligent Lookup within SAP Datasphere.
- Reporting and analytics: building KPIs, creating analytical models, and developing dashboards and reports using SAP Analytics Cloud (SAC) and other relevant tools.
- Technical skills: strong ABAP, AMDP, SQL, and Python skills, with experience in HANA views, AMDP procedures, and hybrid architecture.
- Solution design and implementation: designing and implementing data warehousing and reporting solutions, collaborating with cross-functional teams to gather functional requirements, and leading solution architecture workshops.
- Problem-solving and communication: excellent problem-solving, communication, and collaboration skills to troubleshoot issues, document solutions, and work effectively within teams.
- Experience: 2-4 years of hands-on experience with SAP Datasphere, 5+ years with SAP BW/4HANA, and experience with SAP S/4HANA or ECC functional areas and data models.
- Other skills: knowledge of BW Bridge, SAP BusinessObjects, and various SAP S/4HANA or ECC functional areas.

Posted 4 days ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Role Expectations:
- Design, develop, and execute automated tests to ensure product quality in digital transformation initiatives.
- Collaborate with developers and business stakeholders to understand project requirements and define test strategies.
- Implement API testing using Mockito, WireMock, and stubs for effective validation of integrations (see the sketch after this listing).
- Utilize Kafka and MQ to test and monitor real-time data streaming scenarios.
- Perform automation testing using RestAssured, Selenium, and TestNG to ensure smooth delivery of applications.
- Leverage Splunk and AppDynamics for real-time monitoring, identifying bottlenecks, and diagnosing application issues.
- Create and maintain continuous integration/continuous deployment (CI/CD) pipelines using Gradle and Docker.
- Conduct performance testing using tools like Gatling and JMeter to evaluate application performance and scalability.
- Participate in test management and defect management processes to track progress and issues effectively.
- Work closely with onshore teams and provide insights to enhance test coverage and overall quality.

Qualifications:
- 9+ years of relevant experience in QA automation.
- Java programming: strong experience with Java 8 and above, including a deep understanding of the Streams API.
- Frameworks: proficiency in Spring Boot and JUnit for developing and testing robust applications.
- API testing: advanced knowledge of RestAssured and Selenium for API and UI automation; candidates must demonstrate hands-on expertise.
- CI/CD tools: solid understanding of Jenkins for continuous integration and deployment.
- Cloud platforms: working knowledge of AWS for cloud testing and deployment.
- Monitoring tools: familiarity with Splunk and AppDynamics for performance monitoring and troubleshooting.
- Defect management: practical experience with test management tools and defect tracking.
- Build & deployment: experience with Gradle for build automation and Docker for application containerization.
- SQL: strong proficiency in SQL, including query writing and database operations for validating test results.
- Domain knowledge: prior experience in the payments domain with a good understanding of domain-specific workflows.

Nice to Have:
- Data streaming tools: experience with Kafka (including basic queries and architecture) or MQ for data streaming testing.
- Financial services or payments domain experience preferred.
- Frameworks: experience with Apache Camel for message-based application integration.
- Performance testing: experience with Gatling and JMeter for conducting load and performance testing.
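The posting names RestAssured/TestNG (Java) for API automation; since the sketches on this page use Python, here is the same given/when/then idea with requests and pytest against a hypothetical payments endpoint. The URL, payload, and response schema are assumptions.

```python
# Hypothetical API contract test in the RestAssured style, written with
# requests + pytest; endpoint and schema are placeholders.
import requests

BASE = "https://payments.example.com/api"  # hypothetical service under test

def test_create_payment_returns_201_and_id():
    payload = {"amount": 150.0, "currency": "INR", "method": "UPI"}  # given
    resp = requests.post(f"{BASE}/payments", json=payload, timeout=5)  # when
    # then: status and response-schema assertions
    assert resp.status_code == 201
    body = resp.json()
    assert "payment_id" in body
    assert body["currency"] == "INR"
```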

Posted 4 days ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Role Overview: We are looking for a highly motivated and detail-oriented Co-Analyst to join our Institutional Equities Research team, focusing on the Auto Ancillaries sector. The role involves deep sectoral research, financial modelling, interaction with industry stakeholders, and supporting investment ideas for institutional clients.

Key Responsibilities:
- Track and maintain detailed coverage of companies in the auto ancillary space.
- Build and update financial models, forecasts, and valuation metrics.
- Analyze industry trends, competitive dynamics, regulatory developments, and key performance indicators.
- Prepare high-quality research reports including company updates, sector notes, thematic deep-dives, and event-based commentary.
- Support the lead analyst in preparing client presentations and investment theses.
- Coordinate with corporates for management interactions and plant visits.
- Assist in maintaining research databases and ensuring timely updates.
- Engage with sales and clients for query resolution and data requests.

Desired Skills & Qualifications:
- Graduate/postgraduate in Finance, Economics, Commerce, or Engineering; MBA/CFA/CA preferred.
- Minimum 3 years of experience in equity research, consulting, investment banking, or a related analytical role (freshers with relevant internships may also be considered).
- Strong understanding of financial statements and valuation methodologies.
- Interest in the auto and manufacturing ecosystem: value chain, technology shifts, and global benchmarks.
- Excellent written and verbal communication skills.
- Proficient in Excel and PowerPoint; familiarity with Bloomberg/Reuters an advantage.
- Ability to work independently in a high-performance, deadline-driven environment.

Why Join Us:
- Opportunity to be part of a dynamic and fast-growing research team.
- Exposure to institutional investors and senior management of listed companies.
- Learning-driven culture with strong mentorship and ownership.

Posted 4 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Company Description: Team Geek Solutions (TGS) is a global technology partner based in Texas, specializing in AI and Generative AI solutions, custom software development, and talent optimization. TGS offers a range of services tailored to industries like BFSI, Telecom, FinTech, Healthcare, and Manufacturing. With expertise in AI/ML development, cloud migration, software development, and more, TGS helps businesses achieve operational efficiency and drive innovation.

Note: Only immediate joiners, or candidates serving 0-15 days of notice, should apply. Strong proficiency and at least 6+ years of experience in ServiceNow required.

Job Description: We are seeking a talented and motivated ServiceNow Developer to join our team and play a key role in designing, developing, and implementing innovative solutions on the ServiceNow platform. You will collaborate with business stakeholders to understand their needs and translate them into effective technical solutions using the ServiceNow platform.

Responsibilities
- Collaborate with business analysts and stakeholders to understand business requirements, user stories, and workflows; translate business requirements into technical specifications for ServiceNow development.
- Develop, configure, and customize ServiceNow modules (e.g., Incident Management, Change Management, Problem Management) using ServiceNow development tools (GlideScript, UI Builder).
- Create and maintain integrations between ServiceNow and other enterprise applications using REST APIs (see the sketch after this listing).
- Write clean, efficient, maintainable, and well-documented code adhering to ServiceNow best practices.
- Conduct unit, integration, and regression testing to ensure the functionality, performance, and security of developed solutions.
- Troubleshoot and resolve technical issues within ServiceNow applications, leveraging debugging techniques and knowledge of ServiceNow logs.
- Participate in code reviews to ensure quality and adherence to coding standards.
- Document technical designs, configurations, and code for future reference and knowledge transfer.
- Stay up to date with the latest ServiceNow features, updates, and security patches through continued learning and relevant training.
- Assist with the development and implementation of ServiceNow knowledge base articles and training materials.

Desired hands-on experience
- CMDB: CMDB Workspace, CMDB Data Manager, CI lifecycle management, CMDB Query Builder, IRE, CMDB health (the 3 Cs: completeness, compliance, correctness), CI audit, CMDB compliance, and CSDM implementation; configuring Configuration/Service/Group views so stakeholders can monitor CMDB health; IRE rules and troubleshooting across configuration classes; health inclusion setup; relationship creation based on accountable/responsible ownership of devices and services; CMDB modelling activities.
- Discovery: ensuring GCP discovery runs cleanly day to day and troubleshooting as required; end-to-end discovery, service mapping, and cloud management implementations (e.g., implementing Discovery for H&M and resolving issues across the discovery module); managing on-premises and cloud discovery schedules.
- Service Mapping: setting up 50+ application service mappings with end-to-end troubleshooting, including tag-based service mapping, machine learning, and pattern-based mapping (custom pattern creation based on application and organizational requirements).
- Greenfield implementation: setting up an ITOM project, foundation data, and integrations (SCCM, LDAP) on a new ServiceNow instance across dev/test/prod environments.
- ITOM operations: CMDB issues, discovery issues, service/service-offering issues, CMDB relationships, data propagation jobs, and other foundation data issues.
- Other areas: ITSM, catalog development, reporting and dashboarding, asset management, and Performance Analytics; familiarity with monitoring tools such as BMC, SolarWinds, and Nagios.

Skills: APIs, ITSM, UI Builder, GlideScript, catalog development, reporting, Discovery, BMC, cloud, CMDB, REST, machine learning, Service Mapping, SolarWinds, Nagios, dashboarding, ServiceNow, asset management
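The REST integration duties above can be illustrated with the well-known ServiceNow Table API. Below is a hedged Python sketch that queries configuration items; the instance URL, credentials, table, and query are all hypothetical.

```python
# Hypothetical read from the ServiceNow Table API (/api/now/table/{table}).
import requests

INSTANCE = "https://dev12345.service-now.com"  # hypothetical instance
AUTH = ("integration.user", "secret")          # hypothetical basic-auth credentials

resp = requests.get(
    f"{INSTANCE}/api/now/table/cmdb_ci_server",
    params={"sysparm_limit": 5, "sysparm_query": "operational_status=1"},
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()
for ci in resp.json()["result"]:               # Table API wraps rows in "result"
    print(ci["name"], ci["sys_id"])
```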

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Description - Roles & Responsibilities:
- Good understanding of and writing skill in SQL code.
- Perform data analytics and ETL development.
- Perform descriptive analytics and reporting.
- Perform peer code reviews and produce design documents and test cases.
- Support systems currently live and deployed for customers.
- Build knowledge repository and cloud capabilities.
- Excellent troubleshooting and attention to detail in a fast-paced setting.
- Excellent communication skills are mandatory to work directly with the client.
- Work as part of a team of engineers/consultants that globally ensures customer support.
- Understanding of Agile.

Job Title: GCP Python
Key Skills: GCP Cloud Storage, Dataproc, BigQuery, SQL (strong SQL and advanced SQL), Spark/PySpark writing skills, DWH, Python, Git, any GCP certification
Job Locations: Any Virtusa location
Experience: 4-6 years
Education Qualification: Any graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 10 days
Payroll: People Prime Worldwide

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: Join our customer's team as a Software Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.

Key Responsibilities:
• Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
• Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
• Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
• Integrate machine learning models into production-grade backend systems powering innovative AI features.
• Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
• Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability (see the sketch after this listing).
• Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.

Required Skills and Qualifications:
• Proficient in Python for backend development with strong coding standards.
• Practical experience with Databricks and PySpark in live production environments.
• Advanced knowledge of MySQL database design, query optimization, and maintenance.
• Solid foundation in machine learning concepts and deploying ML models in backend systems.
• Experience utilizing Redis for effective caching and state management.
• Outstanding written and verbal communication abilities with strong attention to detail.
• Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.

Preferred Qualifications:
• Background in high-growth AI/ML or complex data engineering projects.
• Familiarity with additional backend technologies or cloud-based platforms.
• Experience mentoring or leading technical teams.

Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
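The Redis responsibility above is typically the cache-aside pattern. A minimal sketch, assuming the redis-py client and hypothetical key names, TTL, and backend function:

```python
# Hypothetical cache-aside lookup: serve from Redis if present, otherwise
# compute (e.g., MySQL + model call) and cache with a TTL.
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)  # assumed local Redis

def compute_features(user_id: str) -> dict:
    # Stub standing in for the real expensive backend call.
    return {"user_id": user_id, "score": 0.42}

def get_user_features(user_id: str, ttl_seconds: int = 300) -> dict:
    key = f"features:{user_id}"                     # hypothetical key scheme
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)                   # cache hit
    features = compute_features(user_id)            # cache miss: compute
    r.setex(key, ttl_seconds, json.dumps(features)) # store with expiry
    return features
```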

Posted 4 days ago

Apply

3.0 years

0 Lacs

India

On-site


Job Title: BI Engineer - Amazon QuickSight Developer

Job Summary: We are seeking an experienced Amazon QuickSight Developer to join our BI team. This role requires deep expertise in designing and deploying intuitive, high-impact dashboards and managing all aspects of QuickSight administration. You'll collaborate closely with data engineers and business stakeholders to create scalable BI solutions that empower data-driven decisions across the organization.

Key Responsibilities

Dashboard Development & Visualization
- Design, develop, and maintain interactive QuickSight dashboards using advanced visuals, parameters, and controls.
- Create reusable datasets and calculated fields using both SPICE and Direct Query modes.
- Implement advanced analytics such as level-aware calculations, ranking, period-over-period comparisons, and custom KPIs.
- Build dynamic, user-driven dashboards with multi-select filters, dropdowns, and custom date ranges.
- Optimize performance and usability to maximize business value and user engagement.

QuickSight Administration
- Manage users, groups, and permissions through QuickSight and AWS IAM roles.
- Implement and maintain row-level security (RLS) to ensure appropriate data access.
- Monitor usage, SPICE capacity, and subscription resources to maintain system performance.
- Configure and maintain themes, namespaces, and user interfaces for consistent experiences.
- Work with IT/cloud teams on account-level settings and AWS integrations.

Collaboration & Data Integration
- Partner with data engineers and analysts to understand data structures and business needs.
- Integrate QuickSight with AWS services such as Redshift, Athena, S3, and Glue.
- Ensure data quality and accuracy through robust data modeling and SQL optimization.

Required Skills & Qualifications
- 3+ years of hands-on experience with Amazon QuickSight (development and administration).
- Strong SQL skills and experience working with large, complex datasets.
- Expert-level understanding of QuickSight security, RLS, SPICE management, and user/group administration.
- Strong sense of data visualization best practices and UX design principles.
- Proficiency with AWS data services including Redshift, Athena, S3, Glue, and IAM.
- Solid understanding of data modeling and business reporting frameworks.

Nice to Have:
- Experience with Python, AWS Lambda, or automating QuickSight administration via SDK or CLI (see the sketch after this listing).
- Familiarity with modern data stack tools (e.g., dbt, Snowflake, Tableau, Power BI).

Apply Now: If you're passionate about building scalable BI solutions and making data come alive through visualization, we'd love to hear from you!
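For the SDK-based administration the nice-to-have bullet mentions, here is a minimal boto3 sketch that audits the dashboards in an account; the AWS account id and region are hypothetical.

```python
# Hypothetical QuickSight administration audit: list all dashboards via boto3.
import boto3

qs = boto3.client("quicksight", region_name="us-east-1")  # assumed region
ACCOUNT = "123456789012"                                  # hypothetical account id

# Paginate so accounts with many dashboards are fully covered.
for page in qs.get_paginator("list_dashboards").paginate(AwsAccountId=ACCOUNT):
    for dash in page["DashboardSummaryList"]:
        print(dash["DashboardId"], dash["Name"])
```

The same client exposes user/group and RLS-adjacent operations (e.g., list_users, describe_data_set), so a small script like this is a common starting point for automating the administrative duties listed above.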

Posted 4 days ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview: NRoot Labs is a cutting-edge technology company that specializes in delivering innovative data and cloud solutions for businesses worldwide. Our team of talented engineers and developers works collaboratively to create scalable and robust applications that drive our clients' success. Our contemporary work culture, coupled with an employee-friendly environment, makes NRoot Labs an amazing place to work.

Job Title: Data Engineer
Location: Chennai
Experience: 4-6 years
Employment Type: Full-time
Educational Qualification: B.Tech/M.Tech/B.Sc/MCA/MS degree

Role Overview: We are looking for a hands-on and technically strong Data Engineer with 4-6 years of experience in data engineering, ETL development, and cloud platforms. The ideal candidate should have deep SQL expertise and a solid background in building scalable data pipelines and architectures. This role involves leading a small team and working closely with cross-functional stakeholders.

Key Responsibilities:
- Lead the design and development of scalable, secure, and high-performance data pipelines.
- Develop and maintain efficient SQL queries, procedures, and database solutions.
- Design and manage ETL processes to integrate data from various sources.
- Work with cloud platforms (Azure or AWS) to implement and support cloud-based data solutions.
- Ensure data accuracy, consistency, and quality across all systems.
- Collaborate with business and technical teams to translate requirements into data solutions.
- Provide technical guidance and mentorship to junior team members.
- Drive best practices in data modelling, coding, performance tuning, and data governance.

Required Skills:
- Strong SQL skills with experience in performance optimization and complex query writing.
- 4-6 years of experience in ETL development (e.g., SSIS or equivalent tools).
- Hands-on experience with cloud data services (Azure or AWS).
- Solid understanding of data warehousing, data modelling, and architecture principles.
- Experience in scripting or automation using Python or similar languages is a plus.
- Proven ability to manage tasks independently and lead small teams.
- Excellent problem-solving and communication skills.

Preferred:
- Cloud certifications (e.g., Azure Data Engineer, AWS Data Analytics).
- Familiarity with CI/CD practices for data pipelines.
- Exposure to data lake, delta lake, or big data ecosystems.

Perks of working at NRoot Labs: From a competitive salary recognizing your hard work and talent to flexible work-life balance, comprehensive health insurance coverage, wellness programs, limitless growth opportunities, a fun-filled work environment, yummy snacks and beverages, and casual dress codes, we strive to create an environment where you can thrive both personally and professionally!

Posted 4 days ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About Us: Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

About the role: Evangelize and demonstrate the value and impact of analytics for informed business decision-making by developing and deploying analytical solutions and providing data-driven insights to business stakeholders to understand and solve various business nuances.

Responsibilities:
1. Work closely with Product and Business stakeholders to empower data-driven decision-making and generate insights that will help grow the key metrics.
2. Write SQL/Hive queries for data mining.
3. Perform deep data analysis in MS Excel and share regular actionable insights.
4. Perform data-driven analytics to generate business insights.
5. Automate regular reports/MIS using tools like Hive and Google Data Studio, coordinating with different teams.
6. Follow up strongly with concerned teams to make sure our business and financial metrics are met.
7. Look at data from various cuts/cohorts to suggest insights: analysis based on multiple cohorts (transactions, GMV, revenue, gross margin, users, etc.) for both offline and online payments (see the sketch after this listing).

Mandatory Technical Skills:
1. Distinctive problem-solving and analysis skills, combined with impeccable business judgment.
2. Proficient in SQL/Hive/data mining and business analytics; proficient in Microsoft Excel.
3. Ability to derive business insights from data with a focus on driving business-level metrics.

Eligibility Criteria:
1. Minimum 2 years of experience as a Data Analyst / Business Analyst.
2. Ability to interact with and convince business stakeholders.
3. Hands-on with SQL (sub-queries and complex queries), Excel / Google Sheets, and data visualization tools (Looker Studio, Power BI).
4. Ability to combine structured and unstructured data.
5. Experience working on large datasets, on the order of 5 million records.
6. Experimentative mindset with attention to detail.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
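For the cohort cuts the role describes (e.g., GMV by monthly signup cohort), a minimal pandas sketch; the extract file and column names are hypothetical.

```python
# Hypothetical GMV-by-cohort pivot: rows are monthly signup cohorts,
# columns are transaction months, cells are summed GMV.
import pandas as pd

txns = pd.read_csv("transactions.csv",                 # hypothetical extract
                   parse_dates=["signup_date", "txn_date"])

txns["cohort"] = txns["signup_date"].dt.to_period("M")
txns["txn_month"] = txns["txn_date"].dt.to_period("M")

cohort_gmv = txns.pivot_table(
    index="cohort", columns="txn_month",
    values="gmv", aggfunc="sum", fill_value=0,
)
print(cohort_gmv.head())
```

In practice the same cut would be expressed as a SQL/Hive GROUP BY over the transaction table; the pandas form is just the quickest way to show the shape of the analysis.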

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Company: Our client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services, manufacturing, technology, media, retail, and travel & hospitality. Its industry-specific solutions are designed to address complex business challenges by combining domain expertise with deep technical capabilities, with a global workforce of over 80,000 professionals and a presence in more than 50 countries.

Job Title: Python Developer
Locations: PAN India
Experience: 5-10 years
Employment Type: Contract to hire
Work Mode: Work from office
Notice Period: Immediate to 15 days

Job Description: We are seeking a proactive and skilled Python Developer to support and enhance our existing codebase hosted on Azure Function App. The ideal candidate will have a strong foundation in Python, APIs, and basic data science principles, with the ability to understand and modify machine-learning-driven computations. This role involves maintaining and evolving cloud-hosted Python applications and collaborating with cross-functional teams.

Key Responsibilities:
- Understand and maintain Python code hosted on Azure Function App (see the sketch after this listing).
- Modify and enhance existing code based on business requirements.
- Support basic data analysis and transformation logic using libraries like Pandas, NumPy, scikit-learn, etc.
- Collaborate with DevOps and data engineering teams to ensure seamless deployment and integration.
- Perform basic statistical and mathematical analysis to support data-driven features.
- Troubleshoot and resolve issues in production and development environments (mainly in the Function App and Python code).
- Document code changes and maintain version control using Azure DevOps.

Mandatory Skills:
- Programming: strong proficiency in Python.
- Data science libraries: hands-on experience with Pandas, NumPy, scikit-learn, etc.
- Mathematics & statistics: basic understanding of statistical concepts and mathematical computations.
- Version control: familiarity with Git and Azure DevOps.

Good-to-Have Skills:
- Cloud & DevOps: Azure DevOps pipelines, CI/CD practices.
- Data platforms: experience with Snowflake and SQL-based querying.
- Monitoring & logging: Application Insights, Log Analytics, Kusto Query Language (KQL).
- Machine learning: exposure to ML workflows and model deployment (basic level).
- Security & authentication: understanding of OAuth, API keys, and secure coding practices.
- Cloud functions: experience with Azure Function App (or AWS Lambda).
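To make the Function App duties concrete, here is a minimal sketch of an HTTP-triggered Azure Function (v2 Python programming model) performing a small pandas computation; the route name and expected payload are hypothetical.

```python
# Hypothetical HTTP-triggered Azure Function: accepts a JSON list of records
# and returns a basic statistical profile computed with pandas.
import json

import azure.functions as func
import pandas as pd

app = func.FunctionApp()

@app.route(route="summarize", auth_level=func.AuthLevel.FUNCTION)
def summarize(req: func.HttpRequest) -> func.HttpResponse:
    records = req.get_json()                        # expects a JSON array of objects
    df = pd.DataFrame(records)
    summary = df.describe(include="all").to_dict()  # simple descriptive statistics
    return func.HttpResponse(
        json.dumps(summary, default=str),           # default=str handles NaN/Timestamps
        mimetype="application/json",
    )
```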

Posted 4 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Role: React.js with Gen AI

We are looking for strong React.js and Next.js developers with backend Node.js exposure, strong communication skills, and Gen AI experience. We are also looking for a techno-managerial profile able to handle a team.

Required Qualifications:
- 7+ years of professional experience in frontend development with a strong focus on React.js.
- Proficiency in JavaScript (ES6+), TypeScript, HTML5, and CSS3.
- Deep understanding of React Hooks, the Context API, and component-based architecture.
- Proven experience integrating with RESTful APIs and/or GraphQL.
- Demonstrable experience integrating Generative AI components or LLM APIs into web applications (e.g., OpenAI API, Hugging Face Inference API, Vercel AI SDK, LangChain.js).
- Experience with handling API response streaming (e.g., Server-Sent Events).
- Strong experience with state management libraries (e.g., Zustand, Jotai, Redux, TanStack Query).
- Familiarity with modern frontend build pipelines and tools (e.g., Webpack, Vite, Babel, npm/yarn).
- Experience with testing frameworks (e.g., Jest, React Testing Library).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Posted 4 days ago

Apply

4.0 years

0 Lacs

India

Remote


We are looking for an Azure Data Engineer with a driving passion to ensure that our customers have the most pleasant experience using our platform. This position will directly contribute to the WoW customer experience by consistently delivering the best quality of work. Our ideal candidate enjoys a work environment that requires strong problem-solving skills and independent self-direction, coupled with an aptitude for team collaboration and open communication.

Title: Azure Data Engineer
Location: Remote
Shift: 2:00 PM - 11:00 PM IST

Please note: this is a purely Azure-specific role; if your expertise is in AWS/GCP, please do not apply for this role.

Key Responsibilities
- Design and implement robust data pipelines using Azure Data Factory, ensuring seamless data flow across enterprise systems.
- Lead data migration initiatives, translating complex business requirements into efficient and scalable ETL processes.
- Architect and optimize Azure Data Lake solutions to support scalable storage and advanced analytics for big data workloads.
- Develop high-performance data transformation scripts using Python and PySpark, enhancing data processing efficiency (see the sketch after this listing).
- Write complex SQL queries and stored procedures to extract actionable insights from diverse data sources.
- Troubleshoot data integration issues and optimize system performance using advanced problem-solving techniques.
- Collaborate with cross-functional teams to align data engineering solutions with business objectives, demonstrating strong ownership and communication skills.
- Stay up to date with emerging Azure technologies and implement innovative solutions, embodying a continuous-learning mindset.
- Deliver data-driven insights through innovative data modeling techniques to support informed decision-making.
- Mentor junior engineers, promoting a culture of excellence and continuous improvement in data engineering practices.

Foundational Skills
- Azure Data Factory: proven expertise in designing, implementing, and managing scalable data integration workflows.
- Data migration: successful track record in planning and executing large-scale migrations with a focus on data integrity and minimal downtime.
- Azure Data Lake: deep understanding of architecture and best practices for scalable data storage and processing.
- Python: strong programming skills for data manipulation, automation, and engineering workflows.
- PySpark: hands-on experience in distributed data processing for efficient big data handling.
- SQL: advanced skills in query writing, data manipulation, and performance tuning across multiple database platforms.
- Analytical thinking: exceptional ability to resolve complex engineering challenges with scalable, efficient solutions.

About Techolution: Techolution is a leading innovation consulting company on track to become one of the most admired brands in the world for "innovation done right". Our purpose is to harness our expertise in novel technologies to deliver more profits for our enterprise clients while helping them deliver a better human experience for the communities they serve. With that, we are now fully committed to helping our clients build the enterprise of tomorrow by making the leap from Lab Grade AI to Real World AI. In 2019, we won the prestigious Inc. 500 Fastest-Growing Companies in America award, only 4 years after our formation. In 2022, Techolution was honored with the "Best-in-Business" title by Inc. for "Innovation Done Right". Most recently, we received the "AIConics" trophy for being the Top AI Solution Provider of the Year at the AI Summit in New York.

Let's give you more insights! One of our amazing products with Artificial Intelligence: https://faceopen.com/ - our proprietary and powerful AI-powered user identification system, built on artificial intelligence technologies such as image recognition, deep neural networks, and robotic process automation. (No more touching keys, badges, or fingerprint scanners ever again!)

Some videos you'll want to watch: Life at Techolution; Google Next 2023; Ai4 - Artificial Intelligence Conferences 2023; WaWa - Solving Food Wastage; Saving Lives - Brooklyn Hospital; Innovation Done Right on Google Cloud; Techolution featured on Worldwide Business with Kathy Ireland; Techolution presented by ION World's Greatest.

Visit us @ www.techolution.com to know more about our revolutionary core practices and how we enrich the human experience with technology.
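A hedged sketch of the Python/PySpark transformation script the responsibilities describe, of the kind an Azure Data Factory pipeline might invoke; the ADLS paths and column names are hypothetical.

```python
# Hypothetical PySpark job: deduplicate raw orders from Azure Data Lake,
# aggregate to daily totals, and write the curated result back.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-transform").getOrCreate()

raw = spark.read.parquet(
    "abfss://raw@datalake.dfs.core.windows.net/orders/"  # hypothetical ADLS path
)

clean = (
    raw.dropDuplicates(["order_id"])                 # remove replayed records
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("daily_amount"))   # daily rollup
)

clean.write.mode("overwrite").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/daily_orders/"
)
```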

Posted 4 days ago

Apply

2.0 - 4.0 years

0 Lacs

India

On-site


Requirement:
Experience: 2 to 4 years
Notice period: immediate or 10 days (if shortlisted, the candidate must be prepared to join ASAP; the joining date is fixed and non-negotiable).

• Willingness to work Europe timings (12 noon to 8:30 PM IST).
• Minimum 2 years' experience in HR services and any HR application such as SuccessFactors, SAP GUI, Workday, etc.; proficiency in SAP / SuccessFactors; MS tools like SharePoint and Excel; knowledge of CRM tools like Dynamics, ServiceNow, etc.
• Graduation required; post-graduation (any specialization) will be an advantage.
• Excellent written and verbal English communication (important).
• Exposure to customer relationship management tools will be an added advantage (ServiceNow, CRM, Siebel, etc.).
• Knowledge of MS tools (SharePoint, Excel & PowerPoint).
• Attention to detail and ability to follow guidelines.
• Ability to maintain highly confidential and sensitive information.
• Ability to deliver against agreed objectives/service levels.
• Ability to work effectively in a team and willingness to help others.

Contract Description: We are looking for contract staff for HR Services to work on EMEA-related HR operations tasks and queries. The HR Services Delivery Center team plays a pivotal role in improving the candidate, employee, and manager experience by providing timely and accurate query resolution, onboarding candidates, maintaining accurate HR data in HR systems, and supporting employee lifecycle programs and processes (benefits, rewards, transfers, offboarding, etc.).

Key Accountabilities:
• Maintains efficient service delivery by ensuring transactional requests and assigned inquiries are completed within SLA depending on priority and complexity.
• Responds to and resolves queries in a timely and accurate manner with employee experience at the core.
• Ensures employee HR records are accurately created and/or maintained in HR systems (SAP, SuccessFactors, MS Vacation, etc.).
• Takes complete ownership of closing data administration requests, including following up with the requestor to collect missing information and/or communicating approval requirements.
• Maintains and follows the desktop procedures / KB articles defined for every transaction/query.
• Ensures the maker-checker process is followed and data monitoring is done to ensure high quality of data in all HR tools.
• Works in a highly data-sensitive environment, responsible for always protecting data privacy and adhering to confidentiality requirements to promote zero breach of compliance policies.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


TCS presents an excellent opportunity for an SAP HCM Consultant.

Job Title: SAP HCM Consultant
Location: Thane or Pune
Experience Range: 8 years & above

Must-Have: OM, PA, ESS, MSS, Payroll

Experience Criteria:
- 8+ years of experience in SAP HR.
- Minimum 2 end-to-end implementations.
- Multiple support projects.

Required Skills:
- Strong SAP experience with hands-on SAP HR configuration experience.
- Very good configuration and design knowledge in the following areas: Organizational Management; Personnel Administration; Employee Self-Service and/or Manager Self-Service (good exposure to the portal and related functionalities); HR Fiori custom applications (experience in front-end design and back-end service approaches); Compensation; basics of Benefits Administration; basics of Time Management and Payroll; basics of SuccessFactors support.
- Knowledge of payroll interfaces and FI integration; experienced in building interfaces to support data transfer to and from multiple vendors.
- Ability to create training materials and user guides.
- Well versed with tools like ABAP Query and LSMW.
- Conversant with preparation of functional specifications for ABAP developments, including objects like reports, user exits, layouts, and other functional enhancements/interfaces.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Greetings from Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd. We have the below requirement for a reputed MNC in Bangalore and require immediate joiners.

Required Skills:
• Experience in handling large data volumes.
• Strong skills in SQL coding, writing complex queries, and dashboarding.
• Experience in Python; any other analytics tools will be a plus.
• 5+ years of experience in data analytics.
• SQL, Python, Power BI, QV, data analytics.
• Good problem-solving skills and an analytical mindset.
• Ability to assess the performance of data pipelines for latency and query run times.
• Ability to work in English with strong written and oral communication skills.

Total Experience: 5+ years
Location: Bangalore
Mode of Interview: Virtual
Work Mode: Work from office
Notice Period: Immediate to max 15 days (if serving notice period)

Share your resume with Junnaid Khurshid at junaid.k@twsol.com

About Us: Teamware Solutions, a business division of Quantum Leap Consulting Private Limited (www.teamwaresolutions.net), offers cutting-edge industry solutions for deriving business value from our clients' IT initiatives. Offering deep domain expertise in banking, financial services and insurance, oil and gas, infrastructure, manufacturing, retail, telecom, and healthcare, Teamware leads its service in offering skills augmentation and professional consulting services. An industry leader for the past 20 years, with over 4,000 professionals deputed across India, the USA, the Middle East, and APAC, our major clients include captive IT units in India, product companies, and IT services firms. We are headquartered in Chennai, with branches in Bangalore, Hyderabad, and Pune, and cater to clients spread across pan India.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About KPMG in India
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Qualifications: Chartered Accountant/ACCA/CPA with 4+ years of experience in GAAP/IFRS financial reporting and multi-entity consolidation

Key responsibilities
• Compile and analyze operational and financial data and reports, post journal entries in the ERP system, and prepare and maintain accurate workpapers and reconciliations.
• Execute month-end, quarter-end and year-end closing activities and support group-level financial consolidation in accordance with applicable GAAP/IFRS standards.
• Prepare financial statements and group reporting packages in alignment with group reporting requirements.
• Conduct variance analysis to identify anomalies, investigate routine discrepancies, and ensure timely resolution with appropriate documentation.
• Perform general ledger reconciliations and analyze open items with supporting documentation.
• Prepare direct and indirect tax filings and assist in liaising with regulatory authorities for timely query resolution.
• Ensure comprehensive documentation and adherence to internal policies, controls and SOX compliance requirements.
• Coordinate with and support internal, statutory, tax and other auditors to resolve audit queries.
• Identify and recommend opportunities for process automation, contributing to operational excellence.

Skills required
• In-depth understanding of the applicable GAAP/IFRS standards.
• Proven experience in managing month-end close processes, including account reconciliation, journal entries, and financial risk mitigation.
• Skilled in conducting internal audits, implementing robust internal controls, and ensuring regulatory compliance.
• Advanced proficiency in Microsoft Excel for financial modeling, data analysis, and troubleshooting complex accounting issues.
• A collaborative mindset and the ability to perform under pressure while maintaining high standards of accuracy, timeliness and efficiency.
• Proficient in the use of ERP systems (like MSBC) and the Microsoft Office Suite (Excel, Word, Outlook).
• Excellent verbal and written communication skills with a strong ability to engage stakeholders.
• Professional accounting qualifications like CA/CPA/ACCA will be preferred.

Equal Opportunity statement
KPMG India and CISC have a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India and CISC value diversity, and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.

Posted 4 days ago

Apply

3.0 years

0 Lacs

India

Remote

Title: Azure Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
• Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
• Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
• Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For
• 3+ years' experience in data engineering or analytics engineering
• Hands-on experience with cloud data platforms and large-scale data processing
• Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
• Minimum 3 years of experience with modern data engineering, data warehousing and data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
• 5 years of proven experience with SQL, schema design and dimensional data modelling
• Solid knowledge of data warehouse best practices, development standards and methodologies
• Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
• Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
• An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment
• Excellent communication and teamwork abilities

Nice-to-Have Skills:
• Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services and Cosmos DB
• SAP ECC/S/4 and HANA knowledge
• Intermediate knowledge of Power BI
• Azure DevOps and CI/CD deployments, cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, disability, or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
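To give a concrete feel for the pipeline work described above, here is a minimal sketch of a PySpark job that reads raw files, cleans them, and writes a lakehouse table. It is illustrative only: the paths, column names, and table name are hypothetical, and the Delta output assumes a Databricks or Delta Lake-enabled environment.

```python
# Minimal PySpark pipeline sketch: raw CSV -> cleaned Delta table.
# All paths, columns, and table names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Extract: read raw files from a (hypothetical) landing zone.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/orders/")
)

# Transform: drop malformed rows, normalize types, add a load date.
clean = (
    raw.dropna(subset=["order_id", "amount"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("load_date", F.current_date())
)

# Load: write to a Delta table for downstream analytics.
clean.write.format("delta").mode("append").saveAsTable("analytics.orders")
```

In practice a job like this would be scheduled by an orchestrator such as Azure Data Factory, with schema validation and data-quality checks layered into the transform step.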

Posted 4 days ago

Apply

0.0 - 1.0 years

0 Lacs

Mumbai, Maharashtra

On-site

We are looking for a skilled and detail-oriented Power BI Developer to join our team. The ideal candidate will have hands-on experience in designing and developing insightful, high-performance dashboards, reports, and KPIs using Microsoft Power BI. You will play a key role in transforming data into valuable business insights and delivering top-tier analytics solutions to stakeholders.

Key Responsibilities:
• Design and develop interactive reports, dashboards, and KPI scorecards using Power BI.
• Apply strong UI/UX design skills tailored for executive dashboards, including Power BI layout, theming and custom visuals.
• Define and organize measures and dimensions to ensure clarity and consistency.
• Create visually appealing, responsive and intuitive dashboards.
• Migrate dashboards from QlikView to Power BI.
• Build tabular and multidimensional models aligned with data warehouse standards.
• Develop Analysis Services (SSAS) reporting models.
• Connect to various data sources and import and transform data using Power Query/M, DAX, and Power BI tools.
• Implement row-level security and understand application security models within Power BI.
• Perform advanced DAX calculations for data manipulation and insight generation.
• Optimize Power BI performance and troubleshoot dashboard/report issues.
• Translate business needs into data-driven reports and visual storytelling.
• Ensure data governance, quality, and security best practices.
• Document design methodology, technical specifications, and project deliverables.
• Engage with business stakeholders to gather requirements and deliver analytics solutions.
• Analyze large datasets and present actionable insights to client teams.

Mandatory Skills & Experience:
• 3+ years of experience in Power BI report development and BI roles.
• Knowledge of other tools like QlikView and of migration to Power BI.
• Strong knowledge of SQL, query performance tuning, and data modelling.
• Proven experience with the Microsoft BI stack: Power BI, SSAS, SSRS, SSIS.
• Solid understanding of relational and multidimensional database design.
• Familiarity with the full Power BI ecosystem: Power BI Premium, Power BI Service, Power BI Report Server, Power Query, etc.
• Knowledge of presentation tools and the ability to create executive-level presentations.
• Strong analytical thinking, problem-solving skills, and attention to detail.
• Excellent communication skills for stakeholder interaction and requirements gathering.
• Comfortable creating reports from wireframes and functional requirements.
• Strong business acumen and the ability to derive insights that incite action.

Nice to Have (Preferred):
• Understanding of ETL processes, data pipeline architecture, and data warehousing on Azure.
• Experience in distributed systems for data extraction, ingestion, and processing at scale.

Soft Skills:
• Resilient under pressure and deadlines.
• Proactive, self-driven attitude with strong ownership.
• Excellent team collaboration and communication.
• Client-focused mindset with a desire to grow in a dynamic environment.

Job Type: Full-time
Pay: ₹600,000.00 - ₹1,000,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): Post selection, can you join immediately or within 30 days?
Experience: Power BI: 2 years (Required); QlikView: 1 year (Preferred)
Work Location: In person

Posted 4 days ago

Apply

Exploring Query Jobs in India

The job market for query professionals in India is thriving, with a high demand for individuals who are skilled at querying databases and extracting valuable insights from data. Companies across various industries are constantly seeking talented individuals to fill query roles and drive their data-driven decision-making processes.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for query professionals in India varies with experience level:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

Typically, a career in query roles progresses as follows:

  1. Data Analyst
  2. Database Developer
  3. Business Intelligence Developer
  4. Data Engineer
  5. Data Architect

Related Skills

Apart from proficiency in querying databases, individuals in query roles are often expected to have or develop skills in:

  • Data visualization
  • Data manipulation
  • SQL database management
  • ETL (Extract, Transform, Load) processes (a minimal sketch follows this list)
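To make the last item concrete, here is a minimal ETL sketch using only the Python standard library (csv and sqlite3). The file name, columns, and table are hypothetical examples, and the script assumes a sales.csv file exists alongside it.

```python
# Tiny ETL sketch with the Python standard library:
# extract rows from a CSV file, transform them, load into SQLite.
# The file name, columns, and table are hypothetical examples.
import csv
import sqlite3

conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")

with open("sales.csv", newline="") as f:           # Extract
    rows = [
        (r["region"].strip().upper(),              # Transform: normalize text
         float(r["amount"]))                       # Transform: cast to number
        for r in csv.DictReader(f)
        if r["amount"]                             # skip rows with no amount
    ]

conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)  # Load
conn.commit()
conn.close()
```

Production ETL adds incremental loads, error handling, and scheduling on top of this same extract-transform-load shape.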

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the difference between INNER JOIN and OUTER JOIN in SQL. (medium)
  • How do you optimize a query for better performance? (medium)
  • What are indexes in a database and why are they important? (medium)
  • Describe a time when you had to troubleshoot a complex query. (advanced)
  • Write a query to find the second highest salary in a table; see the sketch after this list. (advanced)
  • Explain the ACID properties of a transaction in a database. (medium)
  • What is a subquery and how is it different from a regular query? (medium)
  • How do you handle missing data in a database query? (medium)
  • What is normalization and why is it important in database design? (basic)
  • ... and more
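
As a worked example, the "second highest salary" question above is usually answered with either a subquery or a window function. The following self-contained sketch uses Python's built-in sqlite3 module with made-up sample data to show both forms.

```python
# Self-contained demo of the classic "second highest salary" query,
# using an in-memory SQLite database with made-up sample data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("a", 100), ("b", 300), ("c", 200), ("d", 300)],
)

# Form 1: subquery -- the highest salary strictly below the maximum.
second = conn.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()[0]
print(second)  # 200

# Form 2: window function -- DENSE_RANK over salaries, descending
# (requires SQLite 3.25+ for window function support).
second = conn.execute(
    "SELECT salary FROM ("
    " SELECT salary, DENSE_RANK() OVER (ORDER BY salary DESC) AS rnk"
    " FROM employees)"
    " WHERE rnk = 2 LIMIT 1"
).fetchone()[0]
print(second)  # 200
```

The window-function form generalizes directly to the "Nth highest salary" variant by changing the rank filter, which is a common follow-up in interviews.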

Closing Remark

As you explore opportunities in query roles in India, remember to continuously upskill and stay updated with the latest trends in data querying. Prepare yourself thoroughly for interviews by practicing common query-related questions and showcase your expertise confidently. Best of luck in your job search!
