
1831 Querying Jobs - Page 9

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site


Job Summary
Synechron is seeking a knowledgeable and experienced Senior Business Analyst to join our team. The role involves analyzing business processes, gathering requirements, and facilitating effective communication between stakeholders and technical teams. This position contributes to delivering value-driven solutions aligned with organizational goals, ensuring clarity and efficiency throughout project lifecycles. The ideal candidate will bring a solid understanding of business analysis fundamentals and a proven track record of over 7 years in the field.

Software Requirements
Required Skills:
- Proficiency in MS Office Suite (Word, Excel, PowerPoint) — advanced knowledge
- Experience with documentation management tools (e.g., SharePoint, Confluence)
- Familiarity with modeling tools (e.g., UML, BPMN diagrams)
Preferred Skills:
- Data analysis tools (e.g., Tableau, Power BI)
- Requirements management tools (e.g., JIRA, Rational DOORS)

Overall Responsibilities
- Elicit, analyze, and document business requirements and processes
- Collaborate with stakeholders to understand their needs and translate them into clear requirements
- Facilitate communication between business units and technical teams to ensure clarity and alignment
- Support project teams throughout the software development lifecycle by providing detailed documentation and analysis
- Identify process improvements and recommend solutions that enhance efficiency
- Assist in testing activities and ensure delivered solutions meet defined requirements
- Participate in stakeholder meetings, providing updates on analysis progress and issues
- Ensure project deliverables align with organizational goals and standards

Performance Outcomes
- Accurate and comprehensive requirements documentation
- Smooth collaboration across teams resulting in timely delivery
- Increased stakeholder satisfaction through clear communication and effective solutions
- Enhanced process efficiencies and innovative solutions

Technical Skills (By Category)
- Programming Languages: Not directly required; a basic understanding of scripting or data querying (e.g., SQL) is advantageous
- Databases/Data Management: Basic knowledge of relational databases and data analysis — preferred
- Cloud Technologies: Not mandatory; familiarity with cloud concepts is beneficial
- Frameworks and Libraries: Not applicable
- Development Tools and Methodologies: Requirements management (JIRA, Confluence) — essential; business process modeling (UML, BPMN) — essential; Agile methodologies — preferred
- Security Protocols: Not directly applicable, but an understanding of data privacy and security principles is advantageous

Experience Requirements
- Minimum of 7 years working as a Business Analyst or in related roles
- Strong domain understanding, with exposure to relevant business contexts
- Prior experience in financial services, banking, or similar industries is preferred
- Proven ability to liaise effectively between technical and non-technical stakeholders
- Alternative paths include experience in consulting, process analysis, or project coordination with relevant domain exposure

Day-to-Day Activities
- Gather requirements through interviews, workshops, and documentation review
- Develop detailed business process models and functional specifications
- Conduct requirement reviews with stakeholders and technical teams
- Facilitate communication and clarification of project scope and objectives
- Support system testing and validation activities
- Track requirements status and changes using approved tools
- Participate in project meetings, providing analysis support and progress updates
- Engage stakeholders continuously to ensure alignment and transparency

Qualifications
- Bachelor's degree in Business Administration, Information Systems, or a related field; equivalent professional experience acceptable
- Certifications such as CBAP, CCBA, or PMI-PBA are preferred
- Hands-on experience with requirements elicitation, modeling, and documentation techniques
- Proven track record in managing multiple stakeholder priorities in dynamic environments

Professional Competencies
- Critical thinking and analytical skills to solve complex problems
- Effective communication and active listening to engage diverse stakeholders
- Ability to lead discussions and facilitate workshops
- Adaptability to changing project needs and business landscapes
- Demonstrated organizational and time management skills
- Commitment to continuous learning and process improvement

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice

Posted 4 days ago

Apply

2.0 years

5 - 10 Lacs

India

On-site

Key Responsibilities:
- Application Development: Design and develop enterprise applications using the Joget platform, ensuring robust, scalable, and user-friendly solutions.
- Customization: Customize Joget forms, workflows, plugins, and UI components to meet business requirements.
- Process Automation: Analyze and implement business process automation workflows, enhancing operational efficiency and reducing manual effort.
- Integration: Integrate Joget applications with third-party systems, APIs, and enterprise tools to enable seamless data exchange.
- Performance Optimization: Optimize Joget applications for performance, scalability, and security.
- Collaboration: Work closely with business analysts, project managers, and other stakeholders to gather and refine requirements.
- Testing & Debugging: Conduct thorough testing, troubleshooting, and debugging to ensure application stability and quality.
- Documentation: Maintain comprehensive technical documentation for all development activities.
- Mentorship: Provide guidance and mentorship to junior developers as needed.

Core Technical Skills:
Joget Platform Expertise
- Proficiency in the Joget Workflow platform for designing and developing forms, workflows, data lists, and user views.
- Experience creating and managing custom Joget plugins.
- Expertise in workflow automation and process configuration.
- Knowledge of Joget's built-in components, templates, and modular features.
Programming and Development
- Strong knowledge of Java for back-end customizations and plugin development.
- Proficiency in JavaScript, HTML, and CSS for front-end customizations.
- Experience in SQL for database querying and management.
- Familiarity with XML and JSON for data handling.
Integration and APIs
- Hands-on experience integrating Joget applications with third-party systems using REST and SOAP APIs.
- Knowledge of OAuth, JWT, and other authentication mechanisms for secure integrations.
- Experience handling data exchange between Joget and external systems.
Database Management
- Proficiency in relational databases such as MySQL, PostgreSQL, or Oracle.
- Experience writing and optimizing complex SQL queries.
- Knowledge of database performance tuning and troubleshooting.
Deployment and Infrastructure
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud for Joget deployment.
- Experience with Docker or other containerization tools for application hosting.
- Joget deployment on multiple operating systems and databases.
- Knowledge of CI/CD pipelines and deployment automation using tools like Jenkins or GitHub Actions.
Debugging and Performance Optimization
- Strong skills in troubleshooting Joget applications to identify and resolve issues.
- Experience in performance optimization of Joget workflows and UI components.
- Familiarity with Joget's logging and monitoring tools for system analysis.
Security
- Understanding of application security best practices, including data encryption, role-based access control, and user authentication.
- Familiarity with secure coding practices and compliance standards.

Job Type: Full-time
Pay: ₹500,000.00 - ₹1,000,000.00 per year
Benefits: Flexible schedule, Health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Mohali district, Punjab — reliably commute or plan to relocate before starting work (Required)
Experience: Joget: 2 years (Required)
Work Location: In person

Posted 4 days ago

Apply

0 years

2 - 6 Lacs

Bhubaneshwar

On-site

Key Responsibilities:
Database Installation and Configuration:
- Install, configure, and manage PostgreSQL databases.
- Optimize database performance through tuning and configuration.
- Implement data replication and high availability strategies.
Data Modeling:
- Design and implement PostgreSQL data models that align with application requirements.
- Optimize data schemas for efficient querying and updates.
Performance Tuning:
- Monitor database performance and identify bottlenecks.
- Tune PostgreSQL settings to improve query response times and throughput.
- Optimize data distribution and indexing strategies.
Backup and Recovery:
- Implement and maintain backup and recovery procedures.
- Test disaster recovery plans regularly.
Security & Troubleshooting:
- Implement security measures to protect PostgreSQL data.
- Manage access controls and permissions.
- Monitor for security threats and vulnerabilities.
- Diagnose and resolve PostgreSQL-related issues.
- Analyze logs and metrics to identify problems.
- Provide technical support to development teams.

Requirements
Required Skills and Qualifications:
- Strong understanding of PostgreSQL architecture and concepts.
- Experience with relational databases and data modeling.
- Proficiency in scripting languages (e.g., Bash, Python).
- Knowledge of database administration concepts (e.g., backups, replication, tuning).
- Experience with performance tuning and optimization.
- Problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
Preferred Skills and Qualifications:
- Certifications in PostgreSQL (e.g., PostgreSQL Certified Professional).
- Experience with cloud-based PostgreSQL deployments.
- Familiarity with other relational databases (e.g., MySQL, Oracle).
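
For a concrete picture of the performance-monitoring work described above, here is a minimal illustrative sketch in Python (not part of the posting); it assumes the pg_stat_statements extension is enabled on a PostgreSQL 13+ server and uses a hypothetical connection string:

import psycopg2

# Assumes pg_stat_statements is installed and the server is PostgreSQL 13+,
# where the timing columns are *_exec_time (older versions use total_time).
QUERY = """
    SELECT query, calls, mean_exec_time
    FROM pg_stat_statements
    ORDER BY mean_exec_time DESC
    LIMIT 10;
"""

def report_slow_queries(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY)
            for query, calls, mean_ms in cur.fetchall():
                print(f"{mean_ms:10.2f} ms avg | {calls:6d} calls | {query[:80]}")

if __name__ == "__main__":
    # Hypothetical connection string, for illustration only.
    report_slow_queries("dbname=appdb user=dba host=localhost")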

Posted 4 days ago

Apply

2.0 years

10 Lacs

India

On-site

- Android development: Java and Kotlin
- iOS development: Swift and Objective-C
- JavaScript (for hybrid or cross-platform app development)
- HTML and CSS (for hybrid app development)
- C/C++ (for low-level performance optimizations or interaction with hardware components on specific devices)
- SQL (for managing and querying databases)
- Python (valuable for various scripting tasks, data analysis, and server-side components)
- Ruby (for mobile developers who work on server-side components or backend services)
- PHP (ideal for mobile app developers who deal with server-side scripting or backend services)
- SwiftUI and Combine (iOS) (for creating user interfaces and handling asynchronous operations)
- Android Jetpack (Android) (provides a set of modern tools and libraries to simplify app development)
- Dart (the primary language for building Flutter apps)
- Knowledge of cross-platform frameworks like React Native and Flutter
- Knowledge of mobile app security

Job Type: Full-time
Pay: Up to ₹1,000,000.00 per year
Benefits: Cell phone reimbursement
Schedule: Day shift
Experience: Java: 2 years (Preferred); JavaScript: 2 years (Preferred); SQL: 1 year (Preferred); Flutter: 1 year (Preferred)
Work Location: In person

Posted 4 days ago

Apply

3.0 years

6 - 9 Lacs

Chennai

On-site

Join the Ford HR Management Security & Controls department! Our team is dedicated to ensuring robust and secure user access management for Human Resource (HR) applications in our global environment. We are responsible for the tools and processes that allow HR staff, Ford employees, and suppliers to request and authorize access efficiently and securely. We also maintain critical interfaces that connect our access management systems to various downstream applications. A key focus area for our team is the configuration and management of security roles within our global HR system, Oracle HCM.

Oracle HCM (Human Capital Management) is Ford's comprehensive global HR platform. This includes Core HR processes (like employee data management, promotions, and internal transfers), as well as Compensation, Learning & Development, Talent Management, Recruiting and Payroll.

We are looking for a skilled and experienced IT Analyst/Specialist with deep knowledge of Oracle HCM, particularly its security and access management capabilities. This role is critical to ensuring the integrity and security of our HR data and systems. You will also leverage your skills in SQL and Informatica PowerCenter to support data analysis, reporting, and ETL processes vital to our operations. You'll be joining a dynamic, globally distributed IT team with members located in the US, India, and Germany, collaborating across regions to achieve our shared goals.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
- 3+ years of experience with Oracle HCM, with a strong focus on security configuration and user access management.
- 3+ years of experience with SQL for data querying, analysis, and manipulation.
- Hands-on experience designing, developing, and maintaining ETL processes (e.g., using Informatica IICS).
- Understanding of data security principles and best practices, especially in an HR context.
- Experience troubleshooting complex technical issues related to access, security, or data integration.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills, comfortable working with global teams across different time zones.

Desired Skills:
- Experience with the Oracle HCM security module.
- Experience with other Oracle technologies or modules within HCM (e.g., Oracle BI Publisher).
- Experience working in a large, global enterprise environment.

Responsibilities:
- Configure, manage, and maintain security roles, profiles, and permissions within the global Oracle HCM system, ensuring compliance with security policies.
- Design, develop, and maintain Extract, Transform, Load (ETL) processes using Informatica PowerCenter to move and integrate data from various sources.
- Utilize SQL for data extraction, analysis, and validation.
- Collaborate closely with HR functional teams and other IT teams to understand security and data requirements.
- Ensure implemented solutions adhere to security best practices and internal controls.
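
For illustration only (not part of the posting), a minimal Python sketch of the kind of post-ETL validation mentioned above ("Utilize SQL for data extraction, analysis, and validation"); the connection URL, dialect, and table names are hypothetical assumptions:

import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection; the oracle+oracledb dialect assumes SQLAlchemy 2.x.
engine = create_engine("oracle+oracledb://user:pass@host/hcm")

# Reconcile row counts between a (hypothetical) source table and its
# staging copy after an ETL run.
with engine.connect() as conn:
    src = conn.execute(text("SELECT COUNT(*) FROM src_assignments")).scalar()
    tgt = conn.execute(text("SELECT COUNT(*) FROM stg_assignments")).scalar()

if src != tgt:
    print(f"Row-count mismatch: source={src}, target={tgt}")
else:
    print(f"Validation passed: {src} rows in both tables")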

Posted 4 days ago

Apply

3.0 years

3 - 7 Lacs

Chennai

On-site

Reveleer is a healthcare data and analytics company that uses Artificial Intelligence to give health plans across all business lines greater control over their Quality Improvement, Risk Adjustment, and Member Management programs. With one transformative solution, the Reveleer platform enables plans to independently execute and manage every aspect of enrollment, provider outreach and data retrieval, coding, abstraction, reporting, and submissions, ensuring full compliance with HIPAA and data governance standards. Leveraging proprietary technology, robust data sets, and subject matter expertise, Reveleer provides complete record retrieval and review services so health plans can confidently plan and execute risk, quality, and member management programs to deliver more value and improved outcomes.

WHAT YOU'LL DO:
- Perform end-to-end analysis on healthcare data in tabular format or natural language data, ensuring strict compliance with HIPAA and data privacy regulations
- Independently identify problems, QA data, architect solutions, and conduct analysis to support data-driven decision-making processes
- Collaborate with product managers, data scientists, and data engineers in an Agile and technology-driven environment
- Support senior data analysts and data scientists with data processing
- Generate internal data reports, ensuring clarity and alignment with organizational goals while adhering to HIPAA and data protection regulations
- Present to stakeholders, including senior leadership, ensuring transparency and actionable recommendations

ABOUT YOU:
- At least 3 years of work experience in analytics required, ideally in the healthcare or health plan sector, with a strong understanding of data privacy and governance frameworks such as HIPAA
- Strong analytical skills and critical thinking
- Strong knowledge of Excel calculations and data visualizations
- Strong SQL skills and experience querying large datasets
- Experience with data visualization software (e.g., Power BI and Looker)
- Experience generating automated Power BI reports
- Strong statistical ability to analyze large amounts of data
- Experience with data wrangling and data visualization
- Desire and ability to build trust with business stakeholders, manage the relationship, and socialize insights effectively
- Able to communicate analytical insights to a variety of business stakeholders across different technical levels (including senior leaders)
- Self-motivated and able to work independently
- Able to work to deadlines

NICE TO HAVE:
- Experience in Python
- Experience with cloud platforms (AWS preferred)
- Experience in Quality or Risk Adjustment

Posted 4 days ago

Apply

2.0 years

0 Lacs

Chennai

On-site

The Global Data Insights and Analytics (GDI&A) department at Ford Motor Company is looking for qualified people who can develop scalable solutions to complex real-world problems using Machine Learning, Big Data, Statistics, Econometrics, and Optimization. The candidate should possess the ability to translate a business problem into an analytical problem, identify the relevant data sets needed to address it, recommend, implement, and validate the best-suited analytical algorithm(s), and generate and deliver insights to stakeholders. Candidates are expected to regularly refer to research papers and stay at the cutting edge with respect to algorithms, tools, and techniques. The role is that of an individual contributor; however, the candidate is expected to work in project teams of 2 to 3 people and interact with business partners on a regular basis.

Requirements:
- Master's degree in Computer Science, Operational Research, Statistics, Applied Mathematics, or any other engineering discipline.
- Proficient in querying and analyzing large datasets using BigQuery on GCP.
- Strong Python skills for data wrangling and automation.
- 2+ years of hands-on experience in Python programming for data analysis and machine learning, with libraries such as NumPy, Pandas, Matplotlib, Scikit-learn, TensorFlow, PyTorch, NLTK, spaCy, and Gensim.
- 2+ years of experience with both supervised and unsupervised machine learning techniques.
- 2+ years of experience with data analysis and visualization using Python packages such as Pandas, NumPy, Matplotlib, Seaborn, or data visualization tools like Dash or QlikSense.
- 1+ years of experience with SQL and relational databases.

Responsibilities:
- Understand business requirements and analyze datasets to determine suitable approaches that meet analytic business needs and support data-driven decision-making by the FCSD business team.
- Design and implement data analysis and ML models, hypotheses, algorithms, and experiments to support data-driven decision-making.
- Apply analytics techniques such as data mining, predictive and prescriptive modeling, math, statistics, advanced analytics, and machine learning models and algorithms to analyze data and uncover meaningful patterns, relationships, and trends.
- Design efficient data loading, data augmentation, and data analysis techniques to enhance the accuracy and robustness of data science and machine learning models, including scalable models suitable for automation.
- Research and stay updated in the domains of data science, machine learning, and analytics tools and techniques, and continuously identify avenues for enhancing analysis efficiency, accuracy, and robustness.
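
To make the BigQuery-plus-Python requirement concrete, here is a minimal illustrative sketch (not from the posting); the project, dataset, and table names are hypothetical, and it assumes application-default GCP credentials are configured:

from google.cloud import bigquery
import pandas as pd

def load_claims(limit: int = 10_000) -> pd.DataFrame:
    """Query a hypothetical claims table on GCP and return it as a DataFrame."""
    client = bigquery.Client()  # uses application-default credentials
    sql = f"""
        SELECT claim_id, vehicle_model, claim_amount, claim_date
        FROM `example-project.service.claims`   -- hypothetical table
        WHERE claim_date >= '2024-01-01'
        LIMIT {limit}
    """
    return client.query(sql).to_dataframe()

df = load_claims()
print(df.groupby("vehicle_model")["claim_amount"].describe())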

Posted 4 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Company Overview
Viraaj HR Solutions is a leading recruitment agency specializing in connecting top talent with premier companies across India. Our mission is to facilitate meaningful career opportunities while maintaining a commitment to integrity, excellence, and client satisfaction. Our culture promotes collaboration and innovation, ensuring both clients and candidates receive the highest level of service.

Job Title: Ab Initio Developer
Location: India (On-Site)

Role Responsibilities
- Design and develop ETL processes using Ab Initio.
- Create and manage data workflows to ensure data accuracy and quality.
- Collaborate with data architects and analysts to gather requirements.
- Optimize existing ETL processes for performance improvements.
- Perform debugging and troubleshooting of ETL jobs and workflows.
- Implement data cleansing and transformation processes.
- Monitor and maintain ETL systems and troubleshoot issues as they arise.
- Generate documentation for all ETL processes and workflows.
- Assist in data migration projects and data integration efforts.
- Work closely with business stakeholders to identify data needs.
- Participate in code reviews and ensure best practices are followed.
- Update and maintain metadata repositories as required.
- Provide training and support to junior developers and team members.
- Develop and execute unit test cases to validate ETL processes.
- Stay updated with the latest Ab Initio features and enhancements.

Qualifications
- Bachelor's degree in Computer Science, IT, or a related field.
- 3+ years of experience in Ab Initio development.
- Strong knowledge of ETL processes and data warehousing concepts.
- Proficient in SQL for data manipulation and querying.
- Experience with Unix/Linux operating systems.
- Familiarity with data modeling concepts and practices.
- Ability to work in a fast-paced, collaborative environment.
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Experience with performance tuning of ETL jobs.
- Ability to handle multiple projects simultaneously.
- Keen attention to detail.
- Experience with version control tools such as Git.
- Knowledge of data governance and security practices.
- Ability to work independently with minimal supervision.
- Willingness to learn new technologies and frameworks.

We invite passionate and skilled Ab Initio Developers to join us at Viraaj HR Solutions. This on-site role in India offers an exciting opportunity to work on innovative projects and contribute to impactful data solutions.

Skills: data modeling, version control (git), data governance, performance tuning, sql, troubleshooting, data warehousing, etl, data processing, analytical thinking, team collaboration, data analysis, data workflows, ab initio, etl processes, unix/linux

Posted 4 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview
Viraaj HR Solutions is a leading recruitment consultancy focused on connecting businesses with top talent across various industries. Our mission is to deliver exceptional HR solutions tailored to the unique needs of our clients, contributing to their success through strategic hiring practices. We value integrity, commitment, and excellence in our work culture, ensuring a supportive environment for both our clients and candidates.

Role Responsibilities
- Design and implement robust data pipelines using Python and Pyspark.
- Develop and maintain data models that support organizational analytics and reporting.
- Work closely with data scientists and analysts to understand data requirements and translate them into technical specifications.
- Integrate and maintain Snowflake for data warehousing solutions.
- Ensure data quality and integrity through effective ETL processes.
- Conduct data profiling and performance tuning to optimize system performance.
- Collaborate with cross-functional teams to define data architecture standards and best practices.
- Participate in the creation of documentation for data flows and data management best practices.
- Monitor data pipelines and troubleshoot issues as they arise.
- Implement security measures to protect sensitive data.
- Stay updated with the latest trends and technologies in data engineering.
- Assist in migrating existing data solutions to cloud-based infrastructures.
- Support continuous improvement initiatives around data management.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews and adhere to best practices in software development.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience in data engineering or a related role.
- Proficient in Python programming and the Pyspark framework.
- Experience with Snowflake or similar cloud data warehousing platforms.
- Strong understanding of ETL principles and data integration techniques.
- Solid understanding of database design and data modeling concepts.
- Excellent SQL skills for querying databases and data analysis.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Ability to work collaboratively in cross-functional teams.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Experience with version control systems (e.g., Git).
- Knowledge of Agile methodologies and project management.
- A commitment to continuous learning and professional development.
- Ability to work on multiple projects simultaneously and meet deadlines.

Skills: data architecture, etl, git, problem-solving skills, snowflake, python, data engineering, data warehousing, cloud computing, data integration, sql, data modeling, sql proficiency, pyspark, agile methodologies, cloud platforms (aws, azure, google cloud)
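
As an illustration of the pipeline work this posting describes, a minimal PySpark sketch follows (not from the posting; the paths and column names are hypothetical):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# One cleaning step of a hypothetical pipeline: read raw order events,
# deduplicate, derive a date column, filter bad rows, write a curated table.
spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

orders = spark.read.json("s3://example-bucket/raw/orders/")  # hypothetical path
curated = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)
curated.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")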

Posted 4 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


We are seeking a highly skilled Senior Software Engineer with expertise in backend development using Node.js, NestJS, and TypeScript. The ideal candidate will have a strong background in designing and building scalable systems, working with SQL databases, and deploying solutions on AWS. You will also play a key role in mentoring junior engineers, driving architectural decisions, and debugging complex issues.

Key Responsibilities
- Develop, maintain, and optimize scalable backend services using Node.js, NestJS, and TypeScript.
- Design and implement robust database schemas and queries using SQL databases.
- Architect and build cloud-native applications leveraging AWS services.
- Perform in-depth debugging and troubleshooting to resolve complex production issues.
- Collaborate with cross-functional teams to define and deliver high-quality software solutions.
- Mentor and guide junior engineers to improve their technical skills and code quality.
- Participate in code reviews, design discussions, and architectural decisions.
- Ensure best practices in coding, testing, and deployment.
- Continuously research and adopt new technologies to enhance development efficiency and product performance.

Required Skills & Qualifications
- Extensive experience in Node.js, NestJS, TypeScript, and JavaScript development.
- Strong proficiency with SQL databases (design, optimization, and querying).
- Hands-on experience with AWS cloud services (e.g., EC2, Lambda, S3, RDS).
- Proven ability to debug and resolve complex software issues efficiently.
- Solid understanding of software architecture principles and design patterns.
- Experience mentoring junior developers and fostering team growth.
- Familiarity with CI/CD pipelines, automated testing, and version control systems.
- Excellent problem-solving and communication skills.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of other databases (NoSQL, Redis, etc.).
- Familiarity with containerization (Docker, Kubernetes).
- Experience with Agile methodologies.

At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Location: Ahmedabad
Experience: 4–8 years

Job Overview
We are looking for a detail-oriented and proactive QA Engineer to join our night shift operations team. The ideal candidate will be responsible for ensuring the quality and reliability of our systems, including customer support tools (Freshdesk), search and analytics platforms (OpenSearch), and databases (MongoDB and Dgraph). This role requires hands-on experience in functional, regression, and API testing, as well as a basic understanding of NoSQL systems and graph databases.

Key Responsibilities
- Perform manual and automated testing on web applications, APIs, and backend systems.
- Create, maintain, and execute detailed test cases, test plans, and bug reports.
- Work closely with the support team to reproduce, investigate, and escalate production issues raised via Freshdesk tickets.
- Validate search accuracy and performance in OpenSearch.
- Verify data consistency and integrity in MongoDB and Dgraph environments.
- Monitor nightly builds, conduct smoke/sanity checks, and ensure timely reporting of issues.
- Participate in nightly deployment verifications and post-release testing.
- Collaborate with cross-functional teams including developers, DevOps, and customer success.

Required Skills & Qualifications
- 4+ years of experience in Quality Assurance or Software Testing.
- Experience testing and validating APIs using Postman or similar tools.
- Familiarity with Freshdesk or similar ticketing/support platforms.
- Working knowledge of OpenSearch/Elasticsearch, with an understanding of query validation and search response accuracy.
- Basic querying and validation skills in MongoDB and Dgraph.
- Understanding of JSON, logs, and backend data flows.
- Experience working night shifts or supporting international time zones.
- Strong communication and documentation skills.
- Knowledge of JIRA, TestRail, or equivalent test management tools.

Nice To Have
- Exposure to automation frameworks (e.g., Selenium, Cypress).
- Scripting knowledge (Python, JavaScript, etc.).
- Familiarity with CI/CD pipelines and tools like Jenkins/GitHub Actions.
- Experience in a SaaS or B2B product environment.
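
To make the API-testing requirement concrete, here is a minimal illustrative check in Python using requests (the posting itself names Postman for this work; the endpoint, parameters, and response shape here are hypothetical):

import requests

# Hypothetical base URL of the service under test.
BASE_URL = "https://api.example.com"

def test_search_returns_results():
    # Smoke check: the search endpoint answers 200 and returns at least one hit.
    resp = requests.get(f"{BASE_URL}/search", params={"q": "laptop"}, timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # "hits" is a hypothetical response field, stand-in for the real schema.
    assert "hits" in body and len(body["hits"]) > 0

A check like this can run under pytest in CI alongside the nightly smoke/sanity passes the posting describes.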

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About the Role
We are looking for a skilled and motivated Data Analyst with 2–5 years of experience to join our team. In this role, you will work closely with the product team to support strategic decision-making by delivering data-driven insights, dashboards, and performance reports. Your ability to transform raw data into actionable insights will directly impact how we build and improve our products.

Key Responsibilities
- Collaborate with the product team to understand data needs and define key performance indicators (KPIs)
- Develop and maintain insightful reports and dashboards using Power BI
- Write efficient and optimized SQL queries to extract and manipulate data from multiple sources
- Perform data analysis using Python and pandas for deeper trend analysis and data modeling
- Present findings clearly through visualizations and written summaries to stakeholders
- Ensure data quality and integrity across reporting pipelines
- Contribute to ongoing improvements in data processes and tooling

Required Skills & Experience
- 2–5 years of hands-on experience as a Data Analyst or in a similar role
- Strong proficiency in SQL for querying and data manipulation
- Experience building interactive dashboards with Power BI
- Good command of Python, especially pandas, for data wrangling and analysis
- Experience with Databricks or working with big data tools
- Understanding of Medallion Architecture and its application in analytics pipelines
- Strong communication and collaboration skills, especially in cross-functional team settings

Good to Have
- Familiarity with data engineering practices, including data transformation using Databricks notebooks, Apache Spark SQL for distributed data processing, Azure Data Factory (ADF) for orchestration, and version control using Git
- Exposure to product analytics, cohort analysis, or A/B testing methodologies

Interested candidates, please share your resume with balaji.kumar@flyerssoft.com
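
As one concrete illustration of the pandas work listed above, a minimal KPI sketch (illustrative only; the file name and column names are hypothetical):

import pandas as pd

# Compute a weekly-active-users KPI from a hypothetical events extract,
# the kind of SQL-plus-pandas reporting the posting describes.
events = pd.read_csv("events.csv", parse_dates=["event_time"])
events["week"] = events["event_time"].dt.to_period("W")
wau = events.groupby("week")["user_id"].nunique().rename("weekly_active_users")
print(wau.tail(8))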

Posted 4 days ago

Apply

7.0 - 9.0 years

0 Lacs

New Delhi, Delhi, India

On-site


The purpose of this role is to understand, model and facilitate change in a significant area of the business and technology portfolio, either by line of business, geography or specific architecture domain, whilst building the overall Architecture capability and knowledge base of the company.

Job Description:

Role Overview:
We are seeking a highly skilled and motivated Cloud Data Engineering Manager to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. The GCP Data Engineering Manager will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Key Responsibilities:

Data Engineering & Development:
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
- Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
- Develop and optimize data architectures that support real-time and batch data processing.
- Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build.
- Automate testing, integration, and deployment processes to ensure fast and reliable software delivery.

Cloud Infrastructure Management:
- Manage and deploy GCP infrastructure components to enable seamless data workflows.
- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Infrastructure Automation and Management:
- Design, deploy, and maintain scalable and secure infrastructure on GCP.
- Implement Infrastructure as Code (IaC) using tools like Terraform.
- Manage Kubernetes clusters (GKE) for containerized workloads.

Collaboration and Stakeholder Engagement:
- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
- Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:
- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines.
- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.

Qualifications and Certifications:
- Education: Bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: Minimum of 7 to 9 years of experience in data engineering, with at least 4 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
- Certifications: Google Cloud Professional Data Engineer certification preferred.

Key Skills:

Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine and Kubernetes Engine (GKE).
- Hands-on experience with CI/CD tools such as Jenkins, GitHub or Bitbucket.
- Proficiency in Docker, Kubernetes, and Terraform or Ansible for containerization, orchestration, and infrastructure as code (IaC).
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.

Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
- Proactive attitude towards innovation and learning.
- Ability to work independently and as part of a collaborative team.

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
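
Since the role centers on Cloud Composer-style orchestration, a minimal illustrative Airflow DAG is sketched below (not from the posting; the project, dataset, and schedule are hypothetical, and the operator assumes the Google provider package and Airflow 2.4+):

from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# One nightly ELT step that rebuilds a hypothetical reporting table in BigQuery.
with DAG(
    dag_id="nightly_campaign_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # "schedule" is the Airflow 2.4+ argument name
    catchup=False,
) as dag:
    rebuild_rollup = BigQueryInsertJobOperator(
        task_id="rebuild_rollup",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.marts.campaign_daily` AS
                    SELECT campaign_id, DATE(event_ts) AS day, COUNT(*) AS impressions
                    FROM `example-project.raw.events`
                    GROUP BY campaign_id, day
                """,
                "useLegacySql": False,
            }
        },
    )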

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Company Overview
Viraaj HR Solutions is a leading recruitment consultancy focused on connecting businesses with top talent across various industries. Our mission is to deliver exceptional HR solutions tailored to the unique needs of our clients, contributing to their success through strategic hiring practices. We value integrity, commitment, and excellence in our work culture, ensuring a supportive environment for both our clients and candidates.

Role Responsibilities
- Design and implement robust data pipelines using Python and Pyspark.
- Develop and maintain data models that support organizational analytics and reporting.
- Work closely with data scientists and analysts to understand data requirements and translate them into technical specifications.
- Integrate and maintain Snowflake for data warehousing solutions.
- Ensure data quality and integrity through effective ETL processes.
- Conduct data profiling and performance tuning to optimize system performance.
- Collaborate with cross-functional teams to define data architecture standards and best practices.
- Participate in the creation of documentation for data flows and data management best practices.
- Monitor data pipelines and troubleshoot issues as they arise.
- Implement security measures to protect sensitive data.
- Stay updated with the latest trends and technologies in data engineering.
- Assist in migrating existing data solutions to cloud-based infrastructures.
- Support continuous improvement initiatives around data management.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews and adhere to best practices in software development.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience in data engineering or a related role.
- Proficient in Python programming and the Pyspark framework.
- Experience with Snowflake or similar cloud data warehousing platforms.
- Strong understanding of ETL principles and data integration techniques.
- Solid understanding of database design and data modeling concepts.
- Excellent SQL skills for querying databases and data analysis.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Ability to work collaboratively in cross-functional teams.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Experience with version control systems (e.g., Git).
- Knowledge of Agile methodologies and project management.
- A commitment to continuous learning and professional development.
- Ability to work on multiple projects simultaneously and meet deadlines.

Skills: data architecture, etl, git, problem-solving skills, snowflake, python, data engineering, data warehousing, cloud computing, data integration, sql, data modeling, sql proficiency, pyspark, agile methodologies, cloud platforms (aws, azure, google cloud)

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role Objectives
- Facilitate and coordinate Agile Scrum processes within the company.
- Guide and coach the team on Agile Scrum principles and practices.
- Ensure the team works collaboratively and efficiently to deliver high-quality products.

Tasks
- Facilitate Scrum ceremonies such as daily stand-up meetings, sprint planning, sprint review, and sprint retrospective.
- Identify and troubleshoot obstacles that are hindering the team's progress.
- Promote continuous improvement through retrospectives and feedback sessions.
- Communicate project status, risks and issues to stakeholders.
- Track KPIs and help teams deliver high-quality products/solutions on time.
- Ensure that the team follows the Scrum framework and adheres to Agile principles.

Required Skills and Qualifications
- Bachelor's degree in Computer Science or a related field.
- Minimum of 5 years of experience as a Scrum Master.
- Strong knowledge of Agile Scrum principles and practices.
- Excellent communication and interpersonal skills.
- Strong problem-solving and analytical skills.
- Ability to lead and facilitate Scrum ceremonies.

Preferred Skills and Qualifications
- General conceptual understanding of programming and DB querying.
- Certification in Agile methodologies, such as Certified Scrum Master (CSM) or Professional Scrum Master (PSM).
- SAFe certification and Kanban certification are a plus.
- Ability to work collaboratively with cross-functional teams.
- Prior knowledge of Agile project management tools, such as Jira.
- Knowledge of Discrete Manufacturing scenarios and SAP DMC are a plus.

Posted 4 days ago

Apply

6.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Opportunity at Adintors Private Limited! We are hiring a MERN Stack Developer!!

Experience: 6+ years
Education: Information Technology Engineer / Computer Engineer
CTC: Dependent on individual skills and knowledge
Industry Type: Software Development

Desired Candidate Profile:
• Proficiency in JavaScript: ECMAScript (ES6+) and TypeScript, including experience with asynchronous programming, functional programming, and modern JavaScript features like modules and destructuring.
• Strong understanding of TypeScript for implementing static typing, interfaces, and improving code quality.
• React: Proficient in building user interfaces using React, including core concepts like components, state management, JSX syntax, and client-side routing with React Router. Experience with Redux for managing application state effectively.
• Node.js: Experienced in server-side development using Node.js, with a strong understanding of event-driven architecture, the Express framework for building APIs, and working with modules like fs and path.
• MongoDB: Skilled in using MongoDB, a NoSQL database, with expertise in data modeling, querying, and working with the MongoDB Node.js driver or an ORM like Mongoose.
• PostgreSQL: Proficient in PostgreSQL, a relational database, including schema design, writing efficient SQL queries, and integrating it with Node.js applications.
• RESTful APIs: Experienced in designing and building RESTful APIs with Node.js and Express, including handling HTTP methods (GET, POST, PUT, DELETE), status codes, request/response management, authentication, and validation.
• WebSockets: Experienced in implementing real-time communication using WebSockets for applications that require features like live chat, notifications, or real-time updates.
• Microservices: Knowledgeable in designing and developing microservices architectures, ensuring scalability, fault tolerance, and independent service management for complex applications.
• UI Development: Proficient in HTML, CSS, and CSS frameworks like Bootstrap, Material-UI, and Tailwind CSS, alongside React, to build responsive and visually appealing user interfaces.
• Version Control: Proficient in Git for version control, including commands, branching, merging, and best practices for collaborative development.
• Testing: Experienced with testing frameworks like Jest or Mocha for unit and integration tests, with a focus on test-driven development (TDD) for maintainable code.
• Deployment and DevOps: Knowledgeable in deploying applications using Docker, Kubernetes, and cloud platforms (AWS, Azure, Heroku).
• Problem-Solving and Debugging: Strong problem-solving and debugging skills using tools and techniques for efficient issue resolution.

Perks & Benefits:
• 5 days working.
• Opportunity to work in a multicultural team and project.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: Business Analyst (IT Domain – ERP & Complex Systems)
Experience: 5 to 7 Years
Location: Mumbai / Pune (Onsite)
Employment Type: Full-Time

Job Summary:
We are looking for a dynamic and detail-oriented IT Business Analyst with 5–7 years of experience to join our team onsite in Mumbai or Pune. The ideal candidate must have hands-on experience working with ERP or complex systems (excluding Insurance, Banking, Healthcare, and LMS domains) and demonstrate excellent skills in bridging business requirements with technical solutions in an Agile/Scrum environment.

Key Responsibilities:
- Understand business needs and translate them into detailed use cases, user stories, and tasks.
- Liaise between clients, development teams, UX, QA, and support teams to ensure seamless communication and solution delivery.
- Prepare BRDs, functional specifications, flow diagrams, and support documents for development and QA teams.
- Collaborate closely with the UX/UI team to review and improve user experience and interface designs.
- Conduct and manage sprint planning and daily scrums, and proactively identify blockers.
- Manage and control scope changes, and maintain consistent progress reporting to stakeholders.
- Conduct UAT and product demos, and provide post-go-live support for issue resolution.
- Monitor testing tasks, audit deliverables, and ensure quality and compliance throughout the SDLC.
- Provide responses to routine client queries and follow up for resolution.

Required Skills and Experience:
- 5+ years of experience as a Business Analyst in IT projects.
- Strong understanding of the SDLC and Agile/Scrum methodologies.
- Proficient in using tools like Jira for backlog and project tracking.
- Experience with web and Windows-based applications.
- Familiarity with SQL or similar querying tools for data analysis.
- Strong analytical, verbal, and written communication skills.

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Company:
Our client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services, manufacturing, technology, media, retail, and travel & hospitality. Its industry-specific solutions are designed to address complex business challenges by combining domain expertise with deep technical capabilities, backed by a global workforce of over 80,000 professionals and a presence in more than 50 countries.

Job Title: Python Developer
Locations: PAN India
Experience: 5-10 Years
Employment Type: Contract to Hire
Work Mode: Work From Office
Notice Period: Immediate to 15 Days

Job Description:
We are seeking a proactive and skilled Python Developer to support and enhance our existing codebase hosted on Azure Function App. The ideal candidate will have a strong foundation in Python, APIs, and basic data science principles, with the ability to understand and modify machine learning-driven computations. This role involves maintaining and evolving cloud-hosted Python applications and collaborating with cross-functional teams.

Key Responsibilities:
- Understand and maintain Python code hosted on Azure Function App.
- Modify and enhance existing code based on business requirements.
- Support basic data analysis and transformation logic using libraries like Pandas, NumPy, Scikit-learn, etc.
- Collaborate with DevOps and data engineering teams to ensure seamless deployment and integration.
- Perform basic statistical and mathematical analysis to support data-driven features.
- Troubleshoot and resolve issues in production and development environments (mainly in the Function App and Python code).
- Document code changes and maintain version control using Azure DevOps.

Mandatory Skills:
- Programming: Strong proficiency in Python.
- Data Science Libraries: Hands-on experience with Pandas, NumPy, Scikit-learn, etc.
- Mathematics & Statistics: Basic understanding of statistical concepts and mathematical computations.
- Version Control: Familiarity with Git and Azure DevOps.

Good to Have Skills:
- Cloud & DevOps: Azure DevOps pipelines, CI/CD practices.
- Data Platforms: Experience with Snowflake and SQL-based querying.
- Monitoring & Logging: Application Insights, Log Analytics, Kusto Query Language (KQL).
- Machine Learning: Exposure to ML workflows and model deployment (basic level).
- Security & Authentication: Understanding of OAuth, API keys, and secure coding practices.
- Cloud Functions: Experience with Azure Function App (or AWS Lambda).
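
As a sketch of the kind of cloud-hosted Python the posting describes, here is a minimal HTTP-triggered Azure Function using the Python v2 programming model (the route, payload shape, and computation are hypothetical assumptions, not taken from the posting):

import azure.functions as func
import pandas as pd

app = func.FunctionApp()

@app.route(route="mean", auth_level=func.AuthLevel.FUNCTION)
def mean(req: func.HttpRequest) -> func.HttpResponse:
    # Hypothetical payload: {"values": [1.0, 2.0, 3.0]}
    values = req.get_json().get("values", [])
    result = float(pd.Series(values).mean()) if values else 0.0
    return func.HttpResponse(f"{result}", mimetype="text/plain")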

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Backend Developer - Python
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India

Job Summary:
Join the team of one of our top customers as a Backend Developer and help drive scalable, high-performance solutions at the intersection of machine learning and data engineering. You'll collaborate with skilled professionals to design, implement, and maintain backend systems powering advanced AI/ML applications in a dynamic, onsite environment.

Key Responsibilities:
- Develop, test, and deploy robust backend components and microservices using Python and PySpark.
- Implement and optimize data pipelines leveraging Databricks and distributed computing frameworks.
- Design and maintain efficient databases with MySQL, ensuring data integrity and high availability.
- Integrate machine learning models into production-ready backend systems supporting AI-driven features.
- Collaborate closely with data scientists and engineers to deliver end-to-end solutions aligned with business goals.
- Monitor, troubleshoot, and enhance system performance, utilizing Redis for caching and improved scalability.
- Write clear and maintainable documentation, and communicate effectively with team members both verbally and in writing.

Required Skills and Qualifications:
- Proficiency in Python programming for backend development.
- Hands-on experience with Databricks and PySpark in a production environment.
- Strong understanding of MySQL database design, querying, and performance tuning.
- Practical background in machine learning concepts and deploying ML models.
- Experience with Redis for caching and state management.
- Excellent written and verbal communication skills, with keen attention to detail.
- Demonstrated ability to work effectively in an on-site, collaborative setting in Hyderabad.

Preferred Qualifications:
- Previous experience in high-growth AI/ML or data engineering projects.
- Familiarity with additional backend technologies or cloud platforms.
- Demonstrated leadership or mentorship in technical teams.
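
The Redis caching requirement can be pictured with a small illustrative pattern (connection details are hypothetical and the database read is stubbed; this is a sketch, not the team's actual design):

import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_profile_from_mysql(user_id: int) -> dict:
    # Stand-in for a real MySQL read; hypothetical.
    return {"id": user_id, "name": "example"}

def get_user_profile(user_id: int) -> dict:
    # Cache-aside: serve from Redis when possible, else read and populate.
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    profile = fetch_profile_from_mysql(user_id)
    cache.setex(key, 300, json.dumps(profile))  # expire after 5 minutes
    return profile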

Posted 4 days ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Us
At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team—comprising solution architects, data scientists, engineers, product managers, and designers—collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead—letting you focus on your core while we accelerate your growth.

Responsibilities & Qualifications

Data Architecture Design:
- Develop and implement scalable and efficient data architectures for batch and real-time data processing.
- Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.

ETL/ELT Pipelines:
- Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources.
- Ensure pipelines are highly performant, secure, and resilient to handle large volumes of structured and semi-structured data.

Data Quality and Governance:
- Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets.
- Implement data cataloging and lineage tracking for enterprise-wide data transparency.

Collaboration with Teams:
- Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting.
- Partner with software engineering teams to integrate data pipelines into applications and services.

Cloud Data Solutions:
- Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake.
- Optimize cloud infrastructure costs while maintaining high performance.

Data Automation and Workflow Orchestration:
- Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs.
- Develop monitoring systems to proactively detect and resolve pipeline failures.

Innovation and Leadership:
- Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency.
- Mentor junior engineers, fostering a culture of excellence and innovation.

Required Skills:
- Experience: 7+ years of overall experience in data engineering roles, with at least 2 years in a leadership capacity, and proven expertise in designing and deploying large-scale data systems and pipelines.
- Technical Skills: Proficiency in Python, Java, or Scala for data engineering tasks. Strong SQL skills for querying and optimizing large datasets. Experience with data processing frameworks like Apache Spark, Beam, or Flink. Hands-on experience with ETL tools like Apache NiFi, dbt, or Talend. Experience in pub/sub and stream processing using Kafka, Kinesis, or the like.
- Cloud Platforms: Expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services.
- Data Modeling: Strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas).
- Collaboration: Proven ability to work with cross-functional teams and translate business requirements into technical solutions.

Preferred Skills:
- Familiarity with data visualization tools like Tableau or Power BI to support reporting teams.
- Knowledge of MLOps pipelines and collaboration with data scientists.
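
As one concrete illustration of the pub/sub and stream-processing experience listed above, a minimal kafka-python consumer sketch (the topic, brokers, and event shape are hypothetical, and kafka-python is one of several client libraries that would fit):

import json
from kafka import KafkaConsumer

# Consume a hypothetical clickstream topic and act on purchase events.
consumer = KafkaConsumer(
    "clickstream",  # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if event.get("type") == "purchase":  # hypothetical event schema
        print(event["user_id"], event["amount"])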

Posted 5 days ago

Apply

0.0 - 2.0 years

0 Lacs

Kollam, Kerala

On-site


Amrita Vishwa Vidyapeetham, Amritapuri Campus is inviting applications from qualified candidates for the post of Data Analyst. For more details, contact paikrishnang@am.amrita.edu.

Job Title: Data Analyst
Location: Kollam, Kerala
Required Number: 1
Qualification: Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, or a related field.

Desirable Skills & Capacity
- 1–2 years of experience working as a data analyst or in a related role
- Strong proficiency in Python for data analysis (pandas, numpy, matplotlib/seaborn)
- Familiarity with SQL for querying databases
- Experience with data visualization tools such as Power BI or Tableau
- Good spoken and written English skills

Job Responsibilities
Data Cleaning & Preparation
- Collect, clean, and prepare structured and unstructured data from various sources
- Handle missing values, outliers, and data formatting issues
Data Analysis
- Perform exploratory data analysis (EDA) to extract insights and trends
- Work with large datasets to support research and project goals
- Apply statistical techniques for hypothesis testing and inference
Reporting & Visualization
- Build dashboards and reports to communicate findings effectively
- Present insights in a clear and compelling manner for internal and external stakeholders
Collaboration
- Work closely with project managers, developers, and education researchers
- Translate research questions into data-driven analyses and solutions
Code Quality & Documentation
- Write clean, reproducible scripts and notebooks
- Maintain proper documentation of data sources, workflows, and results
- Use Git for version control and collaborative development
Learning & Improvement
- Stay updated on current trends and tools in data science
- Continuously enhance skills through learning and experimentation

Job Category: Research
Last Date to Apply: August 2, 2025
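
For a concrete picture of the "Data Cleaning & Preparation" duties, a minimal pandas sketch (illustrative only; the file and column names are hypothetical):

import numpy as np
import pandas as pd

# Clean a hypothetical survey extract: coerce bad entries, impute missing
# values, and drop extreme outliers before analysis.
df = pd.read_csv("survey_responses.csv")
df["score"] = pd.to_numeric(df["score"], errors="coerce")  # bad entries become NaN
df["score"] = df["score"].fillna(df["score"].median())     # impute missing values
z = (df["score"] - df["score"].mean()) / df["score"].std()
df = df[np.abs(z) < 3]                                     # drop >3-sigma outliers
print(df.describe())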

Posted 5 days ago

Apply

0.0 - 3.0 years

0 Lacs

Gurugram, Haryana

On-site


Location: Gurugram, India

Position Summary
Futures First is a part of the Hertshten Group, its holding company, which has raised the benchmarks for excellence in the international derivatives industry. Futures First benefits from the significant experience of the Hertshten Group in derivatives markets across global financial exchanges. This is an exciting challenge and an excellent opportunity for bright, analytical, highly motivated professionals to join a vibrant and global organization. At Futures First, we are dedicated to empowering our team with cutting-edge technology, comprehensive training, dependable infrastructure, and ongoing learning opportunities—enabling everyone to produce high-caliber work while advancing both professionally and personally.

Job Profile
We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal candidate will have a strong background in data analysis and MIS reporting, with proficiency in Excel, VBA Macros, SQL, Python, and Power BI/Qlik Sense. This role involves transforming data into actionable insights to support business decisions.

Key Responsibilities:
Develop, maintain, and automate MIS reports and dashboards to support various business functions.
Utilize advanced Excel functions, including VBA Macros, for data analysis, reporting, and automation.
Write complex SQL queries to extract, manipulate, and analyze data from relational databases.
Employ Python for data cleaning, analysis, and visualization tasks.
Design and implement interactive dashboards and reports using Power BI/Qlik Sense to visualize key performance indicators and trends.
Collaborate with cross-functional teams to understand data requirements and deliver insights.
Ensure data accuracy and integrity across all reporting platforms.

Requirements
Education Qualifications: Bachelor's or Master's degree in any discipline
Work Experience: Minimum of 3 years of experience in data analysis or a similar role

Skill Set
Any certification in data analysis would be an added advantage
Good analytical, logical, and communication skills
Proficiency in Microsoft Excel, including advanced functions and VBA Macros
Strong knowledge of SQL and Python for data querying and manipulation
Hands-on experience with a self-service BI tool such as Power BI or Qlik Sense is good to have

Location: Gurgaon, Haryana
Experience: 3+ Years
Employment Type: Full-time

Posted 5 days ago

Apply

0.0 years

0 Lacs

Delhi, Delhi

Remote


Preferable Location(s): New Delhi, India | Mumbai, India | Chennai, India | Bengaluru, India | Pune, India | Hyderabad, India | Gurugram, India | Delhi, India
Work Type: Contract

About the Role:
We're looking for a motivated and curious Business Analytics Intern who's eager to build a career in data science or analytics. If you're someone who gets excited by dashboards, understands how large language models (LLMs) are transforming data workflows, and can wrangle data in Excel with confidence, this role is for you. This internship provides hands-on experience working with the analytics and operations team, supporting cross-functional coordination, and translating data into actionable insights. You'll be at the heart of how data drives decisions in a fast-paced, collaborative environment.

What You'll Do (Key Responsibilities):
  • Maintain and document metadata for dashboards and datasets to ensure traceability and clarity.
  • Track dashboards regularly to monitor performance metrics and flag unusual trends.
  • Collaborate with stakeholders across teams to collect data, understand needs, and share insights in a digestible format.
  • Support the analytics team with ad hoc data-related requests, ensuring timely delivery and accuracy.
  • Pull data and derive insights from existing systems and dashboards to support decision-making.
  • Help streamline data workflows by ensuring proper data hygiene and documentation.
  • Stay up to date with how LLMs and AI tools are used to enhance data operations.

What We're Looking For (Requirements):
  • A student or recent graduate with a strong interest in data science, analytics, or business intelligence.
  • Basic understanding of large language models (LLMs) and how they're applied in modern data tools (e.g., ChatGPT, Code Interpreter).
  • Proficiency in Excel (pivot tables, lookups, and basic formulas is a must).
  • SQL knowledge is a plus — you don't need to be advanced, but understanding the basics of querying data is valuable.
  • Strong attention to detail and ability to manage multiple tasks.

Working hours: 6 hours a day
Internship period: 6 months
Location: Remote

How to apply for this position?
Please fill out the form with the required details. If your profile is shortlisted, our team will reach out to you via email. If you don't find the emails in your inbox, please check your spam folder. Tip: Avoid using AI-generated responses. We want to hear from you!

About PriceLabs:
PriceLabs is a revenue management solution for the short-term rental and hospitality industry, founded in 2014 and headquartered in Chicago, IL. Our platform helps individual hosts and hospitality professionals optimize their pricing and revenue management, adapting to changing market trends and occupancy levels. With dynamic pricing, automation rules, and customizations, we manage pricing and minimum-stay restrictions for any portfolio size, with prices automatically uploaded to preferred channels. Every day, we price over 500,000 listings globally across 150+ countries, offering world-class tools like the Base Price Help and Minimum Stay Recommendation Engine.

In 2025, we scaled to:
500K+ properties
250+ person globally remote team
60K+ customers worldwide
36% diversity

Industry awards won:
SaasBoomi 2021
The Shortyz 2020
The Shortyz 2023
STRive Awards 2025

We continue to grow exponentially, backed by a strong team to take us to the next level.

Why join PriceLabs?
We are a remote-first organization and accept work from home as the norm.
Work with an industry-leading product that has thousands of customers worldwide, and our customers love the product (NPS in the 70s).
Work with a global team (15 countries and counting) of passionate individuals who value open communication, empowerment, and a shared focus on customer success.
We are a freemium product, so marketing leads the charge on customer acquisition.

PriceLabs is an equal-opportunity employer. We are committed to providing equal opportunity in all aspects of employment. We do not discriminate based on race, colour, religious creed, national origin, ancestry, sex, age, veteran status, marital status or physical challenges.

Posted 5 days ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site


Senior Data Engineer (Contract)
Location: Bengaluru, Karnataka, India

About the Role:
We're looking for an experienced Senior Data Engineer (6-8 years) to join our data team. You'll be key in building and maintaining our data systems on AWS. You'll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. You'll own our data pipelines end to end, making sure the data is good, reliable, and fast.

What You'll Do:
Design and build efficient data pipelines using Spark / PySpark / Scala.
Manage complex data processes with Airflow, creating and fixing any issues with the workflows (DAGs).
Clean, transform, and prepare data for analysis.
Use Python for data tasks, automation, and building tools.
Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure.
Collaborate closely with the Analytics team to understand what data they need and provide solutions.
Help develop and maintain our Node.js backend, written in TypeScript, for data services.
Use YAML to manage the settings for our data tools.
Set up and manage automated deployment processes (CI/CD) using GitHub Actions.
Monitor and fix problems in our data pipelines to keep them running smoothly.
Implement checks to ensure our data is accurate and consistent.
Help design and build data warehouses and data lakes.
Use SQL extensively to query and work with data in different systems.
Work with streaming data using technologies like Kafka for real-time data processing.
Stay updated on the latest data engineering technologies.
Guide and mentor junior data engineers.
Help create data management rules and procedures.

What You'll Need:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
6-8 years of experience as a Data Engineer.
Strong skills in Spark and Scala for handling large amounts of data.
Good experience with Airflow for managing data workflows and understanding DAGs.
Solid understanding of how to transform and prepare data.
Strong programming skills in Python for data tasks and automation.
Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena).
Experience building data solutions for Analytics teams.
Familiarity with Node.js for backend development.
Experience with TypeScript for backend development is a plus.
Experience using YAML for configuration management.
Hands-on experience with GitHub Actions for automated deployment (CI/CD).
Good understanding of data warehousing concepts.
Strong database skills (OLAP/OLTP).
Excellent command of SQL for data querying and manipulation.
Experience with stream processing using Kafka or similar technologies.
Excellent problem-solving, analytical, and communication skills.
Ability to work well independently and as part of a team.

Bonus Points:
Familiarity with data lake technologies (e.g., Delta Lake, Apache Iceberg).
Experience with other stream processing technologies (e.g., Flink, Kinesis).
Knowledge of data management, data quality, statistics, and data governance frameworks.
Experience with infrastructure-as-code tools (e.g., Terraform).
Familiarity with container technologies (e.g., Docker, Kubernetes).
Experience with monitoring and logging tools (e.g., Prometheus, Grafana).

Posted 5 days ago

Apply

0.0 - 2.0 years

0 Lacs

Mohali district, Punjab

On-site


Key Responsibilities:
Application Development: Design and develop enterprise applications using the Joget platform, ensuring robust, scalable, and user-friendly solutions.
Customization: Customize Joget forms, workflows, plugins, and UI components to meet business requirements.
Process Automation: Analyze and implement business process automation workflows, enhancing operational efficiency and reducing manual effort.
Integration: Integrate Joget applications with third-party systems, APIs, and enterprise tools to enable seamless data exchange.
Performance Optimization: Optimize Joget applications for performance, scalability, and security.
Collaboration: Work closely with business analysts, project managers, and other stakeholders to gather and refine requirements.
Testing & Debugging: Conduct thorough testing, troubleshooting, and debugging to ensure application stability and quality.
Documentation: Maintain comprehensive technical documentation for all development activities.
Mentorship: Provide guidance and mentorship to junior developers as needed.

Core Technical Skills:
Joget Platform Expertise: Proficiency in the Joget Workflow platform for designing and developing forms, workflows, data lists, and user views. Experience in creating and managing custom Joget plugins. Expertise in workflow automation and process configuration. Knowledge of Joget's built-in components, templates, and modular features.
Programming and Development: Strong knowledge of Java for back-end customizations and plugin development. Proficiency in JavaScript, HTML, and CSS for front-end customizations. Experience in SQL for database querying and management. Familiarity with XML and JSON for data handling.
Integration and APIs: Hands-on experience integrating Joget applications with third-party systems using REST and SOAP APIs. Knowledge of OAuth, JWT, and other authentication mechanisms for secure integrations. Experience in handling data exchange between Joget and external systems.
Database Management: Proficiency in relational databases such as MySQL, PostgreSQL, or Oracle. Experience in writing and optimizing complex SQL queries. Knowledge of database performance tuning and troubleshooting.
Deployment and Infrastructure: Familiarity with cloud platforms like AWS, Azure, or Google Cloud for Joget deployment. Experience with Docker or other containerization tools for application hosting. Joget deployment on multiple operating systems and databases. Knowledge of CI/CD pipelines and deployment automation using tools like Jenkins or GitHub Actions.
Debugging and Performance Optimization: Strong skills in troubleshooting Joget applications to identify and resolve issues. Experience in performance optimization of Joget workflows and UI components. Familiarity with Joget's logging and monitoring tools for system analysis.
Security: Understanding of application security best practices, including data encryption, role-based access control, and user authentication. Familiarity with secure coding practices and compliance standards.

Job Type: Full-time
Pay: ₹500,000.00 - ₹1,000,000.00 per year
Benefits: Flexible schedule, health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Mohali district, Punjab: Reliably commute or plan to relocate before starting work (Required)
Experience: Joget: 2 years (Required)
Work Location: In person

Posted 5 days ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
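
To make the day-to-day work concrete, a typical querying task looks like the sketch below: aggregating a table to answer a business question. The table and column names (sales, region, amount) are hypothetical, chosen only for illustration.

    -- Total and average sale amount per region, largest total first.
    SELECT region,
           COUNT(*)    AS num_sales,
           SUM(amount) AS total_amount,
           AVG(amount) AS avg_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC;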

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies with experience and skill level. Candidates in entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques (see the sketch after this list)
  • Data warehousing concepts
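
As a rough illustration of what SQL optimization involves in practice, the sketch below inspects a query plan and adds an index. The table and column names (orders, customer_id) are hypothetical, and the EXPLAIN output format varies by database engine.

    -- Step 1: inspect the plan of the slow query (this EXPLAIN syntax
    -- works in MySQL and PostgreSQL; other engines differ).
    EXPLAIN SELECT order_id, order_date
    FROM orders
    WHERE customer_id = 42;

    -- Step 2: if the plan shows a full table scan, an index on the
    -- filter column usually helps.
    CREATE INDEX idx_orders_customer_id ON orders (customer_id);

    -- Step 3: select only the columns you need instead of SELECT *.
    SELECT order_id, order_date
    FROM orders
    WHERE customer_id = 42;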

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table. (advanced; a sample answer appears after this list)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
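
For the second-highest-salary question above, one widely accepted answer uses a subquery, which doubles as an example for the subquery question. This is a sketch assuming a hypothetical table employees(name, salary):

    -- Subquery approach: the highest salary strictly below the overall maximum.
    SELECT MAX(salary) AS second_highest_salary
    FROM employees
    WHERE salary < (SELECT MAX(salary) FROM employees);

    -- Window-function alternative; DENSE_RANK() treats duplicate salaries
    -- as a single rank, so ties are handled cleanly.
    SELECT DISTINCT salary AS second_highest_salary
    FROM (
        SELECT salary, DENSE_RANK() OVER (ORDER BY salary DESC) AS rnk
        FROM employees
    ) ranked
    WHERE rnk = 2;

Both forms return the second-highest distinct salary; the window-function version generalizes to the Nth-highest salary by changing the rank filter.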

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


