
1957 SQL Queries Jobs - Page 49

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are looking for future Insighters who can demonstrate teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude to join our growing team. We are looking for a Senior C++ Software Engineer for our software development team. Apart from writing high quality code, you will be responsible for key deliverables during your team's software development lifecycle, including software design, code reviews, and comprehensive automated tests. The successful candidate will have a passion for continuous improvement and must have excellent written and verbal communication skills. You will also be writing Java code for connectivity solutions. Responsibilities: Develop, maintain, and improve software. Manage individual project priorities, deadlines, and deliverables. Contribute improvements to our continuous delivery infrastructure. Participate in recruiting and mentoring of top engineering talent. Drive roadmap execution and incorporate customer feedback into the product. Develop, collaborate on, and execute Agile development and product scenarios in order to release high quality software on a regular cadence. Proactively assist your team to find and solve development and production software issues through effective collaboration. Desirable: Driving sprint planning and breakdown of tasks. Contributing to performance testing and various continuous improvement efforts. Strong OS, data structure, and algorithms fundamentals. Strong hold on object-oriented programming concepts and their implementation through C++. Must have know-how of Java programming. Should be able to write optimized and reusable code. Experience developing database technologies is extremely valuable. Experience with low-level C and networking is desired, but not necessary. Experience programming database APIs such as ODBC and using database tools is strongly preferred. Experience using BI tools such as Tableau, Microsoft Power BI, and Lumira is desirable but not mandatory.
Experience using any memory and performance profiling tools is required. Experience working with Agile methodology, i.e., participating in all team activities, including sprint retrospectives, thoughtful code reviews, knowledge sharing sessions, status reporting for project stakeholders, etc. Personal Skills: Strong written and verbal communication skills to collaborate with developers, testers, product owners, scrum masters, directors, and executives. Experience taking part in the decision-making process in application code design, solution development, and code review. Strong work ethic and emotional intelligence, including being on time for meetings. Ability to work in a fast-changing environment and embrace change while still following a greater plan. Qualifications: Bachelor's degree with a minimum of 5 years of related experience, or Master's degree with a minimum of 5 years of related experience, or equivalent work experience. Experience in CI/CD pipelines. Programming experience including but not limited to C++ technologies. A good understanding of database concepts (e.g. working with relational data sources such as MySQL, SQL Server, Oracle, etc.) and SQL queries. Experience with such products and tools as Bamboo (Atlassian), Visual Studio Online, Visual Studio, and/or Azure is helpful. Ability to lead and mentor others. Excellent written and verbal communication skills. Development experience on a range of operating system platforms such as Windows (mandatory), Linux, and OS X desirable. Understanding of network interactions: authentication and authorization flows, standards and practices (e.g. OAuth, JWT). Additional Information: ** At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located. ** insightsoftware: Hear From Our Team - InsightSoftware (wistia.com). Background checks are required for employment with insightsoftware, where permitted by country, state/province.

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Description: Analyze performance bottlenecks in Oracle EBS applications, including concurrent programs, forms, OAF pages, and database-level issues. Conduct load testing, stress testing, and performance benchmarking using tools such as LoadRunner, JMeter, or Oracle Application Testing Suite (OATS). Collaborate with DBA, infrastructure, and application teams to monitor and tune database and application server performance. Identify slow-running SQL queries, perform query tuning, and recommend indexing strategies. Optimize performance of customizations, interfaces (e.g., APIs, concurrent programs), and integrations. Support go-live readiness by executing performance validation and scalability assessments. Define performance SLAs and ensure compliance through test planning and execution. Develop monitoring dashboards and KPIs to proactively detect performance issues in production. Document test strategies, execution reports, tuning recommendations, and performance baselines. Additional Skills:
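Query tuning of the sort this role describes usually starts with inspecting the execution plan before and after adding an index. A minimal sketch using SQLite's EXPLAIN QUERY PLAN (the table, column, and index names are invented for illustration; production tuning would use the target database's own plan tooling):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # Return the plan detail strings SQLite would use for this statement.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(query))  # full table scan: no usable index yet

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))  # now an index search via idx_orders_customer
```

The same before/after comparison is the core loop of the indexing recommendations mentioned above: measure the plan, add the candidate index, and confirm the optimizer actually uses it.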

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Fi Money is a new-age money management app designed to simplify your financial life. With Fi, you can save, pay, invest, or borrow, all in one place. You can track and analyze your expenses across Fi and all your other bank accounts. You can also apply for a credit card, access instant loans, and grow your wealth and more with our range of investment options. We're looking for a sharp and curious Business Analyst to join our team. In this role, you'll dive deep into data, uncover insights, and drive decision-making across key business functions. You'll work closely with cross-functional teams to solve high-impact problems with a mix of data analysis, process thinking, and strong execution. We're a team that embraces automation and efficiency, so expect to work with AI tools regularly to optimize workflows and reduce grunt work. What You'll Do: -Write clean, efficient SQL queries to extract and analyze data from large datasets. -Build dashboards, reports, and metrics to track business health and performance. -Partner with Product, Growth, Ops, and Finance to support data-backed decisions. -Automate recurring analysis and reporting using AI and productivity tools. -Drive root cause analysis and opportunity sizing for key business problems. -Present insights and recommendations clearly to both technical and non-technical teams. Must-Have Skills: -1-3 years of experience in a Business/Data Analyst role. -Proficiency in SQL: joins, CTEs, window functions, query optimization. -Strong analytical thinking and comfort working with ambiguity. -Solid understanding of business metrics and how to connect data to decisions. -Comfortable using or learning AI-powered tools to automate and scale your work. This is a 5-day work-from-office role and we are in Brookfield, Bengaluru.
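The SQL skills this listing names (CTEs and window functions in particular) can be shown in a small self-contained example; the table and column names here are invented purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [(1, 50.0), (1, 150.0), (2, 20.0), (2, 30.0), (2, 10.0)])

# A CTE aggregates spend per user; a window function then ranks users by spend.
rows = conn.execute("""
    WITH per_user AS (
        SELECT user_id, SUM(amount) AS total_spend, COUNT(*) AS n_txns
        FROM txns
        GROUP BY user_id
    )
    SELECT user_id, total_spend,
           RANK() OVER (ORDER BY total_spend DESC) AS spend_rank
    FROM per_user
    ORDER BY spend_rank
""").fetchall()
print(rows)  # [(1, 200.0, 1), (2, 60.0, 2)]
```

The CTE keeps the aggregation readable and reusable, while the window function adds the ranking without a self-join, which is exactly the kind of pattern interviewers probe with "joins, CTEs, window functions".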

Posted 1 month ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Senior/Azure Data Engineer. Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai. Overall 8+ years of experience, with at least 5+ years of relevant hands-on development experience in an Azure Data Engineering role. Proficient in Azure technologies like ADB (Azure Databricks), ADF, SQL (capability of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog. Hands-on in Python, PySpark, or Spark SQL. Hands-on in Azure Analytics and DevOps. Taking part in Proof of Concepts (POCs) and pilot solution preparation. Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows. Experience in business process mapping of data and analytics solutions. At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Posted 1 month ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Gurugram

Work from Office

We are seeking skilled Software Engineers to join our team for a high-impact Periodic Review Project. This role involves enhancing and supporting the development of Microsoft Dynamics 365 solutions, with a focus on integrations, customizations, and data processing. Key Responsibilities: Design, develop, and maintain customizations in Dynamics 365 CRM including plugins, custom workflow activities, and JavaScript. Implement and troubleshoot API integrations and Azure Functions. Develop and manage SSIS packages using KingswaySoft for data migration and integration tasks. Write complex SQL queries to resolve system data issues. Integrate Dynamics CRM with legacy and modern systems via batch jobs or APIs. Ensure adherence to best practices and coding standards across the development lifecycle. Analyze, support, and maintain systems/code written by other teams. Collaborate with stakeholders for requirement gathering, testing, and deployment. Required Skills: Strong hands-on experience with Microsoft Dynamics 365 CRM (configuration, customization, plugin/workflow development). Proficient in JavaScript, C#, TypeScript, HTML, CSS, and .NET Framework. Solid experience with KingswaySoft SSIS for CRM data integration. Experience with Azure Functions and Dynamics API integrations. Excellent troubleshooting and problem-solving skills. Ability to work independently and collaboratively in a hybrid work model. Preferred: Immediate joiners. Experience with integration of Dynamics CRM with legacy and modern systems. Familiarity with DevOps pipelines and source control tools (e.g., Azure DevOps, Git).

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Mumbai, Pune, Chennai

Work from Office

Job Category: IT. Job Type: Full Time. Job Location: Bangalore, Chennai, Mumbai, Pune. Experience: 5+ years. We need Azure Databricks with QA. Must have: hands-on SQL, including writing SQL queries from a test-script implementation perspective; test scenarios and QA concepts with respect to the Azure stack; Azure Databricks, Azure Data Factory, and Spark SQL knowledge. Years: 4-5 years of testing experience in Azure Databricks. Strong experience in SQL along with performing Azure Databricks quality assurance. Understand complex data systems by working closely with engineering and product teams. Develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, Hadoop Data Lake, or other data storage locations. Kind note: Please apply or share your resume only if it matches the above criteria.
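QA of SQL-based pipelines, as described in this listing, typically turns each test scenario into an assertion query that should return zero offending rows. A minimal sketch of that pattern (the schema and check names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, created_at TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "a@example.com", "2024-01-01"),
    (2, "b@example.com", "2024-01-02"),
    (3, None,            "2024-01-03"),
])

# Each QA scenario is a query whose result set should be empty.
checks = {
    "no duplicate emails":
        "SELECT email FROM customers WHERE email IS NOT NULL "
        "GROUP BY email HAVING COUNT(*) > 1",
    "no null emails":
        "SELECT id FROM customers WHERE email IS NULL",
}

# Collect only the checks that actually returned offending rows.
failures = {name: conn.execute(sql).fetchall()
            for name, sql in checks.items()
            if conn.execute(sql).fetchall()}
print(failures)  # {'no null emails': [(3,)]}
```

The same structure scales up: the check catalog becomes the test suite, and a non-empty result set is a failed scenario with the offending keys attached for triage.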

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office

Ability to do exhaustive and detailed testing for web/mobile apps. Independent communicator with the ability to handle client calls. Identify what is broken (i.e., find software bugs). Knowledge of different testing methodologies. BFSI domain preferred. Required candidate profile: Eager to learn, team player, good communication, logical and investigative skills. Write/execute test plans and identify software bugs. Ability to do ad-hoc/exploratory testing of mobile/web applications.

Posted 1 month ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Establish and implement best practices for DBT workflows, ensuring efficiency, reliability, and maintainability. Collaborate with data analysts, engineers, and business teams to align data transformations with business needs. Monitor and troubleshoot data pipelines to ensure accuracy and performance. Work with Azure-based cloud technologies to support data storage, transformation, and processing. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong MS SQL and Azure Databricks experience. Implement and manage data models in DBT, with data transformation and alignment with business requirements. Ingest raw, unstructured data into structured datasets in a cloud object store. Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting. Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance. Preferred technical and professional experience: Establish best DBT processes to improve performance, scalability, and reliability. Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks. Proven interpersonal skills while contributing to team effort by accomplishing related results as required.
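A DBT model of the kind this role describes is essentially a versioned SELECT that turns raw records into a typed, filtered staging table. The same raw-to-structured step can be sketched in plain SQL (the table and column names are hypothetical, and DBT itself would manage this as a model file rather than a CREATE TABLE):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Raw landing table: everything arrives as text, including bad values.
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT, status TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", [
    ("A1", "12.50", "complete"),
    ("A2", "7.00",  "COMPLETE"),
    ("A3", "oops",  "cancelled"),
])

# The "model": cast, normalize, and filter raw rows into a typed staging table.
conn.execute("""
    CREATE TABLE stg_orders AS
    SELECT order_id,
           CAST(amount AS REAL) AS amount,
           LOWER(status)        AS status
    FROM raw_orders
    WHERE LOWER(status) = 'complete'
      AND CAST(amount AS REAL) > 0
""")
print(conn.execute("SELECT * FROM stg_orders ORDER BY order_id").fetchall())
```

In DBT the SELECT body would live in a model file with tests (e.g. not-null, accepted-values) attached in schema configuration, but the transformation logic is the same.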

Posted 1 month ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Pune

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 3+ years of relevant experience. Strong proficiency in Tableau Desktop and Tableau Server. Experience with SQL and data manipulation. Strong experience in data manipulation and SQL queries. Leads the team to adopt the right tools for various migration and modernization methods. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Department: Information Technology. Location: APAC-India-IT Delivery Center Hyderabad. Essential Duties and Responsibilities: Develop and maintain data pipelines using Azure native services like ADLS Gen 2, Azure Data Factory, Synapse, Spark, Python, Databricks, and Starburst. Develop datasets required for business analytics in Power BI and Azure Data Warehouse. Ensure software development principles, standards, and best practices are followed. Maintain existing applications and provide operational support. Review and analyze user requirements and write system specifications. Ensure quality design, delivery, and adherence to corporate standards. Participate in daily stand-ups, reviews, design sessions, and architectural discussions. Other duties may be assigned. Required Qualifications and Skills: 7+ years of experience in solution delivery for data analytics to get insights for various departments in an organization. 5+ years of experience in delivering solutions using the Microsoft Azure Platform or AWS services with an emphasis on data solutions and services. Extensive knowledge of writing SQL queries and experience in performance tuning queries. Experience developing software architectures and key software components. Proficient in one or more of the following programming languages: C#, Java, Python, Scala, and related open-source frameworks. Understanding of data services including Azure SQL Database, Data Lake, Databricks, Data Factory, Synapse, Kafka, and streaming. Data modeling experience on Azure DW/AWS; understanding of dimensional models, star schemas, and data vaults. Quick learner who is passionate about new technologies. Strong sense of ownership, customer obsession, and drive with a can-do attitude. Team player with great communication skills (listening, speaking, reading, and writing) in English. BS in Computer Science, Computer Engineering, or other quantitative fields such as Statistics, Mathematics, Physics, or Engineering. Applicant Privacy Policy: Review our Applicant Privacy Policy for additional information. Equal Opportunity Statement: Align Technology is an equal opportunity employer. We are committed to providing equal employment opportunities in all our practices, without regard to race, color, religion, sex, national origin, ancestry, marital status, protected veteran status, age, disability, sexual orientation, gender identity or expression, or any other legally protected category. Applicants must be legally authorized to work in the country for which they are applying, and employment eligibility will be verified as a condition of hire.

Posted 1 month ago

Apply

3.0 - 7.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Who we are. About Stripe: Stripe is a financial infrastructure platform for businesses. Millions of companies, from the world's largest enterprises to the most ambitious startups, use Stripe to accept payments, grow their revenue, and accelerate new business opportunities. Our mission is to increase the GDP of the internet, and we have a staggering amount of work ahead. That means you have an unprecedented opportunity to put the global economy within everyone's reach while doing the most important work of your career. About the team: Did you know that only around 4% of the world's GDP comes from internet commerce? At Stripe, we believe that this represents a future with almost limitless potential for innovation, creativity, and global prosperity. While the promise of a global online economy is palpable, it doesn't come without significant risk. Each day, bad actors disrupt the trust and safety of the internet and increase the barrier of entry for online businesses. The Risk team ensures that our platform remains safe and that prohibited parties are not allowed to utilize our services. What you'll do: The Risk Ops Quality team is looking for an experienced Quality analyst to conduct quality reviews and oversee the day-to-day quality assurance of the Risk operational team. The right person for this role has an eye for detail, is obsessed with quality processes and outcomes, and values efficiency. The Quality Insights analyst will be joining a highly collaborative global team that supports the broader global Risk department at Stripe. In this position, you will be responsible for conducting quality audits and providing oversight and maintenance of the process to measure and improve the quality of our Risk vertical. A successful Quality analyst will have a concrete understanding of the online Risk landscape in the payments/FinTech space, a forensic ability to detect root causes of human error, and an innovative mindset to recommend process improvements that eliminate errors. The right candidate for this role will have experience in financial crimes compliance, preferably within the fintech or e-commerce space. Responsibilities: Conduct quality evaluations of Risk (Fraud Operations) review cases. Oversee and refine quality evaluation plans in accordance with applicable policies and procedures for the Risk program at Stripe. Be responsible for drafting and circulating detailed monthly QA reports outlining the QA scores and trends observed, clearly articulating any issues noted and remedial actions to be taken. Work cross-functionally with senior Risk stakeholders to formulate remediation action items and ensure timely compliance with the associated action plans. Draft regular reports for management, highlighting the results and recommendations for mitigations. Maintain clear and organized documentation in relation to how Risk program (Fraud Ops) quality assurance is structured, including sampling logic and coverage of the QA oversight. Maintain a critical eye on the effectiveness of the Fraud Operations team, and drive experiments on and improvements to the quality oversight program in order to increase confidence in the quality evaluations. Who you are: We're looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement. Minimum requirements: Minimum 4 years of experience across quality assurance and/or controls testing within a Risk environment. Quality metrics development experience. Ability to manage multiple projects simultaneously with minimal supervision. Excellent analytical and communication skills. Experience working in a fast-moving operational environment and navigating ambiguity. Preferred qualifications: Knowledge of payments and the fintech industry. Familiarity with SQL and the ability to use template or premade SQL queries as part of investigations. In-office expectations: Office-assigned Stripes in most of our locations are currently expected to spend at least 50% of the time in a given month in their local office or with users. This expectation may vary depending on role, team, and location. For example, Stripes in Stripe Delivery Center roles in Mexico City, Mexico and Bengaluru, India work 100% from the office. Also, some teams have greater in-office attendance requirements, to appropriately support our users and workflows, which the hiring manager will discuss. This approach helps strike a balance between bringing people together for in-person collaboration and learning from each other, while supporting flexibility when possible. Pay and benefits: Stripe does not yet include pay ranges in job postings in every country. Stripe strongly values pay transparency and is working toward pay transparency globally.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Pune

Work from Office

What You'll Do: We are seeking a highly skilled and motivated Senior Data Engineer to join our Data Operations team. The ideal candidate will have deep expertise in Python, Snowflake SQL, modern ETL tools, and business intelligence platforms such as Power BI. This role also requires experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will be responsible for building and maintaining data pipelines, developing robust data models, and ensuring seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed. What Your Responsibilities Will Be: Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python. Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs. Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors. Develop and support dashboards and reports using Power BI and other reporting tools. Work closely with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions. Ensure data quality, accuracy, and consistency across systems and datasets. Write clean, well-documented, and testable code with a focus on performance and reliability. Participate in peer code reviews and contribute to best practices in data engineering. Be available for meetings and collaboration in US time zones as required. What You'll Need To Be Successful: 5+ years' experience in the data engineering field, with deep SQL knowledge. Strong experience in Snowflake SQL, Python, AWS services, Power BI, and ETL tools (DBT, Airflow) is a must. Proficiency in Python for data transformation and scripting. Proficiency in writing complex SQL queries and stored procedures. Strong experience in data warehouse, data modeling, and ETL design concepts. Should have integrated SaaS systems like Salesforce, Zuora, and NetSuite along with relational databases, REST APIs, FTP/SFTP, etc. Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.). Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders. Flexibility to work during US business hours as required for team meetings and collaboration. How We'll Take Care Of You: Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses. Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance. Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. What You Need To Know About Avalara: We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too. We're An Equal Opportunity Employer: Supporting diversity and inclusion is a cornerstone of our company; we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.

Posted 1 month ago

Apply

2.0 - 5.0 years

1 - 5 Lacs

Pune

Work from Office

Job Description: We are seeking a skilled individual to join our team as an Alfresco Developer & Support Specialist. In this role, you will be responsible for developing, implementing, and supporting document archive and storage solutions using Alfresco to meet the organization's document capture and data extraction needs. Responsibilities: Develop, customize, and configure document archive and storage solutions using Alfresco according to business requirements. Design and implement document storage and ingestion processes, including document classification, indexing, and ingestion. Collaborate with stakeholders to understand their needs and provide technical expertise in designing efficient and effective solutions. Provide ongoing support and maintenance for existing Alfresco sites and integration interfaces. Troubleshoot and resolve issues related to document storage, ingestion, and integration with other systems. Conduct testing and quality assurance activities to ensure the reliability and performance of Alfresco solutions. Keep abreast of industry trends and best practices related to document archive and storage technologies such as the global solution ADSR. Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience in developing and supporting document archive and storage solutions. Strong understanding of document archive and storage concepts and technologies. Proficiency in programming languages such as C#, VB.NET, or Visual Basic. Familiarity with databases and SQL queries. Excellent problem-solving and troubleshooting skills. Ability to work independently and collaboratively in a team environment. Effective communication skills with the ability to interact with both technical and non-technical stakeholders. Preferred Qualifications: Alfresco Content Services Engineer and/or Administrator certifications. Experience in the financial services domain with good knowledge of the insurance business. Knowledge of enterprise content management systems such as SharePoint or ADSR. Working experience with GitHub and CI/CD automation tools such as Jenkins/Ansible. Working experience with web services integration and API development. Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams, and challenges, is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow, and to shape a better future for our customers and the world around us. We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love, or what you believe in. We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability, or sexual orientation. Join us. Let's care for tomorrow.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Gurugram

Work from Office

Design, construct, and maintain scalable data management systems using Azure Databricks, ensuring they meet end-user expectations. Supervise the upkeep of existing data infrastructure workflows to ensure continuous service delivery. Create data processing pipelines utilizing Databricks Notebooks, Spark SQL, Python and other Databricks tools. Oversee and lead the module through planning, estimation, implementation, monitoring and tracking. Desired Skills and Experience Over 8 + years of experience in data engineering, with expertise in Azure Data Bricks, MSSQL, LakeFlow, Python and supporting Azure technology. Design, build, test, and maintain highly scalable data management systems using Azure Databricks. Create data processing pipelines utilizing Databricks Notebooks, Spark SQL. Integrate Azure Databricks with other Azure services like Azure Data Lake Storage, Azure SQL Data Warehouse. Design and implement robust ETL pipelines using ADF and Databricks, ensuring data quality and integrity. Collaborate with data architects to implement effective data models and schemas within the Databricks environment. Develop and optimize PySpark/Python code for data processing tasks. Assist stakeholders with data-related technical issues and support their data infrastructure needs. Develop and maintain documentation for data pipeline architecture, development processes, and data governance. Data Warehousing: In-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms. Extremely strong organizational and analytical skills with strong attention to detail Strong track record of excellent results delivered to internal and external clients. Excellent problem-solving skills, with ability to work independently or as part of team. Strong communication and interpersonal skills, with ability to effectively engage with both technical and non-technical stakeholders. 
Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts. Key Responsibilities Interpret business requirements, either gathered or acquired. Work with internal resources as well as application vendors. Design, develop, and maintain Databricks solutions and relevant data quality rules. Troubleshoot and resolve data-related issues. Configure and create data models and data quality rules to meet the needs of customers. Hands-on experience handling multiple database platforms, such as Microsoft SQL Server and Oracle. Review and analyze data from multiple internal and external sources. Analyze existing PySpark/Python code and identify areas for optimization. Write new optimized SQL queries or Python scripts to improve performance and reduce run time. Identify opportunities for efficiencies and innovative approaches to completing the scope of work. Write clean, efficient, and well-documented code that adheres to best practices and Council IT coding standards. Maintain and operate existing custom code processes. Participate in team problem-solving efforts and offer ideas to solve client issues. Query-writing skills with the ability to understand and implement changes to SQL functions and stored procedures. Effectively communicate with business and technology partners, peers, and stakeholders. Ability to deliver results under demanding timelines on real-world business problems. Ability to work independently and multi-task effectively. Configure system settings and options and execute unit/integration testing. Develop end-user release notes and training materials, and deliver training to a broad user base. Identify and communicate areas for improvement. Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, show a natural aptitude for developing good internal working relationships, and maintain a flexible work ethic.
Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)
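By way of illustration only, the kind of configurable data-quality rule this posting describes might be sketched as below; the schema, rule names, and sample rows are hypothetical, not taken from any real pipeline:

```python
# Hypothetical data-quality rule check of the kind the posting describes.
# Rules are plain predicates applied to each row; rows failing any rule
# are quarantined rather than dropped silently.

def apply_quality_rules(rows, rules):
    """Partition rows into (passed, failed) against all rules."""
    passed, failed = [], []
    for row in rows:
        if all(rule(row) for rule in rules):
            passed.append(row)
        else:
            failed.append(row)
    return passed, failed

# Example rules: non-null customer_id, non-negative amount.
rules = [
    lambda r: r.get("customer_id") is not None,
    lambda r: r.get("amount", 0) >= 0,
]

rows = [
    {"customer_id": 1, "amount": 100},
    {"customer_id": None, "amount": 50},
    {"customer_id": 2, "amount": -5},
]

ok, bad = apply_quality_rules(rows, rules)
```

In a Databricks pipeline the same idea would typically be expressed as Spark column expressions or DBT tests rather than Python lambdas; the partition-and-quarantine shape is the common element.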

Posted 1 month ago

Apply

6.0 - 8.0 years

12 - 17 Lacs

Gurugram

Work from Office

As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures. Desired Skills and Experience Essential skills: A bachelor's degree in computer science, engineering, mathematics, or statistics. 6-8 years of experience in a Data Engineering role, with a proven track record of delivering insightful, value-add dashboards. Experience writing advanced SQL queries and Python, and a deep understanding of relational databases. Experience working within an Azure environment. Experience with Tableau; Holland Mountain ATLAS is a plus. Experience with master data management and data governance is a plus. Ability to prioritize multiple projects simultaneously, problem-solve, and think outside the box. Key Responsibilities Develop, test, and release data packages for Tableau dashboards to support all business functions, including investments, investor relations, marketing, and operations. Support ad hoc requests, including the ability to write queries and extract data from a data warehouse. Assist with the management and maintenance of an Azure environment. Maintain a data dictionary, which includes documentation of database structures, ETL processes, and reporting dependencies. Key Metrics Python, SQL, Data Engineering, Azure, and ATLAS. Behavioral Competencies Good communication (verbal and written). Experience managing client stakeholders.

Posted 1 month ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as the primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (Generative) AI solutions. Your Role: Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies. Write optimized SQL queries for data extraction, transformation, and loading. Utilize Python for advanced data processing, automation tasks, and system integration. Act as an advisor, bringing in-depth knowledge of Snowflake architecture, features, and best practices. Develop and maintain complex data pipelines and ETL processes in Snowflake. Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions. Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions. Ensure data quality, integrity, and compliance throughout the data lifecycle. Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements. Document data models, processes, and workflows clearly for future reference and knowledge sharing. Build data tests, unit tests, and mock data frameworks. Who You Are: Master's degree in Computer Science, Information Technology, or a related field. At least 3 years of proven experience as a Snowflake Developer, and a minimum of 8 years of total experience with data modelling (OLAP & OLTP). Extensive hands-on experience writing complex SQL queries and advanced Python, demonstrating proficiency in data manipulation and analysis for large data volumes.
Strong understanding of data warehousing concepts, methodologies, and technologies, with in-depth experience in data modelling techniques (OLTP, OLAP, Data Vault 2.0). Experience building data pipelines using DBT (Data Build Tool) for data transformation. Familiarity with advanced performance tuning methodologies in Snowflake, including query optimization. Strong knowledge of CI/CD pipelines, preferably in Azure DevOps. Excellent problem-solving, analytical, and critical thinking skills. Strong communication, collaboration, and interpersonal skills. Knowledge of additional data technologies (e.g., AWS, Azure, GCP) is a plus. Knowledge of Infrastructure as Code (IaC) tools such as Terraform or CloudFormation is a plus. Experience in leading projects or mentoring junior developers is advantageous.

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 5 Lacs

Chandigarh, Panchkula

Work from Office

Contact Details: 9354073534 • Collaborate with cross-functional teams to identify data requirements & develop analytics solutions that meet business needs • Good knowledge of SQL databases • Knowledge of SDLC, STLC, and the defect life cycle

Posted 1 month ago

Apply

1.0 - 2.0 years

1 - 2 Lacs

Erode

Work from Office

Permanent Role: Junior Developer Skills: SQL, SQL Server, databases, SQL queries, T-SQL, MySQL Work Mode: WFO Contact: 7397076469

Posted 1 month ago

Apply

4.0 - 9.0 years

11 - 15 Lacs

Gurugram

Work from Office

Execute end-to-end marketing campaigns, including setup, scheduling, rollout, and reporting. Manage and optimize campaigns using platforms like Oracle Responsys, Unity, and Infinity IQ. Define audience segments and target groups using complex SQL queries. Required Candidate profile Monitor and optimize email campaign performance. Analyze data from multiple sources. Build reports & dashboards. Implement hyper-personalization strategies and A/B testing frameworks. Perks and Benefits
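As a purely illustrative sketch of the audience-segmentation queries this role involves (SQLite stands in for the campaign platform's database, and the table and column names are hypothetical):

```python
# Hypothetical audience segment: contacts who opened an email within
# 30 days AND have at least one purchase. SQLite is used here only as
# a convenient stand-in for a real marketing database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contacts (email TEXT, last_open_days INTEGER, purchases INTEGER)"
)
conn.executemany("INSERT INTO contacts VALUES (?, ?, ?)", [
    ("a@example.com", 3, 5),    # engaged buyer
    ("b@example.com", 45, 0),   # lapsed, no purchases
    ("c@example.com", 10, 2),   # engaged buyer
])

# Segment definition expressed as a plain SQL filter.
segment = conn.execute(
    "SELECT email FROM contacts "
    "WHERE last_open_days <= 30 AND purchases >= 1 "
    "ORDER BY email"
).fetchall()
```

On a real platform the same predicate would be authored in the segmentation tool's SQL dialect, but the shape of the filter is the same.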

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Hyderabad

Hybrid

Excellent SQL and T-SQL skills. Excellent knowledge of MS SQL Server. Solid knowledge of relational database design. Knowledge of SQL Server Management Studio. Good knowledge of classic ASP, server-side VBScript, and ADO. Strong JavaScript and jQuery skills. Good knowledge of and experience working with AJAX. Knowledge of and experience using raw HTML to build dynamic, interactive web pages. Strong knowledge of and experience working with MS Excel

Posted 1 month ago

Apply

7.0 - 10.0 years

9 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Candidate Requirement 7+ years of experience in PL/SQL development. Proficiency in writing complex SQL queries and stored procedures. Experience with performance tuning and optimization techniques. Strong knowledge of data warehouse concepts. Good development experience with an ETL tool such as Informatica. Should be good at SQL queries and PL/SQL concepts. Knowledge of Unix shell scripting and Python is an added advantage. Experience working with scheduling tools like Autosys. Experience working in an Agile model. Strong analytical and problem-solving skills. Hands-on experience with IICS is an added advantage.
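One common performance-tuning technique this posting alludes to is replacing a full table scan with an index seek. The sketch below demonstrates the idea with SQLite as a lightweight stand-in for Oracle (the table and index names are made up for the example):

```python
# Illustrative SQL performance tuning: the same filter before and after
# adding an index, using SQLite's EXPLAIN QUERY PLAN to show the change.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
cur.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"cust{i % 100}", float(i)) for i in range(1000)],
)

# Without an index, filtering on customer requires a full table scan.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"
).fetchall()

# After adding an index, the engine can seek directly to matching rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_idx = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"
).fetchall()

rows = cur.execute(
    "SELECT COUNT(*) FROM orders WHERE customer = 'cust7'"
).fetchone()[0]
```

In Oracle the equivalent inspection would go through `EXPLAIN PLAN` and `DBMS_XPLAN`, but the scan-versus-seek trade-off is the same.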

Posted 1 month ago

Apply

7.0 - 11.0 years

10 - 15 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

Must Have Skills 7+ years of strong functional & technical experience on the Manhattan WM Active platform. Expertise in Unix/Linux scripting. Proficiency in writing Oracle SQL queries. Experience with implementation and rollout projects. Familiarity with Incident Management and Change Implementation processes. Excellent troubleshooting and analytical skills. Clear and confident verbal and written communication skills. Experience in the Retail domain or Warehouse Management (WM) application support is a plus. Responsibilities Create and enhance existing applications. Provide production support for Manhattan WMS Active. Design & develop technical solutions in collaboration with the Application Delivery team. Build custom objects (queries, scripts, etc.) to fulfill business needs. Define, create, and execute system and unit test plans. Take ownership of complex technical challenges, with regular status updates. Lead planning and execution of medium-complexity tasks and initiatives. Work closely with the onshore team to ensure timely delivery of assigned work. Good To Have Excellent communication and interpersonal skills. Ability to work independently and in a team environment. Strong analytical thinking and problem-solving abilities. Resume Submission Checklist Please include the following when sharing your profile: Current CTC, Expected CTC, Current Location, Notice Period (Immediate or Max 15 Days Only). Send Resumes To: navaneetha@suzva.com Contact: +91 90329 56160 Location - Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 1 month ago

Apply

2.0 - 6.0 years

8 - 12 Lacs

Noida

Work from Office

Primary Responsibilities: Writing clean, maintainable, and testable code to develop software features Reproduce bugs, investigate root causes, and develop fixes Review code and offer technical support to fellow team members as needed Communicate effectively with technical and non-technical stakeholders Work closely with various teams that own Systems of Record or Sources of Truth to query, analyze, sanitize, and ingest their data into our knowledge graph Think critically to analyze data and gather insights that lead to high-value decisions that improve the security posture across the enterprise Stay updated with the latest technologies and industry trends to continuously improve the team's capabilities. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Undergraduate degree in Computer Science, Engineering, or a related field, or equivalent experience Hands-on experience authoring highly performant SQL queries and working with a wide variety of databases, including RDBMS as well as NoSQL databases Hands-on experience authoring scalable, high-performance APIs (REST) Hands-on experience with automated testing Experience with engineering projects hosted in a public cloud (AWS, Azure, or GCP) Solid proficiency in TypeScript Good understanding of CI/CD pipelines Ability to deal with ambiguity and changing, often conflicting priorities, and to plan and execute while balancing timeliness and quality of deliverables Proven excellent problem-solving skills and a proactive, go-getter attitude Ability to work collaboratively with globally distributed teams in a fast-paced agile development environment and manage time effectively Preferred Qualifications: Experience with Apollo Server, GraphQL Experience with graph databases and Meilisearch or Elasticsearch Experience with Containers and Kubernetes Understanding of Network Security in cloud and on-prem hosting environments Points to note: Flexible to work in and overlap with teams in India as well as teams based in the US (between 7 AM and 11 AM Central Time) At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. 
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Ahmedabad

Work from Office

Job Description: We are seeking a highly skilled Lead Python Developer with 6 to 8 years of hands-on experience in backend development using Django. The ideal candidate will have a strong background in building and maintaining RESTful APIs, implementing CI/CD pipelines, and working with PostgreSQL databases. Experience with Celery for parallel task processing is essential. Key Responsibilities: Design, develop, and maintain scalable backend systems using Django. Build and integrate RESTful APIs for web and mobile applications. Implement and manage CI/CD pipelines for automated testing and deployment. Optimize and manage PostgreSQL databases for performance and reliability. Use Celery for asynchronous task processing and job queues. Collaborate with frontend developers, DevOps, and QA teams. Write clean, maintainable, and well-documented code. Participate in code reviews and mentor junior developers. Required Skills: 6-8 years of professional experience in Python and Django. Strong experience with REST API development. Proficiency in PostgreSQL and writing optimized SQL queries. Hands-on experience with Celery and message brokers (e.g., RabbitMQ, Redis). Experience with CI/CD tools (e.g., Jenkins, GitLab CI, GitHub Actions). Familiarity with Docker and containerized environments. Good understanding of software development best practices and design patterns. Excellent problem-solving and communication skills.
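For readers unfamiliar with the asynchronous task processing mentioned above: Celery offloads slow work (emails, reports) from the request path to background workers via a message broker. The broker-free sketch below uses the standard library's thread pool as a stand-in to show the same submit-now, collect-later shape; the task and IDs are hypothetical:

```python
# Minimal sketch of asynchronous task processing in the spirit of Celery,
# using concurrent.futures instead of a real broker/worker setup.
from concurrent.futures import ThreadPoolExecutor

def send_welcome_email(user_id):
    # Placeholder for a slow side effect (e.g., an SMTP call).
    return f"sent to {user_id}"

with ThreadPoolExecutor(max_workers=4) as pool:
    # submit() returns immediately, much like task.delay() in Celery;
    # the caller is not blocked while the work runs.
    futures = [pool.submit(send_welcome_email, uid) for uid in (1, 2, 3)]
    # Results are collected once the workers finish.
    results = [f.result() for f in futures]
```

In a real Django + Celery deployment the tasks would be declared with `@shared_task` and dispatched through RabbitMQ or Redis, surviving process restarts, which a thread pool does not.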

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Snowflake Data Engineering (Snowflake, DBT & ADF) - Lead Programmer Analyst (Experience: 5 to 8 Years) We are looking for a highly self-motivated individual for a Snowflake Data Engineering (Snowflake, DBT & ADF) - Lead Programmer Analyst role: At least 5 years of experience in designing and developing data pipelines and assets. Must have at least 5 years of experience with a columnar MPP cloud data warehouse (Snowflake/Azure Synapse/Redshift). 4 years of experience with ETL tools like Azure Data Factory and Fivetran/DBT. Experience with Git and Azure DevOps. Experience with Agile, Jira, and Confluence. Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server. Experience optimizing SQL queries is a plus. Working knowledge of Azure architecture and Data Lake. Willingness to contribute to documentation (e.g., mapping, defect logs). Generate functional specs for code migration, or ask the right questions thereof. Hands-on programmer with a thorough understanding of performance tuning techniques. Handling large data volume transformations (on the order of 100 GB monthly). Able to create solutions / data flows to suit requirements. Produce timely documentation, e.g., mapping, UTR, defect/KEDB logs, etc. Self-starter and learner. Able to understand and probe for requirements. Tech experience expected - Primary: Snowflake, DBT (development & testing). Secondary: Python, ETL, or any data processing tool. Nice to have: domain experience in Healthcare. Should have good oral and written communication. Should be a good team player. Should be proactive and adaptive.
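To illustrate the "SQL objects (procedures, triggers, views, functions)" skill this posting lists, the sketch below creates a view and a trigger; SQLite stands in for SQL Server, and the schema is invented for the example:

```python
# Illustrative SQL objects (a view and a trigger), using SQLite as a
# lightweight stand-in for SQL Server; table names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (id INTEGER PRIMARY KEY, amount REAL, status TEXT DEFAULT 'open');
CREATE TABLE audit_log (claim_id INTEGER, note TEXT);

-- A view exposing only open claims to downstream consumers.
CREATE VIEW open_claims AS
    SELECT id, amount FROM claims WHERE status = 'open';

-- A trigger writing an audit row whenever a claim is inserted.
CREATE TRIGGER log_claim AFTER INSERT ON claims
BEGIN
    INSERT INTO audit_log VALUES (NEW.id, 'created');
END;
""")
conn.execute("INSERT INTO claims (amount) VALUES (120.0)")
conn.execute("INSERT INTO claims (amount, status) VALUES (80.0, 'closed')")

open_rows = conn.execute("SELECT * FROM open_claims").fetchall()
audit_rows = conn.execute("SELECT * FROM audit_log").fetchall()
```

Stored procedures and user-defined functions have no direct SQLite equivalent; in SQL Server they would be written in T-SQL with `CREATE PROCEDURE` / `CREATE FUNCTION`, but views and triggers behave analogously.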

Posted 1 month ago

Apply