3 - 4 years
4 - 6 Lacs
Gurugram
Work from Office
We are seeking a skilled Database Developer with 3-4 years of experience to design, develop, and optimize databases. The ideal candidate will have expertise in SQL development, database performance tuning, and data modeling, be proficient with relational databases such as SQL Server, MySQL, or PostgreSQL, and have experience with ETL processes and stored procedures.

Key Responsibilities:

Database Design & Development
- Design and develop database schemas, tables, views, and indexes.
- Write optimized SQL queries, stored procedures, functions, and triggers (see the sketch after this posting).
- Implement database security best practices.

Performance Optimization
- Analyze and improve database query performance.
- Apply indexing, partitioning, and query optimization techniques.
- Monitor and troubleshoot database performance issues.

ETL & Data Integration
- Develop and manage ETL processes to migrate and transform data.
- Work with SSIS, Talend, or other ETL tools for data integration.

Database Maintenance & Administration
- Ensure data integrity, backups, and disaster recovery strategies.
- Manage database deployments and migrations.
- Collaborate with DevOps teams to implement CI/CD for databases.

Collaboration & Documentation
- Work closely with developers, data analysts, and business teams.
- Document database architecture, queries, and procedures.

Required Skills & Experience:
- 3-4 years of experience as a Database Developer.
- Strong proficiency in SQL Server / MySQL / PostgreSQL.
- Experience in query optimization and performance tuning.
- Hands-on experience with stored procedures, triggers, and functions.
- Knowledge of ETL processes and data warehousing.
- Experience with database version control and CI/CD for databases.
- Understanding of NoSQL databases (MongoDB, Redis) is a plus.
- Familiarity with cloud-based databases (Azure SQL, AWS RDS, GCP BigQuery) is a plus.
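To make the stored-procedure and indexing work above concrete, here is a minimal sketch, assuming a PostgreSQL database and the psycopg2 driver; the table, index, and procedure names (orders, idx_orders_customer, archive_orders) are hypothetical, not part of the posting.

```python
# Minimal sketch: create an index and a stored procedure, then check the plan.
# All object names are hypothetical; assumes PostgreSQL 11+ and psycopg2.
import psycopg2

conn = psycopg2.connect("dbname=shop user=dev password=dev host=localhost")
cur = conn.cursor()

# Index the column used in the WHERE clause so lookups avoid a sequential scan.
cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id);")

# A stored procedure that archives a customer's old orders in one server-side call.
cur.execute("""
    CREATE OR REPLACE PROCEDURE archive_orders(cust INT)
    LANGUAGE SQL
    AS $$
        INSERT INTO orders_archive SELECT * FROM orders WHERE customer_id = cust;
        DELETE FROM orders WHERE customer_id = cust;
    $$;
""")
conn.commit()

# EXPLAIN ANALYZE shows whether the planner now uses the index.
cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s;", (42,))
for row in cur.fetchall():
    print(row[0])
```

The same pattern (index the filter column, push multi-statement logic into a procedure, verify with EXPLAIN) is the day-to-day core of the tuning work this role describes.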
Posted 1 month ago
6 - 8 years
18 - 22 Lacs
Pune
Work from Office
- Lead end-to-end data migration activities.
- Design and implement robust SQL queries and Snowflake-based pipelines (see the sketch after this posting).
- Configure and orchestrate migration pipelines.
- Maintain detailed documentation using tools like Jira and Confluence.

Required Candidate Profile:
- 6-8 years of experience in SQL querying and transformation logic.
- Experience with cloud-based data warehousing in Snowflake.
- Experience with Informatica, Talend, or custom ETL solutions.
- Familiarity with AWS services (S3, Glue, Lambda, etc.).
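As an illustration of the Snowflake pipeline work referenced above, a minimal sketch follows, assuming the snowflake-connector-python package; the account, warehouse, stage, and table names are hypothetical, and the COPY INTO step presumes an external S3 stage already exists.

```python
# Minimal sketch: stage-to-table load plus a dedupe step in Snowflake.
# All connection values and object names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",                        # hypothetical account identifier
    user="etl_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],   # prefer key-pair/SSO auth in practice
    warehouse="MIGRATION_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # COPY INTO performs the bulk load server-side; ON_ERROR controls bad-row handling.
    cur.execute("""
        COPY INTO staging.orders
        FROM @migration_stage/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Transformation step: keep only the latest row per order using a window function.
    cur.execute("""
        INSERT INTO curated.orders
        SELECT * FROM staging.orders
        QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC) = 1
    """)
finally:
    cur.close()
    conn.close()
```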
Posted 1 month ago
5 - 8 years
0 Lacs
Trivandrum, Kerala, India
Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding skills in Python, PySpark, and SQL. Works independently and demonstrates proficiency in at least one data-related domain, with a solid understanding of SCD concepts and data warehousing principles.

Outcomes:
- Collaborate closely with data analysts, data scientists, and other stakeholders to ensure data accessibility, quality, and security across various data sources.
- Design, develop, and maintain data pipelines that collect, process, and transform large volumes of data from various sources.
- Implement ETL (Extract, Transform, Load) processes to facilitate efficient data movement and transformation.
- Integrate data from multiple sources, including databases, APIs, cloud services, and third-party data providers.
- Establish data quality checks and validation procedures to ensure data accuracy, completeness, and consistency.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule / timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements.
- Documentation: Create documentation for personal work and review deliverable documents, including source-target mappings, test cases, and results.
- Configuration: Follow configuration processes diligently.
- Testing: Create and conduct unit tests for data pipelines and transformations to ensure data quality and correctness. Validate the accuracy and performance of data processes.
- Domain Relevance: Develop features and components with a solid understanding of the business problems being addressed for the client. Understand data schemas in relation to domain-specific contexts such as EDI formats.
- Defect Management: Raise, fix, and retest defects in accordance with project standards.
- Estimation: Estimate time, effort, and resource dependencies for personal work.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
- Design Understanding: Understand design and low-level design (LLD) and link it to requirements and user stories.
- Certifications: Obtain relevant technology certifications to enhance skills and knowledge.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Proficiency in querying data warehouses.

Knowledge Examples:
- Knowledge of various ETL services provided by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF.
- Understanding of data warehousing principles and practices.
- Proficiency in SQL for analytics, including windowing functions (see the PySpark sketch after this posting).
- Familiarity with data schemas and models.
- Understanding of domain-related data and its implications.

Additional Comments:

Responsibilities:
- Design, develop, and maintain data pipelines and architectures using Azure services.
- Collaborate with data scientists and analysts to meet data needs.
- Optimize data systems for performance and reliability.
- Monitor and troubleshoot data storage and processing issues.
- Ensure data security and compliance with company policies.
- Document data solutions and architecture for future reference.
- Stay updated with Azure data engineering best practices and tools.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering.
- Proficiency in Azure Data Factory, Azure SQL Database, and Azure Databricks.
- Experience with data modeling and ETL processes.
- Strong understanding of database management and data warehousing concepts.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Skills: Azure Data Factory, Azure SQL Database, Azure Databricks, ETL, Data Modeling, SQL, Python, Big Data Technologies, Data Warehousing, Azure DevOps, Azure, AWS, AWS Cloud, Azure Cloud
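A short PySpark sketch of the windowing-function style of transformation mentioned above, keeping only the latest record per business key, which is a common step before applying SCD logic; the schema and column names (customer_id, updated_at) are hypothetical.

```python
# Minimal sketch: deduplicate to the latest record per key with a window function.
# Schema and column names are hypothetical; assumes a local Spark session.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dedupe-sketch").getOrCreate()

df = spark.createDataFrame(
    [(1, "alice@example.com", "2024-01-01"),
     (1, "alice@new.com",     "2024-03-01"),
     (2, "bob@example.com",   "2024-02-15")],
    ["customer_id", "email", "updated_at"],
)

# Rank rows within each customer_id by recency, then keep rank 1.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))
      .filter(F.col("rn") == 1)
      .drop("rn")
)
latest.show()
```

The same ROW_NUMBER-over-partition pattern carries over directly to warehouse SQL when the transformation runs in the database instead of Spark.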
Posted 2 months ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
Hybrid
Bounteous x Accolite is a premier end-to-end digital transformation consultancy dedicated to partnering with ambitious brands to create digital solutions for today's complex challenges and tomorrow's opportunities. With uncompromising standards for technical and domain expertise, we deliver innovative and strategic solutions in Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. Our Co-Innovation methodology is a unique engagement model designed to align interests and accelerate value creation. Our clients worldwide benefit from the skills and expertise of more than 4,000 expert team members across the Americas, APAC, and EMEA. By partnering with leading technology providers, we craft transformative digital experiences that enhance customer engagement and drive business success.

About Bounteous (https://www.bounteous.com/)
Founded in 2003 in Chicago, Bounteous is a leading digital experience consultancy that co-innovates with the world's most ambitious brands to create transformative digital experiences. With services in Strategy, Experience Design, Technology, Analytics and Insight, and Marketing, Bounteous elevates brand experiences through technology partnerships and drives superior client outcomes. For more information, please visit www.bounteous.com.

Information Security Responsibilities:
- Promote and enforce awareness of key information security practices, including acceptable use of information assets, malware protection, and password security protocols
- Identify, assess, and report security risks, focusing on how these risks impact the confidentiality, integrity, and availability of information assets
- Understand and evaluate how data is stored, processed, or transmitted, ensuring compliance with data privacy and protection standards (GDPR, CCPA, etc.)
- Ensure data protection measures are integrated throughout the information lifecycle to safeguard sensitive information

Preferred Qualifications:
- 7+ years of experience in a Data Engineer role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Working knowledge of ETL technology: Talend / Apache NiFi / AWS Glue
- Experience with relational SQL and NoSQL databases
- Experience with big data tools: Hadoop, Spark, Kafka, etc. (nice to have)
- Advanced Alteryx Designer (mandatory at this point; relaxing that would be tough)
- Tableau dashboarding
- AWS (familiarity with Lambda, EC2, AMI)
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (nice to have)
- Experience with cloud services: EMR, RDS, Redshift, or Snowflake
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (nice to have)
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.

Responsibilities:
- Work with Project Managers, Senior Architects, and other team members from Bounteous and client teams to evaluate data systems and project requirements
- In cooperation with platform developers, develop scalable and fault-tolerant Extract Transform Load (ETL) and integration systems for various data platforms that operate at appropriate scale and meet security, logging, fault-tolerance, and alerting requirements (see the workflow sketch after this posting)
- Work on data migration projects
- Effectively communicate data requirements of various data platforms to team members
- Evaluate and document existing data ecosystems and platform capabilities
- Configure CI/CD pipelines
- Implement proposed architecture and assist in infrastructure setup

We invite you to stay connected with us by subscribing to our monthly job openings alert here. Research shows that women and other underrepresented groups apply only if they meet 100% of the criteria of a job posting. If you have passion and intelligence, and possess a technical knack (even if you're missing some of the above), we encourage you to apply.

Bounteous x Accolite is focused on promoting an inclusive environment and is proud to be an equal opportunity employer. We celebrate the different viewpoints and experiences our diverse group of team members bring to Bounteous x Accolite. Bounteous x Accolite does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, physical or mental disability, national origin, veteran status, or any other status protected under federal, state, or local law. In addition, you have the opportunity to participate in several Team Member Networks, sometimes referred to as employee resource groups (ERGs), which offer space for individuals with shared identities, interests, and passions. Our Team Member Networks celebrate communities of color, life as a working parent or caregiver, the 2SLGBTQIA+ community, wellbeing, and more. Regardless of your identity, there are many avenues for involvement in the Bounteous x Accolite community. Bounteous x Accolite is willing to sponsor eligible candidates for employment visas.
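As a sketch of the workflow-management tooling named in the qualifications (Airflow among them), here is a minimal Airflow DAG wiring an extract-transform-load sequence; the DAG id, schedule, and task bodies are placeholders rather than anything specific to this role.

```python
# Minimal sketch of an extract-transform-load workflow as an Airflow 2.x DAG.
# Task bodies are placeholders; dag_id and schedule are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data, e.g. from an API or S3")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write the result to the warehouse")

with DAG(
    dag_id="etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies run left to right; retries and alerting are configured on the
    # DAG or per task (omitted here for brevity).
    t_extract >> t_transform >> t_load
```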
Posted 4 months ago