3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a skilled and proactive SQL Server Database Administrator (DBA) with deep expertise in SQL Server infrastructure, you will be responsible for managing, maintaining, and optimizing enterprise-level database environments. Your role will involve working closely with cross-functional teams to ensure high performance, reliability, and data integrity across all SQL Server instances.

**Key Responsibilities:**
- Install, configure, upgrade, and maintain Microsoft SQL Server databases in development, testing, and production environments
- Monitor database performance and proactively implement tuning and optimization strategies (indexing, query optimization, etc.)
- Design and implement robust backup, recovery, high availability (HA), and disaster recovery (DR) solutions
- Ensure database security through proper access controls, encryption, and compliance with company policies and best practices
- Collaborate with development and DevOps teams to design efficient database schemas, stored procedures, and SQL queries
- Troubleshoot and resolve database-related incidents and provide root cause analysis
- Maintain and update comprehensive documentation of database configurations, processes, and procedures
- Stay current with SQL Server updates, patches, and industry trends to recommend and implement improvements

**Qualifications:**
- Proven experience as a SQL Server DBA (typically 3+ years)
- Strong knowledge of SQL Server architecture, installation, and configuration
- Proficiency in performance tuning and optimization (using DMVs, Profiler, Extended Events, etc.)
- Experience with backup and recovery tools, replication, log shipping, and Always On Availability Groups
- Hands-on experience with SQL Server versions
- Solid understanding of T-SQL and the ability to write and optimize complex queries
- Familiarity with cloud-based SQL Server solutions (e.g., Azure SQL Database) is a plus
- Excellent troubleshooting and problem-solving skills
- Strong communication and documentation abilities

You will be working in a hybrid mode in locations like Hyderabad, Bangalore, or Pune for a contract period of 12 months. The company requires a B.Tech graduate in any branch or specialization for this role.
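The indexing and query-optimization work described above comes down to making the engine stop scanning whole tables. As a rough sketch of the idea, here SQLite (bundled with Python) stands in for SQL Server, and `EXPLAIN QUERY PLAN` stands in for the DMV/execution-plan tooling the posting names; table and index names are illustrative:

```python
import sqlite3

# SQLite is a stand-in for SQL Server here; the indexing principle carries over,
# though a real DBA would read plans via SSMS, DMVs, or Extended Events.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan
cur.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")
after = plan(query)   # index search
print(before)
print(after)
```

The same before/after comparison, run against real workload queries, is the core loop of the tuning responsibility listed above.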
Posted 2 days ago
6.0 - 8.0 years
6 - 16 Lacs
bangalore rural, bengaluru
Hybrid
Primary skills: ETL/ELT pipelines using DBT and AWS Redshift. Secondary skills: proficiency in SQL and scripting languages such as Python or Shell.
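A dbt model is essentially a named SELECT that the tool materializes as a table or view in the warehouse. As a rough sketch of that idea (not dbt's actual API), a toy runner against SQLite with made-up model names, where Redshift would be the real target:

```python
import sqlite3

# Hypothetical model names and SQL; real dbt resolves dependencies via ref()
# and runs against the warehouse (e.g., Redshift), not SQLite.
MODELS = {
    "stg_orders": "SELECT id, amount FROM raw_orders WHERE amount > 0",
    "fct_daily_revenue": "SELECT SUM(amount) AS revenue FROM stg_orders",
}

def run_models(conn):
    for name, sql in MODELS.items():  # insertion order doubles as run order here
        conn.execute(f"CREATE TABLE {name} AS {sql}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 10.0), (2, -5.0), (3, 32.5)])
run_models(conn)
revenue = conn.execute("SELECT revenue FROM fct_daily_revenue").fetchone()[0]
print(revenue)
```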
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
jaipur, rajasthan
On-site
As a Data Engineer with Fabric, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure to ensure accurate, timely, and accessible data for driving data-driven decision-making and supporting company growth.

Key Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory and Databricks for ingestion, transformation, and movement of data.
- Develop and optimize ETL processes to ensure efficient data flow and transformation.
- Maintain Azure Data Lake solutions for efficient storage and retrieval of large datasets.
- Build and manage scalable data warehousing solutions using Azure Synapse Analytics for advanced analytics and reporting.
- Integrate various data sources into MS-Fabric to ensure data consistency, quality, and accessibility.
- Optimize data processing workflows and storage solutions to improve performance and reduce costs.
- Manage and optimize SQL and NoSQL databases to support high-performance queries and data storage requirements.
- Implement data quality checks and monitoring to ensure accuracy and consistency of data.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
- Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
- Experience: 2-4 years of experience in data engineering or a related field.
- Technical Skills:
  - Proficiency with Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
  - Experience with Microsoft Fabric is a plus.
  - Strong SQL skills and experience with data warehousing concepts (DWH).
  - Knowledge of data modeling, ETL processes, and data integration.
  - Hands-on experience with ETL tools and frameworks (e.g., Apache Airflow, Talend).
  - Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus.
  - Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and associated data services (e.g., S3, Redshift, BigQuery).
  - Familiarity with data visualization tools (e.g., Power BI) and experience with programming languages such as Python, Java, or Scala.
  - Experience with schema design and dimensional data modeling.
- Analytical Skills: Strong problem-solving abilities and attention to detail.
- Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field. Advanced degrees or certifications are a plus.

Interested candidates can share their CV at sulabh.tailang@celebaltech.com.
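The "data quality checks and monitoring" responsibility above usually starts as row-level validation before load, with failures quarantined rather than silently dropped. A minimal pure-Python sketch, with hypothetical rules and field names:

```python
# Hypothetical validation rules; a real pipeline would run checks like these
# inside ADF/Databricks before writing to the lake or warehouse.
def check_row(row):
    errors = []
    if row.get("id") is None:
        errors.append("missing id")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("bad amount")
    return errors

def split_valid(rows):
    valid, rejected = [], []
    for row in rows:
        (rejected if check_row(row) else valid).append(row)
    return valid, rejected

rows = [{"id": 1, "amount": 9.5}, {"id": None, "amount": 3}, {"id": 2, "amount": -1}]
valid, rejected = split_valid(rows)
print(len(valid), len(rejected))  # 1 valid row, 2 quarantined
```

Routing rejects to a quarantine table (with the error reasons attached) keeps the main load clean while preserving the evidence needed for root-cause analysis.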
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
jaipur, rajasthan
On-site
As a Data Engineer at our company, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure to ensure accurate, timely, and accessible data for our organization's growth and data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory and Databricks for data ingestion, transformation, and movement.
- Develop and optimize ETL processes to facilitate efficient data flow and transformation.
- Maintain Azure Data Lake solutions for efficient storage and retrieval of large datasets.
- Collaborate with Azure Synapse Analytics to build scalable data warehousing solutions for advanced analytics and reporting.
- Integrate various data sources into MS-Fabric, ensuring data consistency, quality, and accessibility.
- Optimize data processing workflows and storage solutions to enhance performance and reduce costs.
- Manage and optimize SQL and NoSQL databases for high-performance queries and data storage.
- Implement data quality checks and monitoring processes to ensure data accuracy and consistency.
- Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
- Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
- 2-4 years of experience in data engineering or a related field.

Technical Skills:
- Proficiency in Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
- Experience with Microsoft Fabric is a plus.
- Strong SQL skills and familiarity with data warehousing concepts (DWH).
- Knowledge of data modeling, ETL processes, and data integration.
- Hands-on experience with ETL tools and frameworks like Apache Airflow and Talend.
- Familiarity with big data technologies such as Hadoop and Spark.
- Experience with cloud platforms like AWS, Azure, Google Cloud, and associated data services.
- Familiarity with data visualization tools like Power BI and programming languages such as Python, Java, or Scala.
- Experience with schema design and dimensional data modeling.

Analytical Skills:
- Strong problem-solving abilities and attention to detail.

Communication:
- Excellent verbal and written communication skills to explain technical concepts to non-technical stakeholders.

Education:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field. Advanced degrees or certifications are a plus.

Interested candidates can share their CV at sulabh.tailang@celebaltech.com.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
As a SAS Developer based in Sydney & Melbourne, your role will involve writing SAS scripts to translate business logic into SAS datasets and create dashboards. You should have sound knowledge of SAS Base, SAS SQL, SAS/Access, SAS/Macros, SAS/Graph, SAS/ODS, SAS BI, and data transformation (ETL), along with reporting logic.

Your responsibilities will include:
- Implementing change requests in both new and existing scripts and dashboards
- Optimizing processing time for existing and new dashboards
- Documenting schema design and its metadata repositories
- Collaborating with the business team to understand data warehouse principles and convert mapping documents into new ETL processes and data visualizations

Preferred qualifications for this role include familiarity with SAS Admin and AWS Cloud.
Posted 5 days ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
We are seeking an experienced Program Manager to take charge of delivering large-scale, data-driven solutions for a prominent global technology client. In this role, you will be responsible for overseeing technical project delivery, managing engineering teams, and actively participating in data extraction, transformation, and visualization processes. The ideal candidate should possess strong program management skills, a solid technical background, and the ability to drive efficiency and quality across teams.

With over 10 years of experience in leading complex technical programs with multiple workstreams, you should have hands-on expertise in Python and data engineering processes. You must also demonstrate in-depth knowledge of ETL systems, data extraction, and transformation techniques, along with proficiency in SQL for building reports and dashboards. Additionally, expertise in data modeling, schema design, and relational databases, familiarity with Linux environments, and a strong background in program management are essential for this role.

Your responsibilities will include:
- Leading and managing cross-functional teams to deliver on the product roadmap
- Extracting and ingesting data using Python
- Designing ETL pipelines and monitoring data accuracy
- Developing and maintaining data models
- Building SQL-based reports and dashboards
- Troubleshooting pipeline issues, automating workflows, and ensuring service stability
- Defining software development best practices
- Collaborating with stakeholders and driving documentation standards

At GlobalLogic, we offer a culture of caring that prioritizes putting people first, a commitment to continuous learning and development, opportunities for interesting and meaningful work, balance and flexibility, and a high-trust organization where integrity is paramount.
Join us and become part of a digital engineering partner that has been at the forefront of the digital revolution since 2000, collaborating with clients to transform businesses and redefine industries through intelligent products, platforms, and services.
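"Monitoring data accuracy" after each load, as the role above describes, often begins with reconciling per-partition row counts between source and target. A small sketch, assuming made-up partition keys and counts:

```python
# Hypothetical per-day row counts; in practice both sides come from SQL
# queries against the source system and the warehouse.
def reconcile(source_counts, target_counts, tolerance=0):
    """Return partitions whose target count deviates beyond tolerance."""
    mismatches = {}
    for part, expected in source_counts.items():
        actual = target_counts.get(part, 0)
        if abs(expected - actual) > tolerance:
            mismatches[part] = (expected, actual)
    return mismatches

src = {"2024-06-01": 1000, "2024-06-02": 980}
tgt = {"2024-06-01": 1000, "2024-06-02": 950}
print(reconcile(src, tgt))  # flags 2024-06-02
```

Feeding the mismatch dict into a dashboard or alert is a cheap way to catch silent pipeline drops before stakeholders do.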
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. MongoDB enables organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. The industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. With the ability to build and run applications anywhere, MongoDB Atlas allows customers to operate on-premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, leading organizations such as Samsung and Toyota trust MongoDB to build next-generation, AI-powered applications.

At the core of MongoDB's mission is Application Modernization, where enterprises are assisted in transforming legacy, monolithic systems into modern, cloud-native, AI-powered applications. Many organizations globally are burdened by outdated applications lacking scalability, resilience, and cloud compatibility. The Application Modernization team guides customers through this transformation journey by migrating from relational systems to MongoDB Atlas and expediting the process using AI-driven code translation, refactoring, and database migration tools.

We are seeking a Director of Engineering, Application Modernization to lead this transformation at scale. This role presents a unique opportunity to oversee a global engineering organization at the forefront of modernization and Generative AI.
The Director of Engineering will be responsible for:
- Leading and Inspiring: Driving multiple globally distributed engineering teams to foster a culture of innovation, accountability, and growth
- Setting Vision: Defining and executing the engineering strategy for the modernization platform, aligning with business outcomes and advancing AI-driven modernization
- Owning the Platform: Taking full ownership of the Application Modernization platform design, delivery, and operational excellence of customer-facing features
- Accelerating with AI: Advocating for the integration of AI-powered code translation, refactoring, and migration tooling to make MongoDB the preferred modernization path for enterprises
- Modernizing Legacy Stacks: Leading the transformation of legacy relational and monolithic applications into modern, distributed, cloud-native architectures built on MongoDB Atlas
- Raising the Bar: Establishing best practices in distributed systems design, database migrations, performance optimization, and operational excellence
- Database Expertise: Applying deep knowledge of relational (Oracle, SQL Server, Postgres, MySQL) and non-relational/document systems to assist customers in transitioning to a modern data stack
- Talent Building: Hiring, mentoring, and developing world-class engineering leaders and ICs, ensuring career progression and strong technical vision
- Cross-Functional Influence: Collaborating with Product, Field Engineering, Data Science, Sales, and executives to align priorities and maximize customer impact
- Innovating and Experimenting: Driving proof-of-concepts with emerging AI/ML technologies and strategic technology partnerships
- Being a Force Multiplier: Removing blockers, amplifying team impact, and creating an environment where engineers can excel in their careers

The ideal candidate will have:
- Distributed Systems Expertise: Proven track record of building and scaling large-scale, distributed systems with strong performance, resilience, and reliability requirements
- Database Depth: Hands-on expertise with relational and non-relational systems, with practical knowledge of schema design, database migrations, and performance tuning
- Enterprise Scale Leadership: 10+ years of experience leading engineering teams (including leaders-of-leaders) across globally distributed locations
- AI Mindset: Experience or strong passion for building AI-powered applications and developer tools

To support the personal growth and business impact of employees, MongoDB is committed to fostering a supportive and enriching culture. Employee affinity groups, fertility assistance, and a generous parental leave policy are among the initiatives aimed at valuing employees' wellbeing and providing support throughout their professional and personal journeys. MongoDB is dedicated to accommodating individuals with disabilities throughout the application and interview process. For any necessary accommodations due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.
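The relational-to-document migration this role leads typically starts by denormalizing: child rows that were JOINed at query time get embedded inside the parent document. A pure-Python sketch of that reshaping, with a hypothetical orders/line_items schema (real migrations use MongoDB's tooling, not hand-rolled code):

```python
# Hypothetical normalized rows, as they might arrive from a relational extract.
def embed_line_items(orders, line_items):
    # index parents by primary key, then fold each child row into its parent
    docs = {o["id"]: {**o, "line_items": []} for o in orders}
    for item in line_items:
        docs[item["order_id"]]["line_items"].append(
            {k: v for k, v in item.items() if k != "order_id"})
    return list(docs.values())

orders = [{"id": 1, "customer": "Acme"}]
line_items = [{"order_id": 1, "sku": "A", "qty": 2},
              {"order_id": 1, "sku": "B", "qty": 1}]
docs = embed_line_items(orders, line_items)
print(docs[0]["line_items"])
```

The resulting documents read back in one fetch, which is the access-pattern win that motivates the schema change in the first place.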
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
kochi, kerala
On-site
As a Database Administrator, your primary responsibility will be to ensure the uptime, reliability, and performance of PostgreSQL databases. You will design and implement procedures for backup, failover, and recovery to maintain data integrity and business continuity. Additionally, you will provide hands-on expertise to development teams on schema design, indexing, and query optimization. Proactively monitoring database health and performance to identify and resolve issues before they impact production will be crucial. Collaborating with internal teams and service providers, such as AWS and EnterpriseDB, will be essential for issue resolution. Moreover, working closely with DevOps and Engineering to integrate safe database change processes into delivery pipelines is a key aspect of this role.

Establishing and documenting database standards, policies, and best practices, as well as contributing to the broader data architecture strategy, are significant responsibilities as the organization scales and evolves. You will recommend and implement best practices for data security, compliance, and scalability while defining, agreeing, and maintaining an improvement roadmap for the database estate.

To be successful in this role, you should have at least 6 years of hands-on experience working with PostgreSQL in complex, production environments. Demonstrable expertise in operations, such as backup, point-in-time recovery, replication, and failover, is required. Deep technical knowledge of PostgreSQL internals, including query optimization, indexing strategies, and performance tuning, is essential. Experience with cloud-based and/or containerized infrastructure, scripting for automation, Linux system administration, and strong communication skills are also necessary.
Desirable skills include exposure to other database technologies like MySQL, MongoDB, and Redis; experience with observability and monitoring tools such as Prometheus and Grafana; and familiarity with infrastructure-as-code tools like Terraform and Ansible.
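The replication and monitoring duties above hinge on one number: how far the standby's replay position trails the primary's WAL position. PostgreSQL exposes both as LSNs like `0/16B3748`, and byte lag is just their numeric difference. A sketch with made-up sample values (production checks would query `pg_stat_replication` instead):

```python
# PostgreSQL LSNs are 'high/low' hex pairs forming a 64-bit WAL position.
def lsn_to_bytes(lsn):
    hi, lo = lsn.split("/")
    return (int(hi, 16) << 32) + int(lo, 16)

def replication_lag(primary_lsn, replay_lsn):
    """Bytes of WAL the standby has yet to replay."""
    return lsn_to_bytes(primary_lsn) - lsn_to_bytes(replay_lsn)

print(replication_lag("0/16B3748", "0/16B3700"))  # 72 bytes behind
```

Wiring this into an alerting script (the "scripting for automation" skill above) turns replication health from a manual check into a monitored metric.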
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
As a highly skilled and experienced NoSQL Database Architect with over 4 years of expertise in DynamoDB, you will play a crucial role in our team by designing, implementing, and optimizing our database infrastructure. Your main responsibilities will include ensuring performance, scalability, and compliance with data sovereignty requirements. Your deep knowledge of DynamoDB, especially in global table configurations, and your proven experience in migrating large-scale relational databases to NoSQL will be essential for excelling in this role.

You will be tasked with designing and implementing DynamoDB database solutions, focusing on global table configurations to meet Australian data sovereignty regulations. Additionally, you will optimize Global Secondary Index (GSI) designs for affiliate performance analytics, guaranteeing single-digit millisecond query performance. Leading the migration of petabyte-scale data from relational systems such as PostgreSQL to NoSQL (DynamoDB) will also be a critical aspect of your role.

Furthermore, you will be expected to develop and enforce database best practices and standards, monitor and optimize database performance and scalability, collaborate with development teams to design efficient data models and access patterns, troubleshoot and resolve database issues, and stay up-to-date with the latest DynamoDB features and best practices.

To excel in this position, you should possess deep expertise in DynamoDB, including global table configurations, GSI design, and performance tuning. Proven experience in migrating large-scale relational databases to NoSQL databases, specifically at petabyte scale, is crucial. A strong understanding of NoSQL database principles and best practices, experience in data modeling and schema design for NoSQL databases, excellent problem-solving and analytical skills, and strong communication and collaboration abilities are also required.
Additionally, holding an AWS Certified Database Specialty certification is a prerequisite for this role. Experience with other NoSQL databases, knowledge of data warehousing and business intelligence concepts, familiarity with data governance and compliance requirements, and experience with automation and scripting for database administration tasks would be considered bonus points for this position.
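The single-digit-millisecond GSI lookups the role targets work because DynamoDB maintains a second copy of the data, partitioned by the index key, so a query never scans the base table. An in-memory sketch of that trade-off (all names are illustrative, and real code would use boto3 against an actual table):

```python
from collections import defaultdict

base_table = {}                        # (pk, sk) -> item
gsi_by_affiliate = defaultdict(list)   # simulated GSI, partitioned by affiliate_id

def put_item(item):
    base_table[(item["pk"], item["sk"])] = item
    # DynamoDB propagates to GSIs asynchronously; here it's immediate
    gsi_by_affiliate[item["affiliate_id"]].append(item)

def query_gsi(affiliate_id):
    # one partition lookup, no base-table scan -- the source of the speed
    return gsi_by_affiliate[affiliate_id]

put_item({"pk": "ORDER#1", "sk": "2024-06-01", "affiliate_id": "aff-9", "total": 20})
put_item({"pk": "ORDER#2", "sk": "2024-06-02", "affiliate_id": "aff-9", "total": 15})
put_item({"pk": "ORDER#3", "sk": "2024-06-02", "affiliate_id": "aff-1", "total": 99})
print(len(query_gsi("aff-9")))
```

The cost is duplicated storage and eventual consistency on the index, which is why GSI design (what to project, what to key on) is called out as a core skill.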
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
We are seeking a highly skilled MySQL Database (Aurora) Expert to assist in identifying, troubleshooting, and optimizing performance bottlenecks within our database infrastructure. Our application caters to high traffic and real-time transactions, necessitating an expert who can uphold stability, scalability, and efficiency. Preferred candidates will have experience in Field Management or similar high-transaction applications.

Your responsibilities will include:

Performance Tuning & Optimization:
- Analyzing slow queries, deadlocks, and high-latency transactions.
- Optimizing database indexing, partitioning, and caching strategies.
- Fine-tuning Aurora MySQL configurations for enhanced availability and scalability.

High-Traffic & Real-Time Data Handling:
- Ensuring efficient query execution for high-concurrency workloads.
- Implementing connection pooling and replication strategies.
- Optimizing read/write operations to support real-time transactions.

Scalability & Availability:
- Designing and implementing sharding, clustering, and failover strategies.
- Collaborating on AWS Aurora Multi-AZ deployments for fault tolerance.
- Identifying and resolving bottlenecks in replication and read scaling.

Database Monitoring & Troubleshooting:
- Utilizing tools like Datadog, CloudWatch, and Performance Schema for performance tracking.
- Investigating and resolving database outages, locking issues, and contention.
- Proactively monitoring and maintaining query performance metrics.

Collaboration & Best Practices:
- Working closely with backend engineers to optimize queries and schema design.
- Ensuring proper indexing, normalization, and schema changes without downtime.
- Implementing backup, disaster recovery, and failover strategies.

Required Skills & Experience:
- Demonstrated experience managing high-traffic MySQL (Aurora) databases.
- Profound knowledge of MySQL internals, query optimization, and indexing strategies.
- Expertise in AWS Aurora MySQL, including read replicas, failover, and scaling.
- Experience with real-time, high-transaction applications (Field Management preferred).
- Proficiency in query profiling, performance tuning, and troubleshooting.
- Hands-on experience with replication, sharding, and clustering.
- Familiarity with monitoring tools (Datadog, Percona Toolkit, CloudWatch, etc.).
- Strong understanding of connection pooling, caching (Redis, Memcached), and read/write optimizations.
- Ability to analyze and resolve database locks, deadlocks, and slow queries.
- Experience with database migrations and schema changes in a production environment.

Preferred Qualifications:
- Experience with NoSQL databases (DynamoDB, MongoDB) for hybrid architectures.
- Previous experience in Field Service Management or Real-Time Tracking Applications.
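A first pass at the "analyzing slow queries" responsibility is usually aggregating the slow query log by normalized statement, since one mediocre query run thousands of times often outweighs a single terrible one. A sketch with made-up entries (tools like pt-query-digest do this for real, against the actual log format):

```python
from collections import defaultdict

# (normalized query, seconds) pairs -- sample data, not a real slow-log format
entries = [
    ("SELECT * FROM jobs WHERE status = ?", 2.1),
    ("SELECT * FROM jobs WHERE status = ?", 1.7),
    ("UPDATE jobs SET status = ? WHERE id = ?", 0.3),
]

totals = defaultdict(float)
for query, seconds in entries:
    totals[query] += seconds  # rank by total time, not single worst run

worst = max(totals, key=totals.get)
print(worst, round(totals[worst], 1))
```

Ranking by aggregate time directs indexing and rewrite effort at the queries that actually dominate the workload.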
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Lead Data Engineer with over 7 years of experience, you will be responsible for designing, developing, and maintaining data pipelines, ETL processes, and data warehouses. You will be based in Hyderabad and should be able to join immediately. Your primary skills should include proficiency in SQL, Python, PySpark, AWS, Airflow, Snowflake, and DBT. Additionally, you should have a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. In this role, you will need a minimum of 4+ years of hands-on experience in data engineering, ETL, and data warehouse development. You should have expertise in ETL tools such as Informatica Power Center or IDMC, as well as strong programming skills in Python and PySpark for efficient data processing. Your responsibilities will also involve working with cloud-based data platforms like AWS Glue, Snowflake, Databricks, or Redshift. Proficiency in SQL and experience with RDBMS platforms like Oracle, MySQL, and PostgreSQL are essential. Familiarity with data orchestration tools like Apache Airflow will be advantageous. From a technical perspective, you are expected to have advanced knowledge of data warehousing concepts, data modeling, schema design, and data governance. You should be capable of designing and implementing scalable ETL pipelines and have experience with cloud infrastructure for data storage and processing on platforms such as AWS, Azure, or GCP. In addition to technical skills, soft skills are equally important. You should possess excellent communication and collaboration skills, be able to lead and mentor a team of engineers, and demonstrate strong problem-solving and analytical thinking abilities. The ability to manage multiple projects and prioritize tasks effectively is crucial. 
Preferred qualifications for this role include experience with machine learning workflows and data science tools; certifications in AWS, Snowflake, Databricks, or relevant data engineering technologies; and familiarity with Agile methodologies and DevOps practices.
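Orchestration tools like the Apache Airflow mentioned above ultimately reduce to running tasks in dependency order. The standard library's `graphlib` can sketch that core idea with hypothetical task names (a real Airflow DAG would declare the same edges with operators and `>>`):

```python
from graphlib import TopologicalSorter

# node -> set of upstream tasks it depends on (names are illustrative)
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}

# static_order() yields every task only after all of its dependencies
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Scheduling, retries, and backfills are what Airflow layers on top, but a valid topological order is the invariant every run must respect.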
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
andhra pradesh
On-site
As a Database Architect at our organization, you will be responsible for designing and implementing robust, scalable, and secure database systems tailored to meet the business requirements. Your key responsibilities will include developing and optimizing database structures, collaborating with cross-functional teams to understand data requirements, and creating efficient database solutions. It will be essential for you to monitor database performance, troubleshoot issues, and implement performance tuning techniques. In this role, you will define and enforce database standards, policies, and procedures to ensure consistency and reliability across the organization. You will also be involved in data migration, integration, and ensuring data integrity across different platforms. Additionally, you will work on backup, recovery, and disaster recovery strategies for databases to ensure high availability and business continuity. As a Database Architect, you will be expected to research and implement new database technologies and techniques to optimize business processes and support growth. You will review database design and implementation to ensure compliance with best practices, security standards, and regulations such as GDPR. Conducting regular audits of database systems and providing recommendations for improvements will also be part of your responsibilities. To qualify for this role, you should have proven experience as a Database Architect or a similar role, with strong knowledge of database technologies including SQL, NoSQL, relational databases, and distributed databases. Proficiency in database design, performance tuning, troubleshooting, and experience with cloud database solutions and containerized databases will be beneficial. Expertise in data modeling, schema design, and relational database management systems is essential. 
Preferred qualifications include a Bachelor's degree in Computer Science or a related field, experience with big data technologies, familiarity with database automation tools, and knowledge of data governance and compliance standards. Strong analytical, problem-solving, and communication skills are key requirements for this role. If you thrive in a collaborative, fast-paced environment and have 5-9 years of relevant experience, we would like to hear from you. This is a full-time position located in Visakhapatnam. If you meet the requirements and are ready to take on this challenging role, we encourage you to apply for Job ID 1007.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As an experienced PostgreSQL Developer at SolvEdge, you will play a crucial role in designing, developing, and optimizing PostgreSQL databases to support high-performance applications in the healthcare sector. Your expertise in database development and optimization will contribute to delivering reliable and scalable data solutions. Collaborating with architects, backend engineers, and business analysts, you will ensure seamless data access through APIs and services, implement ETL processes, and translate business requirements into database logic.

Your responsibilities will include designing and implementing efficient PostgreSQL schemas, indexes, constraints, and relationships; developing advanced SQL queries, stored procedures, views, and triggers; and optimizing complex queries for scalability and speed. You will create and maintain logical and physical data models based on business requirements, define data consistency standards, and implement validation rules to ensure data accuracy. Additionally, you will utilize tools for database versioning, automate database deployments within CI/CD pipelines, and monitor emerging PostgreSQL features for continuous improvement.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Engineering, or a related technical field, and possess 4-6 years of professional experience in PostgreSQL database development. Experience in Agile/Scrum environments and exposure to microservices and cloud-native applications are advantageous. Your primary skills should include strong proficiency in PostgreSQL and advanced SQL; expertise in performance tuning, schema design, data integration, and JSON/JSONB; and ORMs like Sequelize, Hibernate, or SQLAlchemy. Secondary skills encompass working with cloud-based PostgreSQL, RESTful APIs, NoSQL alternatives, CI/CD, DevOps, analytical skills, and effective communication.
At SolvEdge, we value diversity and encourage passionate individuals with diverse perspectives to apply, as we believe in fostering growth within our organization. If you are ready to make a difference in healthcare technology projects and empower healthcare professionals with advanced tools and insights, submit your resume and a cover letter highlighting your qualifications and relevant experience. We look forward to hearing from you and welcoming you to our team dedicated to pioneering the future of digital healthcare. SolvEdge is an equal opportunity employer committed to creating an inclusive environment for all employees.
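The triggers and views the role calls for follow a common pattern: a trigger appends to an audit table on every change, and a view exposes a restricted projection of the base table. A rough sketch using SQLite (bundled with Python) as a stand-in; PostgreSQL's PL/pgSQL trigger syntax differs, and the table names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT);
CREATE TABLE audit_log (patient_id INTEGER, action TEXT);

-- audit every change, the way a PostgreSQL AFTER UPDATE trigger would
CREATE TRIGGER trg_patient_audit AFTER UPDATE ON patients
BEGIN
    INSERT INTO audit_log VALUES (NEW.id, 'update');
END;

-- a narrow projection of the base table
CREATE VIEW v_patient_names AS SELECT id, name FROM patients;
""")
conn.execute("INSERT INTO patients VALUES (1, 'Asha', NULL)")
conn.execute("UPDATE patients SET name = 'Asha K' WHERE id = 1")
audits = conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]
name = conn.execute("SELECT name FROM v_patient_names WHERE id = 1").fetchone()[0]
print(audits, name)
```

In a healthcare context, trigger-driven audit tables like this are also a compliance building block, since changes are recorded without trusting application code to remember to log them.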
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
kannur, kerala
On-site
We are seeking a highly skilled and experienced Senior PHP Laravel Developer to join a dynamic team in the pivotal role of developing and maintaining a cutting-edge, SaaS-based product. You should be passionate about building scalable, secure, and efficient applications that leverage the full potential of AWS.

Your responsibilities will include developing and maintaining the SaaS-based product on the PHP Laravel framework, designing and implementing scalable and secure applications on AWS, and ensuring optimal performance and reliability. You will also be tasked with developing and maintaining automated testing procedures to ensure the quality and functionality of the application, applying best practices in code development for clean, maintainable, and efficient code, and collaborating with cross-functional teams to analyze requirements, design solutions, and troubleshoot issues.

To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, with a minimum of 2 years of development experience focusing on the Laravel framework. Your skill set should include hands-on experience with schema design, query optimization, and REST APIs, along with profound knowledge of AWS services such as EC2, S3, RDS, Lambda, and Redis. Additionally, you should have demonstrated experience in designing scalable and secure web applications, expertise in automated testing frameworks, a strong understanding of web security practices, and proficiency in code versioning tools like Git. Excellent problem-solving skills, the ability to work in a fast-paced environment, strong communication and teamwork skills, and exceptional documentation skills are also required.

In return, we offer a competitive salary and benefits package, a dynamic and supportive work environment, opportunities for professional growth and development, and the chance to work on an innovative product with a talented team.
If you are passionate about developing high-quality, secure, and scalable applications and are seeking a challenging opportunity for professional growth, we encourage you to apply by submitting your resume and a cover letter to our HR department. We are an equal-opportunity employer that values diversity and does not discriminate based on disability status. This is a full-time position with an in-person work location.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role of SQL DBA requires a candidate with 7-10 years of experience in SQL Server and advanced knowledge of object-oriented programming (OOP), schema design, index strategies, performance tuning, and data modeling. As a Senior Database Developer, you will be responsible for designing, developing, and optimizing complex SQL Server databases. This includes creating schema designs, implementing index strategies, tuning performance, and modeling data effectively.

The ideal candidate should possess a strong command of T-SQL, query optimization, and transaction management. You will be expected to author, review, and maintain complex stored procedures, functions, triggers, and views for critical applications. Additionally, you will collaborate closely with backend development teams to ensure seamless integration between application and database logic.

Your responsibilities will also include implementing database best practices for security, integrity, backup/recovery, and high availability. You should have a proven track record in database schema design, normalization/denormalization, indexing, and enforcing data integrity, and you must be adept at troubleshooting and optimizing slow-performing queries and processes in high-volume, transactional environments.

The successful candidate will have practical experience integrating with backend applications using Java or .NET (C#, ASP.NET). Exposure to ETL processes and data migration, preferably using SSIS or equivalent tools, is desirable. Proficiency in source control (Git, TFS) and Software Development Life Cycle (SDLC) best practices is essential, as are excellent analytical, problem-solving, and troubleshooting skills.

If you have a passion for database development, a strong foundation in SQL Server, and the ability to work collaboratively with backend development teams, we encourage you to apply for this challenging and rewarding position.
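To illustrate the index-strategy and query-tuning work this role centers on, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for SQL Server (the table, column, and index names are hypothetical):

```python
import sqlite3

# In-memory database standing in for a SQL Server instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1_000)],
)

query = "SELECT total FROM orders WHERE customer_id = 42"

# Without an index, the planner falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filter column changes the plan to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[-1][-1])  # e.g. a SCAN of the whole table
print(after[-1][-1])   # e.g. a SEARCH using idx_orders_customer
```

On SQL Server itself the same loop runs through execution plans, DMVs, or Extended Events rather than EXPLAIN QUERY PLAN, but the workflow is identical: measure the plan, adjust an index, and confirm the plan changed.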
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Data Architect at Diageo, you will play a crucial role in transforming our business capabilities through data and digital technology. You will be responsible for analyzing the overall IT landscape and various technologies to ensure seamless integration. Your expertise in data modeling, schema design, and data architectures will be essential in driving our enterprise data management, data warehouse, and business intelligence initiatives.

You will review data models for completeness, quality, and adherence to established architecture standards. Your strong capabilities in comparing and recommending tools and technologies will be instrumental in enhancing our data management processes, and your proficiency in metadata maintenance and data catalog management will contribute to the overall efficiency of our data systems.

Preferred qualifications for this role include experience with Databricks Lakehouse architecture, expertise in working with file formats such as Parquet, ORC, Avro, Delta, and Hudi, and exposure to CI/CD tools like Azure DevOps. Knowledge of and experience with Azure data offerings will help you leverage our data resources effectively.

If you are passionate about leveraging data and technology to drive business growth and innovation, and you thrive in a dynamic and collaborative environment, we invite you to join our team at Diageo. Your contributions will play a key role in shaping the future of our digital and technology initiatives.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Senior Power BI Engineer at Ford Motor Company, you will play a vital role in the modernization and transformation of the PLM landscape by migrating legacy datacenter-based BI solutions to Power BI. Your responsibilities will include designing, developing, deploying, and maintaining BI applications to ensure seamless integration and optimal performance.

With over 10 years of total experience and at least 5 years of hands-on experience with WebFOCUS 8.* reporting applications, you will utilize your expertise to create reports, dashboards, and visualizations in WebFOCUS 8.*. Your deep knowledge of data modeling, schema design, and BigQuery optimization will be crucial for building and maintaining scalable, secure, and reliable BI solutions. Collaboration with architects, engineers, and operations teams is essential to deliver high-quality cloud-based solutions. Proficiency in JavaScript, HTML, CSS, and Google Cloud Platform data engineering services is required for customizing WebFOCUS reports and ensuring optimal performance. Your analytical and problem-solving skills will be put to use for performance tuning, query optimization, and troubleshooting.

Preferred qualifications include familiarity with Google Cloud data engineering services, ETL tools, and data pipelines, as well as experience with automation, scripting, and other BI tools such as Power BI or Tableau.

Your role will involve migrating WebFOCUS 8.* reports to WebFOCUS 9.* and designing, developing, and optimizing WebFOCUS 9.* reports, dashboards, and analytics solutions. Working closely with business stakeholders, you will gather requirements and deliver actionable insights through data visualization. You will develop efficient and scalable report queries, optimized for performance against backends such as BigQuery, PostgreSQL, and SQL Server.
You will also integrate WebFOCUS with various data sources, databases, and cloud platforms while ensuring data integrity, governance, and security. Troubleshooting and resolving issues related to report performance, data accuracy, and user accessibility are key tasks. Collaboration with BI developers, data engineers, and analysts will enhance data reporting capabilities. You will be expected to document solutions, provide training and support to end users and junior developers, and stay up to date with the latest WebFOCUS and BI industry trends.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
About At Dawn Technologies

At At Dawn Technologies, we are a niche company dedicated to providing specialized tech solutions in Gen AI, Data Engineering, and Backend Systems. We stand out by focusing on delivering innovative solutions that prioritize flexibility, independence, and real technological depth. Our core belief is that exceptional results are achieved through the combination of teamwork and talent. We are seeking individuals who are great problem-solvers and enjoy collaborating within a team to deliver significant outcomes.

Job Overview

As a Senior Solution Architect specialized in Database Technology at At Dawn Technologies, you will play a crucial role in architecting, designing, and implementing complex database solutions for our enterprise clients. You will utilize your expertise in both relational and non-relational database systems to develop scalable, secure, and high-performance solutions that align with our clients' business requirements.

Responsibilities

Solution Architecture & Design:
- Lead the design and implementation of database solutions, focusing on scalability, performance, availability, and security.
- Offer expert guidance on selecting and integrating database technologies based on client needs.
- Design and deliver reliable, high-performance, and secure database architectures tailored to specific business use cases.
- Develop detailed system architecture blueprints, including data models, database schemas, and system integrations.

Consultation & Collaboration:
- Collaborate with business stakeholders, technical teams, and project managers to translate business requirements into technical solutions.
- Provide consulting services to clients on database management best practices, optimization, and scaling.
- Support pre-sales activities through technical presentations, scoping, and project estimation.

Technology Leadership:
- Stay updated on emerging database technologies and trends, recommending innovative solutions.
- Mentor junior architects and developers, promoting the adoption of best practices in database design and implementation.
- Lead troubleshooting and performance optimization efforts to ensure database systems meet defined service-level agreements (SLAs).

Project & Vendor Management:
- Oversee the successful implementation of database solutions across projects, aligning with architecture principles, timelines, and budgets.
- Manage relationships with third-party database vendors and tools to ensure effective integration into the architecture.
- Contribute to the development and management of the project roadmap for timely and cost-effective solution delivery.

Security & Compliance:
- Ensure database solutions adhere to industry standards and compliance regulations.
- Implement data protection measures, including encryption, backup strategies, and disaster recovery planning.
- Conduct regular reviews of database security and performance.

Required Skills & Experience

Technical Expertise:
- Proficiency in relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB).
- Strong understanding of cloud-based database platforms (AWS RDS, Azure SQL Database, Google Cloud SQL).
- Expertise in database design, optimization, and scaling for transactional and analytical workloads.
- Familiarity with data warehousing, ETL processes, and data integration technologies.

Architecture & Solution Design:
- Proven experience in architecting large-scale, high-performance database systems.
- Knowledge of data modeling, schema design, and database performance tuning.
- Experience with cloud-native architectures and distributed database systems.

Tools & Technologies:
- Proficiency in database automation, monitoring, and management tools (e.g., Liquibase, Flyway, DBeaver, Prometheus).
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with DevOps pipelines for database deployments and CI/CD practices.

Leadership & Communication:
- Strong leadership and mentoring skills to guide cross-functional teams.
- Excellent communication and presentation abilities to convey technical concepts to diverse stakeholders.
- Capacity to manage multiple projects concurrently and effectively prioritize tasks.

Certifications:
- Oracle Certified Architect or relevant database certifications.
- Cloud certifications (AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect Expert).

Benefits & Perks For Working At At Dawn
- Performance-based compensation.
- Remote-first work environment.
- Comprehensive healthcare benefits.
- Generous paid time off and flexible leave policies.
- Access to learning & development resources.
- Culture centered on innovation, excellence, and growth.

At At Dawn Technologies, we foster a workplace culture that values innovation, ownership, and excellence. We are committed to being an equal-opportunity employer that promotes diversity and inclusion. Join us in redefining the future!
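The migration tooling mentioned above (Liquibase, Flyway) revolves around one idea: versioned, run-once schema changes tracked in a history table. A toy sketch of that idea in Python, with SQLite standing in for a production engine and hypothetical table names:

```python
import sqlite3

# Ordered, append-only list of schema changes; in Flyway these
# would live in versioned migration files such as V1__*.sql.
MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply any migration whose version is not yet recorded."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)")
    applied = {v for (v,) in conn.execute("SELECT version FROM schema_version")}
    for version, ddl in MIGRATIONS:
        if version not in applied:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # safe to re-run: already-applied versions are skipped
```

The history table is what makes deployments repeatable across environments: every database can be brought to the same schema version by running the same migration set.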
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Kochi, Kerala
On-site
You will be joining a global Product Engineering Services and Digital Transformation Services company; the role requires at least 5 years of experience. Your responsibilities will include working with custom metadata, custom settings, and schema design to improve application scalability. It will be essential to ensure application reliability by writing unit tests with Apex test classes to maintain high code coverage, and to follow secure coding best practices such as enforcing data protection, field-level security, and platform encryption.

Collaboration will be a key aspect of your role, as you will work closely with cross-functional teams including administrators, business analysts, and front-end developers. You will use Git for version control and manage CI/CD pipelines through Salesforce DevOps tools like Gearset, Copado, or Azure DevOps. Your problem-solving skills will be put to the test as you demonstrate strong debugging abilities using Salesforce Debug Logs and the Apex Replay Debugger. Additionally, effective communication of technical solutions to both technical and non-technical stakeholders will be crucial for success in this role.
Posted 2 weeks ago
5.0 - 9.0 years
18 - 25 Lacs
Bengaluru
Hybrid
Role & Responsibilities
- Develop and maintain scalable data pipelines to support continuing increases in data volume and complexity.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Write unit/integration tests, contribute to the engineering wiki, and document work.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Work closely with a team of front-end and backend engineers, product managers, and analysts.
- Define company data assets (data models) and the Spark, Spark SQL, and Hive SQL jobs that populate them.
- Design data integrations and a data quality framework.

Required Skills & Qualifications
- BS or MS degree in Computer Science or a related technical field
- 5+ years of Python or Java development experience
- 5+ years of SQL experience (NoSQL experience is a plus)
- 5+ years of experience with schema design and dimensional data modeling
- 5+ years of experience with big data technologies like Spark and Hive
- 2+ years of data engineering experience on Google Cloud Platform services such as BigQuery
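As a sketch of the data-quality monitoring described above, here is a minimal, pure-Python batch check; the rule names and record fields are made up for illustration:

```python
def check_quality(rows, required=("order_id", "amount")):
    """Return a per-rule count of violations for a batch of records."""
    report = {"missing_field": 0, "negative_amount": 0, "rows_checked": 0}
    for row in rows:
        report["rows_checked"] += 1
        if any(row.get(field) is None for field in required):
            report["missing_field"] += 1
        elif row["amount"] < 0:
            report["negative_amount"] += 1
    return report

batch = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": -5.00},    # violation: negative amount
    {"order_id": None, "amount": 3.50},  # violation: missing key field
]
report = check_quality(batch)
```

In a real pipeline, a report like this would run on every batch and feed an alerting system, so stakeholders hear about bad data before downstream jobs consume it.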
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have hands-on experience working with columnar databases (e.g., ClickHouse), relational databases (e.g., MySQL, PostgreSQL), and document databases (e.g., MongoDB). Proven expertise in setting up, configuring, scaling, and maintaining ClickHouse and MongoDB clusters is essential, as is a strong background in schema design and development customized for specific use cases to ensure optimal performance and scalability. You must be able to optimize database performance through query tuning, indexing strategies, and resource management. Familiarity with backup, disaster recovery, and monitoring best practices for maintaining high-availability database environments is also crucial.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Chandigarh
On-site
The ideal candidate should have strong proficiency in Python, with experience in Pandas, and a minimum of 2-3 years of practical experience with data projects. You should be well versed in SQL and modern database technologies, proficient with version control systems and statistical methods, and able to apply machine learning concepts in practice, especially in NLP. Experience with API development and integration, as well as expertise in the Azure cloud platform (AWS/GCP knowledge is a plus), is essential.

You should also be familiar with data visualization tools like matplotlib and seaborn, data pipeline development, ETL processes, and data warehouse design and implementation. Proficiency in building and maintaining data pipelines, data modeling, and schema design, along with experience with ETL tools like Azure Data Factory, is necessary. Knowledge of data lake architectures, data cleaning and preprocessing skills, and big data technologies such as Spark and Databricks is a plus.

In addition, you should be able to optimize database queries and data processes, have experience with real-time data processing, and bring strong analytical skills focused on data quality and validation. Experience with prompt engineering, LLM application development, and AI model deployment and integration, together with knowledge of responsible AI practices, is desired.

A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is required, along with evidence of continuous learning such as certifications and online courses. Previous healthcare data experience and Azure certifications are highly valued. You must grasp data privacy and security principles, understand GDPR fundamentals and data protection principles, and know how to handle sensitive personal data.
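The pipeline and ETL skills listed above can be illustrated with a minimal standard-library extract-transform-load sketch (the schema and data are hypothetical; a production pipeline would typically use Azure Data Factory, Spark, or similar):

```python
import csv
import io
import sqlite3

# "Extract": in practice this would be a file, API, or message queue;
# here, a small inline CSV keeps the sketch self-contained.
raw = io.StringIO("name,score\nAlice,90\nBob,\nCara,85\n")

# "Transform": parse, drop rows with missing scores, cast types.
rows = [
    (r["name"], int(r["score"]))
    for r in csv.DictReader(raw)
    if r["score"]  # data-cleaning step: skip records with empty scores
]

# "Load": write the cleaned records into a warehouse table
# (SQLite standing in for the target database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)
conn.commit()
```

The same three stages scale up directly: swap the inline CSV for a data lake source, the list comprehension for Pandas or Spark transformations, and SQLite for the warehouse.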
Strong English communication skills, experience with remote work tools and asynchronous communication, and the ability to work across time zones are necessary; you should be a self-motivated, independent worker with strong documentation habits. Collaboration and communication skills are essential, including experience working with development teams, code review, technical specification writing, proactive communication, a problem-solving mindset, and the ability to explain technical concepts to non-technical stakeholders. Experience with project management tools like Jira and Trello is beneficial.

This is a full-time position with benefits including Provident Fund. The work location is in person, and the expected start date is 01/03/2025.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Database Design and Implementation specialist, you will be responsible for designing and developing database structures that align with specific business requirements. Your focus will be on ensuring data integrity and consistency through proper schema design and constraints. Additionally, you will optimize database performance to enhance speed and efficiency.

In database maintenance and management, you will monitor database performance to identify areas for improvement. Your duties will include implementing and managing database backups and recovery procedures, as well as maintaining data security through access controls, encryption, and other security measures. You will also update and patch database software and systems.

Troubleshooting and problem solving will be a key part of your role: you will identify and resolve database performance issues and errors, investigate and address data corruption or integrity problems, and provide technical support to end users and developers on database-related issues.

Other responsibilities include creating and maintaining database documentation, training others in database management techniques, and collaborating with developers and IT staff to ensure that database needs are effectively met.

This is a full-time, permanent position with benefits such as food provision, health insurance, and Provident Fund. The work schedule is during the day, and additional benefits include performance and yearly bonuses. Please note that the work location for this position is in person.
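The backup and recovery duties above can be illustrated with Python's sqlite3 online-backup API, a toy stand-in for a production RDBMS backup tool:

```python
import sqlite3

# Live database with some data worth protecting.
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
live.execute("INSERT INTO accounts (balance) VALUES (100.0)")
live.commit()

# Online backup: copies the database while it stays available to clients.
backup = sqlite3.connect(":memory:")
live.backup(backup)

# Simulated failure on the live side...
live.execute("DROP TABLE accounts")

# ...recovery reads from the backup copy.
restored = backup.execute("SELECT balance FROM accounts").fetchone()[0]
```

A real backup strategy adds what this sketch omits: scheduled full and incremental backups, off-site copies, and regular restore drills to prove the recovery path actually works.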
Posted 2 weeks ago
2.0 - 7.0 years
0 - 0 Lacs
Bangalore, Noida, Chennai
On-site
Adobe AEP Developer - Adobe AEP Implementation

Key Responsibilities
- Work with the development, deployment, and production areas of the business to craft new solutions using Adobe Experience Platform, Adobe Analytics, Audience Manager, and Data Workbench.
- Interface directly with internal teams to address and manage client requests and communicate status in person, via phone, and/or email.
- Understand, customize, and optimize the entire customer journey and data management process, including data management solutions for ingestion of customer data from 1st-, 2nd-, and 3rd-party data sources.
- Develop reports, dashboards, and metrics analyses to deliver actionable business intelligence to marketing groups.
- Document the solution and train client staff and other staff to perform the ongoing runtime analytics.

Key Skills and Experience
- Bachelor's degree in Information Systems, Computer Science, or Data Science, or prior experience with analytics development systems
- Hands-on experience with Adobe Experience Platform (AEP)
- Familiarity and/or experience with Adobe Analytics, Audience Manager, and Journey Optimizer
- Adobe Experience Platform Developer or Architect certification preferred
- Detail-oriented and organized; able to balance a changing workload
- Excellent verbal communication and comprehension skills
- Ability to quickly master the tracking software of third-party vendor applications from various analytics and advertising technologies such as Chartbeat, Optimizely, Krux, comScore, Nielsen, Facebook, and others
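The multi-source ingestion described above boils down to normalizing differently shaped records into one canonical schema. A minimal sketch, with entirely hypothetical source names and field mappings (real AEP ingestion would go through XDM schemas and AEP's own ingestion APIs):

```python
# Hypothetical field mappings from three data sources into one profile schema.
FIELD_MAP = {
    "crm":     {"email_address": "email", "full_name": "name"},      # 1st-party CRM
    "partner": {"contact_email": "email", "customer_name": "name"},  # 2nd-party feed
    "vendor":  {"em": "email", "nm": "name"},                        # 3rd-party data
}

def normalize(source, record):
    """Rename source-specific fields to the canonical profile schema."""
    mapping = FIELD_MAP[source]
    return {mapping.get(key, key): value for key, value in record.items()}

profile = normalize("partner", {"contact_email": "a@b.com", "customer_name": "Ann"})
```

Once every source lands in the same shape, downstream audience segmentation and reporting can treat all customer records uniformly.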
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
You should have a Bachelor's degree or equivalent practical experience, along with at least 5 years of experience in system architecture and in reading and comprehending code in a programming language such as Java, C++, or Python. Additionally, you should have 5 years of experience troubleshooting technical issues for internal/external partners or customers, as well as expertise in data analysis and SQL/MySQL. It is essential to have 5 years of experience in analytics and analytics development, including working with relational databases and business intelligence tools like Looker, Tableau, or Power BI.

In terms of preferred qualifications, a Master's degree or equivalent practical experience is desired. You should have a minimum of 10 years of experience creating dashboards, manipulating datasets, and performing data analysis using SQL and data visualization tools such as Looker, Tableau, Power BI, or Data Studio. Experience working with product adoption, business, or media consumption data, and familiarity with Python-based APIs and utilities, is preferred. Moreover, expertise in using programming languages like Python, Java, or C++ to deliver or manage data pipelines is an advantage, as is experience converting business requests into technical requirements for insightful data products and deploying those products effectively.

As part of the YouTube Analytics and Data Science team, you will play a crucial role in driving robust business decisions through analysis, data democratization, and actionable insights. You will collaborate with various teams to develop scalable data products and deliver valuable insights for partner engagement, creator and consumer support, and go-to-market strategies across YouTube's relationships with creators, brands, media companies, and distributors.
Your responsibilities will include leading the end-to-end development and deployment of centralized reports, dashboards, models, and tools to analyze and present data related to partner management and creator and consumer support. You will work closely with different teams to identify key business questions and process pain points where data products can improve YouTube's outcomes. By translating ambiguous business requests into technical requirements, you will contribute to a prioritized roadmap and collaborate with Data Infrastructure and Engineering teams to release product features that empower stakeholders to improve support outcomes.

Overall, you will deliver data products to Partner Management and User Support teams to drive better decision-making. Your role will involve creating key performance indicators and key results for the YouTube Business Organization and providing strategic and operational recommendations to achieve organizational goals.
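The KPI work described above can be sketched in a few lines of standard-library Python; the ticket fields and metric definitions here are hypothetical, chosen only to show the shape of a support-outcome metric:

```python
from statistics import mean

# Hypothetical support-ticket records feeding a partner-support dashboard.
tickets = [
    {"partner": "A", "resolved": True,  "hours_to_resolve": 4.0},
    {"partner": "A", "resolved": False, "hours_to_resolve": None},
    {"partner": "B", "resolved": True,  "hours_to_resolve": 10.0},
]

def kpis(records):
    """Compute two example KPIs over a batch of ticket records."""
    resolved = [t for t in records if t["resolved"]]
    return {
        "resolution_rate": len(resolved) / len(records),
        "avg_hours_to_resolve": mean(t["hours_to_resolve"] for t in resolved),
    }

summary = kpis(tickets)
```

In practice these aggregations would run as SQL against a warehouse and surface in Looker or Tableau, but pinning down the metric definitions precisely, as the function above forces you to do, is the core of the KPI-design task.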
Posted 3 weeks ago