
5 MySQL Workbench Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

Salary not disclosed

Pune, Maharashtra

On-site

As an experienced Oracle DBA with a strong background in Exadata and RAC, you will leverage your expertise to optimize database performance and implement backup and recovery strategies. Your problem-solving skills will be key as you collaborate effectively within a team environment. A Bachelor's degree in Computer Science, Information Technology, or a related field will be beneficial for this role, as will the ability to communicate technical concepts to non-technical stakeholders.

With at least 8 years of experience in Oracle DBA roles and 3+ years in MySQL DBA roles, you will have a proven track record in OCI and Exadata Cloud Service migrations. Your understanding of RPO/RTO, disaster recovery planning, and replication strategies will be crucial. Experience with monitoring tools such as OEM, MySQL Workbench, and Datadog is required, along with strong troubleshooting, documentation, and communication skills.

Your responsibilities will include leading the migration of Oracle databases to Exadata Cloud Service on OCI, designing low-downtime migration strategies, and implementing performance tuning and issue resolution techniques. You will manage Oracle RAC, ASM, backups, patching, cloning, and DR environments while collaborating with SRE/DevOps/Application teams for production support. You will also administer MySQL databases, set up and monitor replication and HA solutions, and ensure query performance tuning and backup/restore strategies.

At GlobalLogic, you will experience a culture that prioritizes caring, learning, and development. You will have the opportunity to work on interesting and meaningful projects, grow personally and professionally, and achieve a balance between work and life. As part of a high-trust organization, you will join a safe, reliable, and ethical global company that values integrity and trust in everything it does. GlobalLogic, a Hitachi Group Company, is a digital engineering partner that collaborates with some of the world's largest companies to create innovative digital products and experiences. Join us in transforming businesses and industries through intelligent solutions and services.
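The MySQL portion of this role calls for setting up and monitoring replication. As a hedged illustration of one such routine check (not taken from the posting itself), the sketch below queries replica status from Python. It assumes MySQL 8.0.22+ for the SHOW REPLICA STATUS syntax and the mysql-connector-python package; the host and credentials are placeholders.

# Minimal replication health-check sketch; assumes MySQL 8.0.22+ and
# mysql-connector-python. Host and credentials are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="replica.example.com",  # hypothetical replica host
    user="monitor",
    password="secret",
)
cur = conn.cursor(dictionary=True)
cur.execute("SHOW REPLICA STATUS")
status = cur.fetchone()

if status is None:
    print("This server is not configured as a replica.")
else:
    print("IO thread running: ", status["Replica_IO_Running"])
    print("SQL thread running:", status["Replica_SQL_Running"])
    print("Seconds behind source:", status["Seconds_Behind_Source"])

cur.close()
conn.close()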

Posted 2 days ago

Apply

3.0 - 7.0 years

Salary not disclosed

Pune, Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

What we look for:
- Good knowledge of data structures and algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience with data warehousing concepts, including star schema, snowflake, and data vault designs for data marts or data warehouses.
- Experience using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server.
- Hands-on knowledge of and experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools).
- Experience with the Software Development Lifecycle using Agile methodology; knowledge of Agile methods (SAFe, Scrum, Kanban) and tools (Jira, Confluence).
- Expertise in conceptual modeling, with the ability to see the big picture and envision possible solutions.
- Experience working in a challenging, fast-paced environment, with excellent communication and stakeholder management skills.

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini: you are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band, and take part in internal sports events, yoga challenges, or marathons.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.
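Since star-schema modeling appears throughout the skill list, here is a minimal, hedged sketch of what such a model looks like in practice: one fact table keyed to two dimension tables. All table and column names are invented for illustration, and Python's built-in sqlite3 module stands in for the enterprise databases named above.

# Star-schema sketch: a sales fact table referencing two dimensions.
# Names are illustrative only; uses Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT,
    year      INTEGER,
    month     INTEGER
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
""")
conn.close()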

Posted 1 week ago

Apply

5.0 - 9.0 years

Salary not disclosed

Pune, Maharashtra

On-site

As a Lead Power BI Engineer, you will work primarily with Power BI and Qlik Sense, along with other business intelligence tools such as Tableau, MicroStrategy, Alteryx, or similar visualization technologies, to present complex data in an easily digestible format. Your role will draw on your experience in data analysis and in writing SQL queries (e.g., in MySQL Workbench) using joins, unions, and analytical functions. You will perform data management, mining, and manipulation tasks, using SQL for data manipulation.

Your responsibilities will include designing, developing, and maintaining user-friendly data visualizations and dashboards built from complex datasets drawn from various sources. Collaboration with stakeholders to understand requirements and deliver comprehensive BI solutions will be a key aspect of your role. You will connect Power BI to diverse data sources such as Databricks, Oracle, and SharePoint, and design data models to facilitate complex data analysis. Using DAX and Power Query for advanced data analysis, and creating calculated columns, measures, and tables, will be part of your routine tasks. You will develop Power BI models for reports and dashboards, create and execute manual or automated test cases and analyze the results, migrate dashboards from Qlik Sense to Power BI, and work closely with fellow developers and project leads to coordinate tasks and deliverables.

Your expertise should include the ability to work with multi-relational database structures, with a strong background in the Oracle database environment and in SQL and PL/SQL programming. Designing and troubleshooting complex queries, stored procedures, functions, views, indexes, constraints, unions, joins, and similar development tasks will be part of your daily activities. You will also contribute to architecture, database design, and query optimization, and review the database architecture of the product to achieve the required technical outcome. Experience with data lake systems using AWS cloud technologies, particularly Databricks, will be advantageous, as will extensive data modeling and metadata experience: designing logical models and data dictionaries from business requirements and reverse engineering. Hands-on experience contributing to data/application architecture and design, software/enterprise integration design patterns, and full-stack knowledge, including modern distributed front-end and back-end technology stacks, will be beneficial.

Finally, you will refine the product based on feedback, ensure continued accuracy and functionality, debug applications, trace code, and find and fix bugs to maintain the integrity and performance of the systems you work on.
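As a hedged example of the joins/unions/analytical-function skill this posting asks for: the sketch below picks each region's top sale with a window function. The schema and data are invented, and Python's built-in sqlite3 (which supports window functions from SQLite 3.25, bundled with modern CPython) stands in for MySQL Workbench.

# Analytical-function sketch: ROW_NUMBER() over a partition to pick
# the top sale per region. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
INSERT INTO sales VALUES
    ('North', 'Asha', 120.0), ('North', 'Ravi', 90.0),
    ('South', 'Meena', 200.0), ('South', 'Arun', 150.0);
""")
query = """
SELECT region, rep, amount
FROM (
    SELECT region, rep, amount,
           ROW_NUMBER() OVER (
               PARTITION BY region ORDER BY amount DESC
           ) AS rn
    FROM sales
)
WHERE rn = 1;
"""
for row in conn.execute(query):
    print(row)  # one top-selling rep per region
conn.close()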

Posted 3 weeks ago

Apply

2.0 - 6.0 years

Salary not disclosed

Karnataka

On-site

This posting covers several related roles.

PySpark Data Engineer: a minimum of 2 years of experience in PySpark is required. Strong programming skills in Python, PySpark, and Scala are preferred, along with experience designing and implementing CI/CD, build management, and development strategies. Familiarity with SQL and SQL analytical functions is essential, as is participation in key business, architectural, and technical decisions. Training in AWS cloud technology is available.

Python Developer: a minimum of 2 years of experience in Python/PySpark is required, with the same expectations as above: strong Python, PySpark, and Scala skills; experience designing and implementing CI/CD, build management, and development strategies; familiarity with SQL and SQL analytical functions; participation in key business, architectural, and technical decisions; and potential training in AWS cloud technology.

Senior Software Engineer: over 3 years of experience in Scala with a strong project track record is expected, with hands-on Scala/Spark development and SQL writing skills on RDBMS (DB2) databases. Experience working with file formats such as JSON, Parquet, AVRO, ORC, and XML is preferred, and previous involvement in an HDFS platform development project is necessary, along with proficiency in data analysis, data profiling, and data lineage, and strong oral and written communication skills. Experience on Agile projects is a plus.

Data Modeler: expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling is essential, together with knowledge of data warehousing concepts such as star schema, snowflake, and data vault for data marts or data warehousing. Proficiency with data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models is necessary, as are hands-on knowledge of and experience with tools like PL/SQL, PySpark, Hive, and Impala, experience with the Software Development Lifecycle using Agile methodology, and strong communication and stakeholder management skills.

Across these roles, you will design, develop, and optimize PL/SQL procedures, functions, triggers, and packages; write efficient SQL queries, joins, and subqueries for data retrieval and manipulation; and develop and maintain database objects such as tables, views, indexes, and sequences. Optimizing query performance and troubleshooting database issues to improve efficiency are key responsibilities, as is collaborating with application developers, business analysts, and system architects to understand database requirements. You will ensure data integrity, consistency, and security within Oracle databases, develop ETL processes and scripts for data migration and integration, document database structures, stored procedures, and coding best practices, and stay up to date with Oracle database technologies, best practices, and industry trends.
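Both PySpark-centered roles above pair the DataFrame API with SQL analytical functions. A minimal sketch of how the two meet in PySpark follows; it assumes only that the pyspark package is installed, and the column names and data are invented for illustration.

# PySpark sketch: DataFrame API plus a window (analytical) function.
# Assumes pyspark is installed; data and columns are illustrative.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("sketch").getOrCreate()
df = spark.createDataFrame(
    [("North", "Asha", 120.0), ("North", "Ravi", 90.0),
     ("South", "Meena", 200.0)],
    ["region", "rep", "amount"],
)
# Rank rows within each region by amount, keep the top one.
w = Window.partitionBy("region").orderBy(F.col("amount").desc())
df.withColumn("rn", F.row_number().over(w)).filter("rn = 1").show()
spark.stop()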

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

Notice period: Immediate to 15 days
Profile source: Anywhere in India
Timings: 1:00 pm - 10:00 pm
Work mode: WFO (Mon-Fri)

Job Summary: We are looking for an experienced and highly skilled Senior Data Engineer to lead the design and development of our data infrastructure and pipelines. As a key member of the Data & Analytics team, you will play a pivotal role in scaling our data ecosystem, driving data engineering best practices, and mentoring junior engineers. This role is ideal for someone who thrives on solving complex data challenges and building systems that power business intelligence, analytics, and advanced data products.

Key Responsibilities:
- Design and build robust, scalable, and secure data pipelines; lead the complete lifecycle of ETL/ELT processes, encompassing data intake, transformation, and storage, including SCD Type 2 handling (a minimal sketch follows below).
- Collaborate with data scientists, analysts, backend, and product teams to define data requirements and deliver impactful data solutions.
- Maintain and oversee the data infrastructure, including cloud storage, processing frameworks, and orchestration tools.
- Build logical and physical data models using any data modeling tool.
- Champion data governance practices, focusing on data quality, lineage tracking, and cataloging.
- Ensure data systems adhere to privacy regulations and organizational policies.
- Guide junior engineers, conduct code reviews, and foster knowledge sharing and technical best practices within the team.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of practical experience in a data engineering or comparable role.
- Demonstrated expertise in SQL and Python (or similar languages such as Scala/Java).
- Extensive experience with data pipeline orchestration tools (e.g., Airflow, dbt).
- Proficiency in cloud data platforms: AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
- Familiarity with big data technologies (e.g., Spark, Kafka, Hive) and related data tooling.
- Solid grasp of data warehousing principles, data modeling techniques, and performance tuning, with modeling tools such as Erwin Data Modeler or MySQL Workbench.
- Exceptional problem-solving abilities coupled with a proactive and team-oriented approach.
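As referenced in the responsibilities above, here is a hedged, minimal sketch of the SCD Type 2 pattern: when a tracked attribute changes, the current dimension row is end-dated and a new current row is inserted. Table and column names are invented, and Python's built-in sqlite3 stands in for a real warehouse.

# SCD Type 2 sketch: close the current row, insert the new version.
# Names are illustrative; uses Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', '9999-12-31', 1);
""")

def apply_scd2(conn, customer_id, new_city, load_date):
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row and row[0] != new_city:
        # End-date the current version, then insert the new one.
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, load_date),
        )
        conn.commit()

apply_scd2(conn, 1, 'Chennai', '2024-06-01')
for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
conn.close()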

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies