
7 Data Schemas Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on each employer's job portal.

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Optum is a global organization dedicated to delivering care and utilizing technology to improve the lives of millions of individuals. Your contributions to our team will have a direct impact on health outcomes by facilitating access to care, pharmacy benefits, data, and resources essential for overall well-being. Embrace a culture that values diversity and inclusion, collaborate with skilled colleagues, enjoy comprehensive benefits, and explore opportunities for career growth. Join us in making a difference in the communities we serve and promoting health equity on a global scale. Let's embark on this journey of Caring, Connecting, and Growing together.

As a part of our team, your primary responsibilities will include analyzing business requirements to devise solutions for Clinical & Reporting Platform systems. You will be expected to troubleshoot and resolve production issues by debugging code, examining logs, and conducting reverse engineering to identify root causes, proposing permanent fixes within the specified Service Level Agreement (SLA). Your role will also involve providing support to ensure service availability, stability, and application performance.
To excel in this position, you should possess a graduate degree or equivalent experience along with the following qualifications:

- 4 years of experience with Oracle PL/SQL
- 4 years of experience with SQL
- 4 years of experience with production operational support
- 3 years of experience with Unix scripting
- Proficiency in adapting to a rapidly evolving environment and driving technological innovation to meet business needs
- Familiarity with Agile, Scrum, and Kanban principles
- Strong grasp of software engineering processes, including design patterns, analysis, data schemas and queries, system design, unit testing, code reviews, agile methodologies, and DevOps
- Demonstrated ability to build relationships across diverse teams
- Excellent time management, communication, and presentation skills
- Knowledge of Airflow and TWSd batch jobs
- Willingness to work flexible shifts and participate in on-call duties as part of a rotation
- Capability to oversee code migration through different release stages
- Availability for production deployments/troubleshooting during off-hours and weekends

If you meet these qualifications and are interested in advancing your career with us, please apply through the Internal Employee Application portal. This opportunity is based in Hyderabad, Telangana, IN. Join us at Optum to contribute to the advancement of healthcare and make a positive impact on the well-being of individuals worldwide. Let's work together to create a healthier future for all.
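Production-support work of this kind typically begins with scanning application logs for recurring error codes before digging into root causes. A minimal sketch in Python; the log lines and ORA- error codes are illustrative, not from this posting:

```python
import re
from collections import Counter

def summarize_errors(log_lines):
    """Count occurrences of each Oracle ORA- error code in a batch of log lines."""
    pattern = re.compile(r"ORA-\d{5}")
    return Counter(code for line in log_lines for code in pattern.findall(line))

lines = [
    "2024-05-01 02:14:07 ERROR ORA-01555: snapshot too old",
    "2024-05-01 02:14:09 ERROR ORA-00060: deadlock detected",
    "2024-05-01 02:15:12 ERROR ORA-01555: snapshot too old",
]
# The most frequent code is a natural starting point for root-cause analysis.
print(summarize_errors(lines).most_common(1))
```

A real support rotation would run this kind of summary over rotated log files and feed the counts into alerting, but the counting step itself is this simple.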

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

At the intermediate level, under moderate supervision, you will analyze, design, and implement databases, including access methods, device allocations, validation checks, organization, and security. You will gather requirements and assist in the design of data models, logical and physical databases, data dictionaries, and schemas; assist in systems planning, scheduling, and implementation; and initiate corrective action to stay on schedule. You will also develop and implement data and analytic solution recovery plans and procedures.

Key responsibilities:

- Lead cross-functional teams and collaborate with project stakeholders to capture data and analytic use cases and requirements for projects and/or operational support requests across Wolters Kluwer business units.
- Collaborate and build trust with business stakeholders/SMEs to extract requirements and steer them toward the appropriate solution as necessary.
- Ensure the BI team of ETL developers, report designers, and data analysts has the requirements and tools needed to deliver the correct data solution based on best-practice, security, and quality standards.
- Apply proven communication and problem-solving skills, and knowledge of best practices, to guide the development team on design issues so that solutions align with the business analytic objectives.
- Lead efforts across technology and business teams to share domain knowledge and provide input into solutions as needed; serve as a conduit and liaison between GBS technology teams and business stakeholders, maintaining and sharing awareness of end-to-end processes.
- Work with cross-functional teams to build clear project plans and drive projects/requests to completion, providing clear written and verbal status communication and escalating issues/risks as needed.
- Lead others in solving complex technical problems using advanced analytical thought; ensure Operations teams deliver on enhancement requests, identify root causes and fix defects, and drive process improvements.
Work with project managers, GBS leaders, and business stakeholders to assist in resource allocation for the team based on project and/or operations resource demands.

Education: Bachelor's degree in computer science or a related field (required).

Experience:

- 7-10 years of broad Information Technology experience.
- Experience managing BI teams to deliver ETL, reporting, and/or data analytic solutions, with a focus on Informatica Intelligent Data Management Cloud.
- Proficiency in ETL/data integration and data reporting/analytic processes, technologies, and architectures: dimensional data modeling principles (star and snowflake schemas, de-normalized data structures, slowly changing dimensions, etc.).
- Deep understanding of data warehouse design.
- Solid understanding of RDBMS (Oracle, DB2, MySQL, or SQL Server), including data modeling and datamart design.
- Analyzing business needs, including scheduling meetings, planning agendas, conferring with business line leaders, and delivering presentations to senior management, peer groups, and associates.
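One of the dimensional-modeling principles the posting names, the slowly changing dimension, can be sketched concretely. A minimal Type-2 update in Python; the table layout and field names are hypothetical, not from the posting:

```python
from datetime import date

def scd2_update(dimension, key, new_attrs, today):
    """Apply a Type-2 slowly-changing-dimension change: close out the
    current row for the key, then append a new current row."""
    for row in dimension:
        if row["key"] == key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dimension  # nothing changed; keep history as-is
            row["is_current"] = False   # close out the old version
            row["end_date"] = today
    dimension.append({"key": key, **new_attrs,
                      "start_date": today, "end_date": None,
                      "is_current": True})
    return dimension

customers = [{"key": 42, "city": "Pune", "start_date": date(2020, 1, 1),
              "end_date": None, "is_current": True}]
scd2_update(customers, 42, {"city": "Mumbai"}, date(2024, 6, 1))
```

After the call, the dimension holds the closed-out Pune row with an end date plus a new current Mumbai row, which is exactly the history a Type-2 design preserves.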

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Global Security Data Analyst at Honeywell, you will partner with leadership, stakeholders, and employees to communicate, through critical dashboards, where resources should be allocated to secure Honeywell. Your primary responsibility will involve working across various security teams to mitigate risks using data analysis. Leveraging your expertise in data acumen and user interface design, you will lead the development and maintenance of Honeywell Global Security dashboards. Through your ability to influence with and without authority, you will contribute to the common goal of safeguarding the company.

Your duties will encompass data analysis, mapping, integration, quality assurance, and dashboard development. You will be the key support for our critical dashboards, giving you a unique opportunity to collaborate with multiple security teams and gain a comprehensive view of risks at Honeywell. You will drive a data strategy to ensure accuracy, efficiency, timeliness, and consistency in our data feeds.

Key Responsibilities:

- Utilize analytical and technical skills to translate business requirements into specifications for designing and implementing Honeywell Global Security dashboards.
- Collaborate with data and service owners to determine data needs, and create, maintain, and optimize data pipelines for structured and unstructured data, facilitating its movement to a centralized enterprise data warehouse.
- Work with security service owners to automate dashboards using APIs and streamline manual data processes where feasible.
- Develop data models and schemas to support analytical and reporting requirements.
- Operate independently with minimal supervision, following high-level directives from leadership.
- Document standard processes and troubleshooting procedures.
- Provide end-user support to minimize disruptions and promote system utilization.
- Monitor data extraction, transformation, and loading processes.
Requirements:

- Bachelor's degree in computer science or a related field, or equivalent experience of at least 3 years.
- Minimum of 3 years of proven data management experience, including merging, cleaning, and analyzing large datasets, as well as developing complex dashboards. Proficiency in handling both structured and unstructured data.
- Advanced knowledge of Microsoft Excel.
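Merging and cleaning datasets before they feed a dashboard, as the requirements above describe, can be sketched with the Python standard library alone; the field names and sample data are illustrative, not Honeywell's:

```python
import csv
import io

# Hypothetical security-alert feed and a lookup table of owning teams.
alerts_csv = "host,severity\nweb-01,HIGH\nweb-02,\ndb-01,LOW\n"
owners = {"web-01": "AppSec", "db-01": "DBA"}

cleaned = []
for row in csv.DictReader(io.StringIO(alerts_csv)):
    if not row["severity"]:
        continue                                     # drop records with missing severity
    row["team"] = owners.get(row["host"], "Unassigned")  # merge in ownership data
    cleaned.append(row)
```

The same shape (read, drop incomplete rows, join against a reference table) scales up to the "large datasets" case with a dataframe library, but the logic is unchanged.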

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Data Integration Architect, you will be responsible for designing and developing data integration solutions to support business processes and analytics. You will collaborate with stakeholders to understand data requirements and translate them into technical specifications. Your role will involve implementing data integration strategies, including ETL processes, data pipelines, and APIs, while ensuring data quality, consistency, and security across all integration points.

In this position, you will develop and maintain data models, schemas, and documentation. You will monitor and optimize data integration performance and troubleshoot issues as they arise. Staying up to date with industry trends and emerging technologies in data integration and architecture will also be a key part of your role, and you will provide technical guidance and mentorship to junior team members.

Your educational background should include a B.Tech./B.E. in Computer Science, Information Technology, or Electronics. Certifications such as TOGAF or other relevant Data and Integration Architect certifications will be beneficial.

To excel in this role, you should have over 5 years of experience in data and integration architecture design and implementation. Strong knowledge of data integration tools and technologies such as Informatica, Talend, and MuleSoft is required, as is proficiency in SQL and database management systems such as Oracle, SQL Server, and MySQL. Experience with cloud platforms like AWS, Azure, and Google Cloud, as well as data warehousing solutions, is desired. Excellent communication and presentation skills are a must for effective interaction with stakeholders, and personal and time management skills are crucial for managing project timelines and deliverables. Knowledge of Information Security audits and processes will be an added advantage.
Being a team player is essential for collaborative work within the organization.
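The extract-transform-load pattern at the heart of this role can be sketched generically; the functions below are stand-ins for a source system, a transformation step, and a warehouse sink, not any specific integration tool:

```python
def extract():
    """Stand-in for pulling rows from a source system (API, file, database)."""
    yield from [{"id": 1, "amount": "150.00"}, {"id": 2, "amount": "75.50"}]

def transform(row):
    """Normalize types; real pipelines also validate, enrich, and log."""
    return {"id": row["id"], "amount_cents": int(float(row["amount"]) * 100)}

def load(rows, sink):
    """Stand-in for writing transformed rows to a warehouse table."""
    sink.extend(rows)

warehouse = []
load((transform(r) for r in extract()), warehouse)
```

Keeping the three stages as separate functions, with the transform applied lazily via a generator, is the same separation of concerns that tools like Informatica or Talend enforce at a much larger scale.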

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Bhubaneswar

On-site

As a part of the HAECO team, you will be at the forefront of the aviation industry, contributing to the safe and efficient operation of aircraft, engines, and components globally. We are dedicated to providing sustainable value to all our stakeholders, and we are seeking enthusiastic individuals to join us in upholding the standards of aviation excellence. In your role as Tooling Support Manager (TSM), you will report directly to the Head of Section in Tooling Management. Your primary responsibility will be to provide engineering support to the Maintenance Division, ensuring that tooling issues are addressed promptly and effectively. You will oversee tooling-related projects, ensuring that they are completed within the specified time frame and budget constraints. Your daily responsibilities will include providing technical assistance to the Maintenance Division staff, engaging with customers on technical matters, managing tooling projects to ensure timely completion, and collaborating with aircraft manufacturers, airlines, and tooling OEMs. Additionally, you will play a key role in reviewing, developing, and updating internal tooling policies and instructions to align with regulatory requirements. Your ability to prepare engineering documents in compliance with company and regulatory standards will be crucial to your success in this role. To excel in this position, you should have a minimum of 4 years of experience in aircraft maintenance tooling management or a related field. You must possess strong supervisory skills, excellent interpersonal and communication abilities, and a collaborative approach to teamwork. Sound business acumen and project management skills will be essential in managing tooling-related projects effectively. A tertiary degree in engineering or a relevant discipline is required for this role. 
Additionally, advanced knowledge of computer software, proficiency in Microsoft Excel (including formula and filter usage), and familiarity with Python, SQL databases, and data formats such as XML will be advantageous. Join us at HAECO to embark on a rewarding career journey that delivers sustainable value to the community and our stakeholders. If you do not hear from us within 4-6 weeks of submitting your application, please consider your candidacy unsuccessful. Rest assured that all information shared during the application process will be handled with strict confidentiality and used solely for employment purposes.

Posted 1 month ago

Apply

4.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate, and advance faster than ever.

Responsibilities and Tasks

Understand the business problem and the relevant data:
- Maintain an understanding of company and department strategy
- Translate analysis requirements into data requirements
- Identify and understand the data sources relevant to the business problem
- Develop conceptual models that capture relationships within the data
- Define the data-quality objectives for the solution
- Be a subject matter expert in data sources and reporting options

Architect data management systems:
- Use understanding of the business problem and the nature of the data to select appropriate data management systems (Big Data, OLTP, OLAP, etc.)
- Design and implement optimum data structures in the appropriate data management system (GCP BigQuery, Snowflake, SQL Server, etc.) to satisfy the data requirements
- Plan methods for archiving/deleting information

Develop, automate, and orchestrate an ecosystem of ETL processes for varying volumes of data:
- Identify and select the optimum method of access for each data source (real-time/streaming, batch, delayed, static)
- Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
- Develop processes to efficiently load the transformed data into the data management system

Prepare data to meet analysis requirements:
- Work with data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data)
- Develop and code data extracts
- Follow standard methodologies to ensure data quality and data integrity
- Ensure the data is fit for use in data science applications

Qualifications and Experience:
- 4-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
- Ability to work with multiple operating systems and generic tools
- Experience developing ETL/ELT processes using Apache NiFi and cloud solutions such as GCP BigQuery, Snowflake, or equivalents
- Significant experience with big data processing and/or developing applications and data sources using different cloud services
- Experience integrating with ingestion, scheduling, logging, alerting, and monitoring cloud services
- Understanding of how distributed systems work
- Familiarity with software architecture (data structures, data schemas, etc.)
- Strong working knowledge of databases (cloud databases such as BigQuery, Snowflake, AlloyDB, or equivalents), including SQL and NoSQL
- Strong mathematics background; analytical, problem-solving, and organizational skills
- Strong communication skills (written, verbal, and presentation)
- Experience working in a global, multi-functional environment
- Minimum of 2 years of experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development
- Ability to travel as needed

Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics, or a related field of study; M.S. degree preferred.
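Cleaning and preparing data for analysis, i.e. handling missing values and outliers as the responsibilities above mention, can be sketched briefly. The MAD-based cutoff and sample readings below are illustrative choices, not Micron's method:

```python
import statistics

def clean(values, cutoff=3.5):
    """Drop missing entries, then drop outliers using the modified z-score
    based on the median absolute deviation (MAD), which is robust even
    when the outlier itself inflates the spread."""
    present = [v for v in values if v is not None]
    med = statistics.median(present)
    mad = statistics.median(abs(v - med) for v in present)
    if mad == 0:
        return present          # no spread; nothing to flag
    return [v for v in present if 0.6745 * abs(v - med) / mad <= cutoff]

readings = [10.1, 9.8, None, 10.3, 9.9, 250.0]  # 250.0 looks like a glitch
print(clean(readings))  # → [10.1, 9.8, 10.3, 9.9]
```

A plain mean/standard-deviation z-score would miss the glitch here, because the 250.0 reading inflates the standard deviation enough to mask itself; that is why the median-based variant is sketched instead.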

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Kochi, Kerala

On-site

You should have 8-12 years of experience in a Data Engineer role, with at least 3 years as an Azure data engineer. A bachelor's degree in Computer Science, Information Technology, Engineering, or a related field is required. You must be proficient in Python and SQL and have a deep understanding of PySpark; expertise in Databricks or similar big data solutions is also necessary. Strong knowledge of ETL/ELT frameworks, data structures, and software architecture is expected, along with proven experience in designing and deploying high-performance data processing systems and extensive experience with Azure cloud data platforms.

As a Data Engineer, your responsibilities will include designing, constructing, installing, testing, and maintaining highly scalable and robust data management systems. You will apply data warehousing concepts to design and implement data warehouse tables in line with business requirements. Building complex ETL/ELT processes for large-scale data migration and transformation across platforms and enterprise systems such as Oracle ERP, ERP Fusion, and Salesforce is essential, and you must be able to extract data from various sources such as APIs, JSON files, and databases.

You will use PySpark and Databricks within the Azure ecosystem to manipulate large datasets, improve performance, and enhance the scalability of data operations. Developing and implementing Azure-based data architectures that are consistent across multiple projects, while adhering to best practices and standards, is required. You will lead initiatives for data integrity and normalization within Azure data storage and processing environments, evaluate and optimize Azure-based database systems for performance efficiency, reusability, reliability, and scalability, and troubleshoot complex data-related issues within Azure while providing expert guidance and support to the team.
Ensuring all data processes adhere to governance, data security, and privacy regulations is also a critical part of the role.
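Extracting data from sources such as APIs and JSON, as the role above describes, usually means flattening nested records into rows before loading. A minimal standard-library sketch; the payload shape and field names are hypothetical:

```python
import json

# Hypothetical API response with nested customer records.
payload = json.loads("""
{"orders": [
  {"id": "A-1", "customer": {"name": "Asha", "region": "South"}, "total": 120.5},
  {"id": "A-2", "customer": {"name": "Ravi", "region": "North"}, "total": 80.0}
]}
""")

def flatten(order):
    """Turn one nested order record into a flat row suitable for a warehouse table."""
    return {"order_id": order["id"],
            "customer_name": order["customer"]["name"],
            "region": order["customer"]["region"],
            "total": order["total"]}

rows = [flatten(o) for o in payload["orders"]]
```

In a PySpark pipeline the same flattening is typically expressed with nested-column selects over a DataFrame, but the record-level shape of the transformation is identical.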

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies