3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
Malayaj Solutions is a leading information technology organisation headquartered in Bangalore. Since 2018, we have provided customised technology services to address specific client needs in areas such as development, automation, digital transformation, and managed services. Our skilled professionals are dedicated to delivering productive outcomes and exemplary customer service. Malayaj Solutions is trusted by companies and partners globally, offering innovative solutions across industries like healthcare, education, energy resources, and more.

Role Description
This is a full-time, on-site role located in Noida/Gurgaon for a Junior Database Administrator. The Junior Database Administrator will be responsible for managing and maintaining databases, designing database structures, and troubleshooting database issues. Daily tasks will include setting up database replication, ensuring data integrity and security, and optimising performance. Collaboration with other IT team members to support business applications is also required.

Qualifications
Database administration and database skills
Hands-on experience with PostgreSQL or Oracle DB is a must
Proficiency in database design, including schema design and normalization
Strong troubleshooting skills, specifically for database-related issues
Experience with database replication and ensuring data consistency
Excellent problem-solving and analytical skills
Ability to work collaboratively and maintain effective communication with team members
Bachelor's degree in Computer Science, Information Technology, or a related field is preferred
Experience with specific database management systems (e.g., MySQL, PostgreSQL, SQL Server)

Key Responsibilities:
Install, configure, and maintain MS SQL Server, PostgreSQL, and Oracle databases.
Manage database security, backup & recovery, and high-availability setups.
Proactively monitor database performance, identify bottlenecks, and implement tuning solutions.
Plan and execute database migrations, upgrades, and patch management.
Collaborate with development and infrastructure teams on schema design, query optimisation, and issue resolution.
Ensure compliance with data protection regulations and internal data governance policies.
Develop and maintain database documentation, standards, and procedures.

Must-Have Skills:
3+ years of proven experience as a database administrator.
Strong expertise in PostgreSQL and Oracle DB.
Experience in database clustering, replication, and disaster recovery setups.
Proficiency in SQL scripting, query optimisation, and performance tuning.
Familiarity with cloud-based database services (AWS RDS, Azure SQL, etc.) is a plus.

Good to Have:
Exposure to NoSQL databases like Redis.
Experience working in an Agile/DevOps environment.
Certification in MS SQL/PostgreSQL is an added advantage.

Location: Noida/Gurgaon (preference will be given to candidates open to a hybrid working model).
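Since the posting stresses hands-on replication setup and monitoring, a standby-lag check is a representative day-to-day task. The Python sketch below is illustrative only: psycopg2 and a hypothetical DSN are assumed, and pg_last_xact_replay_timestamp() only returns a value on a hot-standby node.

```python
# Minimal sketch, not from the posting: report apply lag on a PostgreSQL standby.
import psycopg2

DSN = "host=standby.example.internal dbname=postgres user=monitor"  # hypothetical connection details

def replication_lag_seconds(dsn: str) -> float:
    """Return replication apply lag in seconds (0 if the node is a primary)."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT COALESCE(EXTRACT(EPOCH FROM now() - pg_last_xact_replay_timestamp()), 0)"
        )
        (lag,) = cur.fetchone()
        return float(lag)

if __name__ == "__main__":
    print(f"replication lag: {replication_lag_seconds(DSN):.1f}s")
```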
Posted 3 weeks ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - GDS Consulting – GRC Technology – Enterprise GRC – Manager

As an EY GDS Consulting Manager, you’ll manage as well as contribute technically and functionally to GRC Technology client engagements and internal projects. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.

The opportunity
We’re looking for a seasoned platforms engineer with experience in GRC/IRM modules who will be responsible for the evaluation, recommendation and build-out of customer requirements on GRC platforms, using defined best practices for configuration and development and ensuring the scalability and performance of these platforms. This position requires a high level of functional GRC expertise with exposure to and understanding of various GRC platforms (such as Archer and ServiceNow), strategic thinking, and the ability to lead conversations and projects. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of the service offering.

Your Key Responsibilities
Lead requirement gathering and/or review clients' process requirements and how they map to GRC platforms, driving platform expansion and adoption
Scope solution requirements and configure solutions around the platform to meet customer needs and project deliverables
Design solutions that include integration of AI/Gen AI/microservices for document/data/access management, third-party integrations, and cloud environment management and monitoring
Understand solution architecture design patterns and present solution architectures to client CIOs/CTOs
Define and implement IT architecture and data pipelines tailored for risk and GRC technologies
Evaluate and select appropriate technologies and tools for data ingestion and normalization
Consider dependencies, relationships, and integration points to ensure proper solution integration with other systems where applicable
Leverage knowledge and experience to deliver end-to-end automated solutions, including technical implementation of IT Infrastructure Library (ITIL) processes, workflow customization, process automation, report development, dashboard creation, and system configuration
Identify new opportunities to provide additional value to clients and improve business performance
Create/review statements of work to help ensure an appropriate level of effort
Participate in growing and enhancing internal processes for successful client delivery
Stay updated with the latest trends and advancements in IT architecture, data engineering, and GRC technologies
Ensure that all implementation work is carried out in accordance with company policies and industry best practices
Lead business development activities, including written proposals, presales activities, functional demonstrations and presentations
Ensure adherence to the quality processes specified for the project
Develop and maintain productive working relationships with client personnel
Plan and monitor the project deliverables from the team
Mentor the project team in executing the identified projects

Skills And Attributes For Success
Conduct performance reviews and contribute to performance feedback for staff and senior staff
Foster teamwork and a quality culture, and lead by example
Understand and follow workplace policies and procedures
Train and mentor project resources
Participate in organization-wide people initiatives

To qualify for the role, you must have
7+ years of industry experience, with leadership experience leading larger teams
Led or completed at least 2 to 3 engagements/projects in a similar role
Demonstrated ability to map solutions to client business issues and problem statements
Functional knowledge and implementation experience of GRC frameworks, working directly with customers and clients
Experience in IT solution architecture and an understanding of enterprise architecture
Strong understanding of ITIL processes
Exceptional problem-solving capability and ability to think strategically
Excellent interpersonal and communication skills
Experience in strategy, business development, finance and budgeting is desirable
Good understanding of GRC technology platforms, including Archer and ServiceNow

Ideally, you should also have
B.E/B.Tech (Computer Science, IT, Electronics, Electronics & Telecommunications) or MBA with a minimum of 7+ years of experience with other Big 3 or panelled SI/ITeS companies
Robust understanding of program and project management practices
Familiarity with a typical IT systems development life cycle
Knowledge of industry standards and regulations related to risk and compliance
Knowledge and experience of GRC/IRM modules
Experience in GRC roadmap review, vendor comparison and selection is good to have
Exposure to multiple GRC tools like MetricStream, Enablon, etc. would be an added advantage

What We Look For
A team of people with commercial acumen, technical experience, consulting skills, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
0 years
0 Lacs
India
On-site
We are looking to hire a Data or Business Analyst to join our data team. You will take responsibility for managing our master data set, developing reports, and troubleshooting data issues. To do well in this role you need a very fine eye for detail, experience as a data analyst, and a deep understanding of the popular data analysis tools and databases.

Responsibilities:
Managing master data, including creation, updates, and deletion.
Managing users and user roles.
Providing quality assurance of imported data, working with quality assurance analysts if necessary.
Commissioning and decommissioning of data sets.
Processing confidential data and information according to guidelines.
Helping develop reports and analyses.
Managing and designing the reporting environment, including data sources, security, and metadata.
Supporting the data warehouse in identifying and revising reporting requirements.
Supporting initiatives for data integrity and normalization.
Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems.
Generating reports from single or multiple systems.
Troubleshooting the reporting database environment and reports.
Evaluating changes and updates to source production systems.
Training end-users on new reports and dashboards.
Providing technical expertise in data storage structures, data mining, and data cleansing.

Requirements:
Bachelor’s degree from an accredited university or college in computer science.
Work experience as a Data or Business Analyst or in a related field.
Ability to work with stakeholders to assess potential risks.
Ability to analyze existing tools and databases and provide software solution recommendations.
Ability to translate business requirements into nontechnical, lay terms.
High-level experience in methodologies and processes for managing large-scale databases.
Demonstrated experience in handling large data sets and relational databases.
Understanding of addressing and metadata standards.
High-level written and verbal communication skills.
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana
On-site
- 2+ years of data scientist experience
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python) or statistical/mathematical software (e.g., R, SAS, Matlab)
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment

Job Description
Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for capacity planning, transportation and the fulfillment network? If so, then this is the job for you.

Our team is responsible for creating core analytics tech capabilities, platform development and data engineering. We develop scalable analytics applications and research models to optimize operational processes. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, scientists, business intelligence engineers and product managers, using rigorous quantitative approaches to ensure high-quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East.

Amazon is growing rapidly and, because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is on the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon’s worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans. You will also improve the efficiency of capital investment by helping the fulfillment centers to improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools.

Major responsibilities include:
· Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; producing the required data when it is not available.
· Applying statistical and machine learning methods to specific business problems and data.
· Creating global standard metrics across regions and performing benchmark analysis.
· Ensuring data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
· Communicating proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions.
· Collaborating with colleagues from multidisciplinary science, engineering and business backgrounds.
· Developing efficient data querying and modeling infrastructure.
· Managing your own process: prioritizing and executing high-impact projects, triaging external requests, and ensuring projects are delivered on time.
· Utilizing code (Python, R, Scala, etc.) for analyzing data and building statistical models.

Experience in Python, Perl, or another scripting language
Experience in an ML or data scientist role with a large technology company

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
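To make the modeling expectations concrete, here is a minimal, hedged sketch of the kind of statistical-modeling loop referenced above (train/test split, baseline classifier, standard evaluation metrics). It uses scikit-learn with synthetic data; none of the names come from the posting.

```python
# Illustrative sketch only: fit and evaluate a baseline classifier.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Stand-in for an operational dataset (e.g., shipment-level features).
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)

# Precision, recall, and F1 per class on the held-out split.
print(classification_report(y_test, model.predict(X_test)))
```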
Posted 3 weeks ago
0 years
0 Lacs
Haryana, India
On-site
About The Role
We are seeking a talented and detail-oriented Database Developer - Performance Tuning to join our technology team. In this role, you will be responsible for designing, developing, optimizing, and maintaining SQL Server databases with a strong focus on performance tuning and T-SQL programming. You'll work closely with application developers, QA engineers, and system architects to ensure efficient and scalable database solutions. The ideal candidate is someone with a solid foundation in database design and performance optimization, who is also proactive in identifying potential issues and proposing innovative solutions.

Key Responsibilities
Develop, enhance, and maintain database objects including stored procedures, functions, views, triggers, and scripts using T-SQL.
Design and implement scalable database solutions that align with business and technical requirements.
Collaborate with software development teams to integrate efficient database queries into applications.
Analyze and troubleshoot SQL Server performance issues including slow-running queries, blocking, deadlocks, and index fragmentation.
Perform index tuning, query optimization, and execution plan analysis.
Monitor and fine-tune SQL Server environments for high performance and availability.
Design and develop SSRS reports, dashboards, and data visualizations based on business requirements.
Prepare technical documentation including data models, specifications, and functional flow diagrams.
Maintain version control and change management using tools like Team Foundation Server (TFS) or Git.
Perform routine database administration tasks including backups, security, patching, and schema migrations.
Evaluate existing systems and recommend improvements for stability, scalability, and efficiency.
Provide technical support to development and QA teams, and assist in debugging and fixing issues.

Technical Skills
Proficient in Microsoft SQL Server and T-SQL for database development and performance tuning.
Strong understanding of relational database design, normalization, and data integrity principles.
Hands-on experience with SSRS (SQL Server Reporting Services) for report generation and customization.
Experience with Team Foundation Server (TFS) or equivalent version control and issue tracking tools.
Familiarity with cloud-based databases (Azure SQL, AWS RDS) is a plus but not mandatory.
Understanding of application integration techniques and agile software development (Scrum) is desirable.
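As a concrete illustration of the query-tuning work described above, the sketch below pulls the top CPU-consuming statements from SQL Server's sys.dm_exec_query_stats DMV, a common starting point for finding slow queries. It is an illustrative assumption, not part of the posting; pyodbc, an ODBC driver, and a hypothetical connection string are assumed.

```python
# Rough sketch: list the top CPU-consuming cached query plans.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.internal;DATABASE=master;Trusted_Connection=yes;"
)  # hypothetical connection string

QUERY = """
SELECT TOP 10
       qs.total_worker_time / qs.execution_count AS avg_cpu_time,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for row in conn.cursor().execute(QUERY):
        print(row.avg_cpu_time, row.execution_count, row.query_text)
```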
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
As Lead Splunk, Your Role And Responsibilities Would Include
Hands-on experience in the SIEM domain
Deep understanding of Splunk backend operations (UF, HF, SH, and Indexer Cluster) and architecture
Strong knowledge of Log Management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
Expertise in optimizing logs and license usage
Solid understanding of designing, deploying, and implementing scalable SIEM architecture
Understanding of data parsimony as a concept, especially in terms of German data security standards
Working knowledge of integrating Splunk logging infrastructure with third-party observability tools like ELK and DataDog
Experience in identifying security and non-security logs and applying appropriate filters to route the logs correctly
Expertise in understanding network architecture and identifying the components of impact
Proficiency in Linux administration
Experience with Syslog
Proficiency in scripting languages like Python, PowerShell, or Bash for task automation
Expertise with OEM SIEM tools, preferably Splunk
Experience with open-source SIEM/log storage solutions like ELK or Datadog
Strong documentation skills for creating high-level design (HLD), low-level design (LLD), implementation guides, and operation manuals

Skills: SIEM, Linux administration, team collaboration, communication skills, architecture design, Python, parsing, normalization, retention practices, PowerShell, data security, log management, Bash, Splunk, log collection, documentation, syslog, incident response, data analysis
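Because the role centers on log collection and routing into Splunk, a small example of pushing a normalized event to Splunk's HTTP Event Collector (HEC) may help set expectations. This is a hedged sketch only: the endpoint host, token, index, and event fields are placeholders, and the requests library is assumed to be available.

```python
# Minimal sketch: forward one JSON event to a Splunk HEC endpoint.
import requests

HEC_URL = "https://splunk.example.internal:8088/services/collector/event"  # placeholder host
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                          # hypothetical token

def send_event(event: dict, sourcetype: str = "app:json", index: str = "main") -> None:
    """Post one event to the HEC endpoint and raise on HTTP errors."""
    payload = {"event": event, "sourcetype": sourcetype, "index": index}
    resp = requests.post(
        HEC_URL,
        json=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
        verify=False,  # lab-only shortcut; use proper TLS verification in production
    )
    resp.raise_for_status()

if __name__ == "__main__":
    send_event({"action": "login_failed", "user": "svc-test", "src_ip": "10.0.0.5"})
```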
Posted 3 weeks ago
3.0 - 5.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the continuous improvement and optimisation of the managed services process, tools and services.

Role: Data, Analytics & Insights Managed Service - Testing Engineer - Associate
Tower: Testing Managed Service
Experience: 3 - 5.5 years
Key Skills: Data Analytics, SQL, Python, Statistical Analysis, ETL, Reporting
Educational Qualification: Bachelor's degree in computer science/IT or a relevant field
Work Location: Bangalore, India

Job Description
At the Associate level, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Ticket quality and deliverables review, and status reporting for the project.
Adherence to SLAs; experience in incident management, change management and problem management.
Review your work and that of others for quality, accuracy, and relevance.
Know how and when to use the tools available for a given situation and explain the reasons for this choice.
Seek and embrace opportunities which give exposure to different situations, environments, and perspectives.
Be flexible to work in stretch opportunities/assignments.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Be able to read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration; be a good team player.
Take up cross-competency work and contribute to COE activities.
Escalation/risk management.

Position Requirements
Required Skills:
Primary skills: Data Analytics concepts, SQL, ETL concepts, Python, cloud platforms (AWS/Azure), Java core concepts, GitLab
Secondary skills: Reporting, API testing, Salesforce Workbench, Airflow/Prefect

Data Analyst
Should have a minimum of 3 years' hands-on experience building advanced data analytics.
Should have a minimum of 3 years' hands-on experience in ETL testing, with a strong emphasis on accuracy, reliability and data integrity.
Should have a minimum of 3 years' hands-on experience in SQL for data handling, and a minimum of 0-1 years of experience in manual testing.
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption such as analytics modeling and testing data accuracy.
Should have an understanding of core data concepts: data ingestion, transformation, enrichment, quality, governance, lineage and design.
Validating data fields, formats and utility using quality control checklists.
Should have sound knowledge of the Snowflake data warehouse and Salesforce Workbench for data management and reconciliation.
Ability to transform requirements and user stories into comprehensive test cases.
Strong foundation in data modelling principles, including database schemas, normalization and relationships.
Ability to define and develop test strategies, test plans, KPIs, metrics, defect management processes and reporting mechanisms.
Designing test cases and generating test data based on the requirements and project needs.
Familiarity with data quality frameworks and tools to enhance testing efficiency.
In-depth understanding of industry standards and best practices.
Exposure to a data quality platform.
Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have
Certifications in ETL and other BI tools are an added advantage.
Tools knowledge: Snowflake, Qlik, AppFlow, Data Gaps, QuerySurge, iceDQ, Ataccama or equivalent.
Cloud platforms: AWS, Azure.

Soft Skills
Exceptional written and verbal communication skills to articulate technical details effectively.
Strong problem-solving skills with keen attention to detail and precision.
Ability to work autonomously as well as collaborate effectively within diverse teams.
A proactive mindset focused on continuous improvement in quality assurance processes.
Adaptability to manage multiple priorities and excel in dynamic, fast-paced environments.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
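Given the emphasis on ETL testing and data-accuracy validation, a typical automated check is a source-to-target reconciliation. The sketch below is a hedged illustration with made-up table names and connection URLs, assuming SQLAlchemy (plus the relevant database drivers) is available; real engagements would use the project's own connections and test frameworks.

```python
# Illustrative sketch: compare row counts and a column checksum between a
# source table and its loaded target. URLs and table names are placeholders.
from sqlalchemy import create_engine, text

SOURCE_URL = "postgresql+psycopg2://etl_user:***@source-db.example.internal/sales"  # hypothetical
TARGET_URL = "snowflake://etl_user:***@example_account/analytics/public"            # needs snowflake-sqlalchemy

def table_profile(url: str, table: str) -> tuple[int, float]:
    """Return (row_count, sum_of_amount) for a quick reconciliation."""
    engine = create_engine(url)
    with engine.connect() as conn:
        row = conn.execute(
            text(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        ).one()
    return int(row[0]), float(row[1])

if __name__ == "__main__":
    src = table_profile(SOURCE_URL, "orders")
    tgt = table_profile(TARGET_URL, "orders")
    assert src == tgt, f"reconciliation failed: source={src} target={tgt}"
    print("source and target match:", src)
```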
Posted 3 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Engineer
Location: Chennai (Hybrid)

Summary
Design, develop, and maintain scalable data pipelines and systems to support the collection, integration, and analysis of healthcare and enterprise data. The primary responsibilities of this role include designing and implementing efficient data pipelines, architecting robust data models, and adhering to data management best practices. In this position, you will play a crucial part in transforming raw data into meaningful insights, through development of semantic data layers, enabling data-driven decision-making across the organization. The ideal candidate will possess strong technical skills, a keen understanding of data architecture, and a passion for optimizing data processes.

Accountability
Design and implement scalable and efficient data pipelines to acquire, transform, and integrate data from various sources, such as electronic health records (EHR), medical devices, claims data, and back-office enterprise data
Develop data ingestion processes, including data extraction, cleansing, and validation, ensuring data quality and integrity throughout the pipeline
Collaborate with cross-functional teams, including subject matter experts, analysts, and engineers, to define data requirements and ensure data pipelines meet the needs of data-driven initiatives
Design and implement data integration strategies to merge disparate datasets, enabling comprehensive and holistic analysis
Implement data governance practices and ensure compliance with healthcare data standards, regulations (e.g., HIPAA), and security protocols
Monitor and troubleshoot pipeline and data model performance, identifying and addressing bottlenecks, and ensuring optimal system performance and data availability
Design and implement data models that align with domain requirements, ensuring efficient data storage, retrieval, and delivery
Apply data modeling best practices and standards to ensure consistency, scalability, and reusability of data models
Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of healthcare data
Develop and enforce data governance policies and procedures, including data lineage, architecture, and metadata management
Collaborate with stakeholders to define data quality metrics and establish data quality improvement initiatives
Document data engineering processes, methodologies, and data flows for knowledge sharing and future reference
Stay up to date with emerging technologies, industry trends, and healthcare data standards to drive innovation and ensure compliance

Skills
4+ years of strong programming skills in object-oriented languages such as Python
Proficiency in SQL
Hands-on experience with data integration tools, ETL/ELT frameworks, and data warehousing concepts
Hands-on experience with data modeling and schema design, including concepts such as star schema, snowflake schema and data normalization
Familiarity with healthcare data standards (e.g., HL7, FHIR), electronic health records (EHR), medical coding systems (e.g., ICD-10, SNOMED CT), and relevant healthcare regulations (e.g., HIPAA)
Hands-on experience with big data processing frameworks such as Apache Hadoop, Apache Spark, etc.
Working knowledge of cloud computing platforms (e.g., AWS, Azure, GCP) and related services (e.g., DMS, S3, Redshift, BigQuery)
Experience integrating heterogeneous data sources, aligning data models and mapping between different data schemas
Understanding of metadata management principles and tools for capturing, storing, and managing metadata associated with data models and semantic data layers
Ability to track the flow of data and its transformations across data models, ensuring transparency and traceability
Understanding of data governance principles, data quality management, and data security best practices
Strong problem-solving and analytical skills with the ability to work with complex datasets and data integration challenges
Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams

Education
Bachelor's or Master's degree in computer science, information systems, or a related field
Proven experience as a Data Engineer or similar role with a focus on healthcare data
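As an illustration of the ingestion-time cleansing and validation described above, here is a small, hedged pandas sketch of a data-quality gate for encounter records. The column names and rules are invented for the example and are not from the posting.

```python
# Illustrative sketch: basic quality checks before loading encounter records.
import pandas as pd

REQUIRED_COLUMNS = ["patient_id", "encounter_id", "admit_date", "icd10_code"]

def validate_encounters(df: pd.DataFrame) -> pd.DataFrame:
    """Return a cleaned frame; raise if required columns are missing."""
    missing = set(REQUIRED_COLUMNS) - set(df.columns)
    if missing:
        raise ValueError(f"missing required columns: {sorted(missing)}")

    df = df.drop_duplicates(subset=["encounter_id"])                 # one row per encounter
    df["admit_date"] = pd.to_datetime(df["admit_date"], errors="coerce")
    print(f"rows with unparseable admit_date: {df['admit_date'].isna().sum()}")
    return df.dropna(subset=["admit_date", "patient_id"])            # drop incomplete rows

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "patient_id": ["P1", "P2", None],
            "encounter_id": ["E1", "E2", "E2"],
            "admit_date": ["2024-01-03", "not-a-date", "2024-02-10"],
            "icd10_code": ["E11.9", "I10", "I10"],
        }
    )
    print(validate_encounters(sample))
```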
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
As Lead Splunk, your role and responsibilities would include:
Hands-on experience in the SIEM domain
Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture
Expert knowledge of Log Management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
Expert in logs/license optimization techniques and strategy
Good understanding of designing, deployment and implementation of a scalable SIEM architecture
Understanding of data parsimony as a concept, especially in terms of German data security standards
Working knowledge of integration of Splunk logging infrastructure with 3rd party observability tools (e.g. ELK, DataDog etc.)
Experience in identifying the security and non-security logs and applying adequate filters/re-routing the logs accordingly
Expert in understanding the network architecture and identifying the components of impact
Expert in Linux administration
Proficient in working with Syslog
Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks
Expertise with OEM SIEM tools, preferably Splunk
Experience with open source SIEM/log storage solutions like ELK or Datadog
Very good with documentation of HLD, LLD, implementation guides and operation manuals
Posted 3 weeks ago
7.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Lead Splunk Engineer
Location: Gurgaon (Hybrid)
Experience: 7-10 Years
Employment Type: Full-time
Notice Period: Immediate Joiners Preferred

Job Summary:
We are seeking an experienced Lead Splunk Engineer to design, deploy, and optimize SIEM solutions with expertise in Splunk architecture, log management, and security event monitoring. The ideal candidate will have hands-on experience in Linux administration, scripting, and integrating Splunk with tools like ELK & DataDog.

Key Responsibilities:
✔ Design & deploy scalable Splunk SIEM solutions (UF, HF, SH, Indexer Clusters).
✔ Optimize log collection, parsing, normalization, and retention.
✔ Ensure license & log optimization for cost efficiency.
✔ Integrate Splunk with 3rd-party tools (ELK, DataDog, etc.).
✔ Develop automation scripts (Python/Bash/PowerShell).
✔ Create technical documentation (HLD, LLD, Runbooks).

Skills Required:
🔹 Expert in Splunk (Architecture, Deployment, Troubleshooting)
🔹 Strong SIEM & Log Management Knowledge
🔹 Linux/Unix Administration
🔹 Scripting (Python, Bash, PowerShell)
🔹 Experience with ELK/DataDog
🔹 Understanding of German Data Security Standards (GDPR/Data Parsimony)

Why Join Us?
Opportunity to work with cutting-edge security tools.
Hybrid work model (Gurgaon-based).
Collaborative & growth-oriented environment.
Posted 3 weeks ago
0.0 years
0 Lacs
Bhopal, Madhya Pradesh
On-site
Position: Python Intern specializing in AI
Location: Bhopal, Madhya Pradesh (Work from Office)
Duration: 3 to 6 Months

✅ Must-Have Skills

Core Python Programming:
Functions, loops, list comprehensions, classes
Error handling (try-except), logging
File I/O operations, working with JSON/CSV

Python Libraries for AI/ML:
numpy, pandas – Data manipulation & analysis
matplotlib, seaborn – Data visualization
scikit-learn – Classical machine learning models
Basic familiarity with tensorflow or pytorch
Working knowledge of OpenAI / Transformers (bonus)

AI/ML Fundamentals:
Supervised and unsupervised learning (e.g., regression, classification, clustering)
Concepts of overfitting, underfitting, and bias-variance tradeoff
Train-test split, cross-validation
Evaluation metrics: accuracy, precision, recall, F1-score, confusion matrix

Data Preprocessing:
Handling missing data, outliers
Data normalization, encoding techniques
Feature selection & dimensionality reduction (e.g., PCA)

Jupyter Notebook Proficiency:
Writing clean, well-documented notebooks
Using markdown for explanations and visualizing outputs

Version Control:
Git basics (clone, commit, push, pull)
Using GitHub/GitLab for code collaboration

✅ Good-to-Have Skills

Deep Learning:
Basic understanding of CNNs, RNNs, transformers
Familiarity with keras or torch.nn for model building

Generative AI:
Prompt engineering, working with LLM APIs like OpenAI or Hugging Face
Experience with vector databases (Qdrant, FAISS)

NLP:
Tokenization, stemming, lemmatization
TF-IDF, Word2Vec, BERT basics
Projects in sentiment analysis or text classification

Tools & Platforms:
VS Code, JupyterLab
Google Colab / Kaggle
Docker (basic understanding)

Math for AI:
Linear algebra, probability & statistics
Basic understanding of gradients and calculus

✅ Soft Skills & Project Experience
Participation in mini-projects (e.g., spam detector, digit recognizer)
Kaggle competition experience
Ability to clearly explain model outputs and results
Documenting findings and creating simple dashboards or reports

Job Types: Full-time, Internship
Pay: From ₹5,000.00 per month
Schedule: Day shift
Work Location: In person
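For candidates mapping these fundamentals to code, here is a small, hedged sketch covering several of the listed topics (missing-value imputation, normalization, PCA, and a train/test split) with scikit-learn on synthetic data. It is illustrative only and not part of the internship curriculum.

```python
# Illustrative sketch: preprocessing pipeline on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] > 0).astype(int)           # toy label derived from the first feature
X[rng.random(X.shape) < 0.05] = np.nan  # inject ~5% missing values

preprocess = Pipeline(
    steps=[
        ("impute", SimpleImputer(strategy="median")),  # handle missing data
        ("scale", StandardScaler()),                   # normalization
        ("pca", PCA(n_components=5)),                  # dimensionality reduction
    ]
)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
X_train_t = preprocess.fit_transform(X_train)
X_test_t = preprocess.transform(X_test)
print(X_train_t.shape, X_test_t.shape)  # (150, 5) (50, 5)
```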
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Experience: 7 to 10 years
Notice Period: Immediate joiners
Work Timings: Normal working hours
Location: Gurgaon; work from office (hybrid mode) at the client location

As Lead Splunk, your role and responsibilities would include:
Hands-on experience in the SIEM domain
Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture
Expert knowledge of Log Management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
Expert in logs/license optimization techniques and strategy
Good understanding of designing, deployment and implementation of a scalable SIEM architecture
Understanding of data parsimony as a concept, especially in terms of German data security standards
Working knowledge of integration of Splunk logging infrastructure with 3rd party observability tools (e.g. ELK, DataDog etc.)
Experience in identifying the security and non-security logs and applying adequate filters/re-routing the logs accordingly
Expert in understanding the network architecture and identifying the components of impact
Expert in Linux administration
Proficient in working with Syslog
Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks
Expertise with OEM SIEM tools, preferably Splunk
Experience with open source SIEM/log storage solutions like ELK or Datadog
Very good with documentation of HLD, LLD, implementation guides and operation manuals

Skills: integration with 3rd party tools, Python, log management, logs optimization, documentation, security, SIEM architecture design, parsing, OEM SIEM tools, Linux administration, normalization, log collection, syslog, PowerShell, Bash, security logs identification, SIEM, retention practices, data parsimony, Splunk
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
Senior Engineer, Data Modeling
Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained industrious advantage.

Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing
What will your essential responsibilities include?
Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
Understand current and future data consumption patterns and architecture at a granular level, and partner with Architects to ensure optimal design of the data layers.
Apply best practices in data architecture, for example the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, the choice of storage and querying technology, and performance tuning.
Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and the implications for data consumers.
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
Design prototypes and work in a fast-paced, iterative solution delivery model.
Design, develop and maintain ETL pipelines using PySpark in Azure Databricks using Delta tables; use Harness for the deployment pipeline.
Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
Diagnose system performance issues related to data processing and implement solutions to address them.
Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
Maintain integrity and quality across all pipelines and environments.
Understand and follow secure coding practices to make sure code is not vulnerable.
You will report to the Application Manager.

What You Will BRING
We're looking for someone who has these abilities and skills:

Required Skills And Abilities
Effective communication skills.
Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
Relevant years of programming experience using Databricks.
Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse and ADLS).
Solid knowledge of network and firewall concepts.
Solid experience writing, optimizing and analyzing SQL.
Relevant years of experience with Python.
Ability to break down complex data requirements and architect solutions into achievable targets.
Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
Experience using Harness.
Technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities
Worked on big data migration projects.
Worked on performance tuning at both the database and big data platform levels.
Ability to interpret complex data requirements and architect solutions.
Distinctive problem-solving and analytical skills combined with robust business acumen.
Excellent grasp of the basics of parquet files and Delta files.
Effective knowledge of the Azure cloud computing platform.
Familiarity with reporting software; Power BI is a plus.
Familiarity with DBT is a plus.
Passion for data and experience working within a data-driven organization.
You care about what you do, and what we do.

Who WE Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We OFFER
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe
Robust support for flexible working arrangements
Enhanced family-friendly leave benefits
Named to the Diversity Best Practices Index
Signatory to the UK Women in Finance Charter
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.
For more information, please see axaxl.com/sustainability.
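Returning to the technical core of the role, the responsibilities above call for PySpark ETL pipelines writing Delta tables in Azure Databricks. The sketch below is a hedged illustration only; the storage paths, column names, and business key are placeholders, and Delta Lake support on the cluster is assumed.

```python
# Illustrative sketch: read raw parquet, lightly curate, and write a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_ingest").getOrCreate()

RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/policies/"        # hypothetical
DELTA_PATH = "abfss://curated@examplestorage.dfs.core.windows.net/policies/"  # hypothetical

raw = spark.read.parquet(RAW_PATH)

curated = (
    raw.withColumn("ingest_date", F.current_date())
       .dropDuplicates(["policy_id"])                  # assumed business key
       .filter(F.col("policy_status").isNotNull())     # drop rows missing a status
)

(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("ingest_date")
    .save(DELTA_PATH)
)
```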
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. We believe in the power of diversity and inclusion and cultivate a workplace culture of belonging that views uniqueness as a competitive edge and builds a community that enables our people to push the limits of innovation to make great products that create value and improve people's lives.

A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.

To support our extraordinary teams who build great products and contribute to our growth, we're looking to add a Specialist – Planning in Chennai, India: a professional who can quickly and accurately process purchase orders in a fast-paced environment, has excellent stakeholder service skills, and works well in a team to consistently meet challenging performance targets.

What a typical day looks like:
Responsible for providing expertise and support to the Customer Focus Team (CFT), ensuring the ability of materials planning for a specific project or projects as required.
Providing materials support to the weekly production planned orders, enabling kit on-time drop to meet the customer schedule.
Key assignments include providing timely materials status through use of available shortage reports, submission of excess and obsolete inventory to the customer, work order management, and inventory management to achieve the operating goals.
Senior materials planners for new emerging NPI accounts provide faster service to the NPI customer and communicate effectively with the customer while protecting the business interests of Flex.
Working on customer forecast activities such as normalization, forecast comparison, etc.
Working on customer forecast & shipment using the waterfall method.
Responsible for analyzing availability of materials & capacity based on customer demand & coming up with an aggressive but achievable loading schedule.
Responsible for running weekly system reports to determine material shortages & working on their closure with the buying team.
Responsible for handling work order management based on the build plan.
Responsible for identifying & taking various inventory management measures.

The experience we're looking to add to our team:
Education: Bachelor's degree or Engineering graduates
Experience: 3-5 years in Planning/Supply Chain
Mandatory knowledge of computer software applications: MS Excel, Word & PowerPoint (PF)
Proficiency: ERP/P2P systems such as BAAN/SAP/Oracle/Kinaxis/Pulse
Knowledge of engineering BOMs, product structure, EOL, ECO management
Knowledge of the complete planning cycle, including MPS, MRP, demand planning, materials planning, and production planning
Communication: Communication, both verbal and written, is an important part of this role. The job holder is required to exchange information, ideas and views on business-related matters concerning the Planning function, throughout the Company at all levels.
Innovation: The jobholder is required to show a willingness to question traditional methodology and make recommendations on new ways of approaching problems and improving existing processes.

Here are a few examples of what you will get for the great work you provide:
Health Insurance
PTO

#RA01
Site
Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. We celebrate diversity and do not discriminate based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status, or any other status protected by law. We're happy to provide reasonable accommodations to those with a disability for assistance in the application process. Please email accessibility@flex.com and we'll discuss your specific situation and next steps (NOTE: this email does not accept or consider resumes or applications. This is only for disability assistance. To be considered for a position at Flex, you must complete the application process first).
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
TCS HIRING!!
ROLE: AWS Data Architect
LOCATION: Hyderabad
YEARS OF EXPERIENCE: 8+ years

Data Architect:

Must have:
Relational SQL/caching expertise – deep knowledge of Amazon Aurora PostgreSQL, ElastiCache, etc.
Data modeling – experience in OLTP and OLAP schemas, normalization, denormalization, indexing, and partitioning.
Schema design & migration – defining best practices for schema evolution when migrating from SQL Server to PostgreSQL.
Data governance – designing data lifecycle policies, archival strategies, and regulatory compliance frameworks.
AWS Glue & AWS DMS – leading data migration strategies to Aurora PostgreSQL.
ETL & data pipelines – expertise in Extract, Transform, Load (ETL) workflows, Glue job features, and event-driven architectures.
Data transformation & mapping – PostgreSQL PL/pgSQL migration/transformation expertise while ensuring data integrity.
Cross-platform data integration – connecting cloud and on-premises/other cloud data sources.
AWS data services – strong experience in S3, Glue, Lambda, Redshift, Athena, and Kinesis.
Infrastructure as Code (IaC) – using Terraform, CloudFormation, or AWS CDK for database provisioning.
Security & compliance – implementing IAM, encryption (AWS KMS), access control policies, and compliance frameworks (e.g., GDPR, PII).
Query tuning & indexing strategies – optimizing queries for high performance.
Capacity planning & scaling – ensuring high availability, failover mechanisms, and auto-scaling strategies.
Data partitioning & storage optimization – designing cost-efficient hot/cold data storage policies.
Should have experience with setting up AWS architecture as per the project requirements.

Good to have:
Data warehousing – expertise in Amazon Redshift, Snowflake, or BigQuery.
Big data processing – familiarity with Apache Spark, EMR, Hadoop, or Kinesis.
Data lakes & analytics – experience in AWS Lake Formation, Glue Catalog, and Athena.
Machine learning pipelines – understanding of SageMaker, Bedrock, etc. for AI-driven analytics.
CI/CD for data pipelines – knowledge of AWS CodePipeline, Jenkins, or GitHub Actions.
Serverless data architectures – experience with event-driven systems (SNS, SQS, Step Functions).
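Since the must-have list leans on AWS Glue for migration work, here is a brief hedged sketch of driving a Glue job run from Python with boto3 and polling its state. The job name and region are hypothetical; credentials are assumed to come from the environment.

```python
# Illustrative sketch: start a Glue ETL job and wait for a terminal state.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # assumed region

JOB_NAME = "aurora-postgres-load"  # hypothetical Glue job name

run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    print(f"{JOB_NAME} run {run_id}: {state}")
    if state in {"SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"}:
        break
    time.sleep(30)
```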
Posted 3 weeks ago
1.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The impact that you will be making The role would require you to develop sophisticated products that add value to the client and result in new projects and revenue streams. What this role entail Design, develop, and optimize SQL databases, tables, views, and stored procedures to meet business requirements and performance goals. Write efficient and high-performing SQL queries to retrieve, manipulate, and analyze data. Ensure data integrity, accuracy, and security through regular monitoring, backups, and data cleansing activities. Identify and resolve database performance bottlenecks, optimizing queries and database configurations. Investigate and resolve database-related issues, including errors, connectivity problems, and data inconsistencies. Collaborate with cross-functional teams, including Data Analysts, Software Developers, and Business Analysts, to support data-driven decision-making. Maintain comprehensive documentation of database schemas, processes, and procedures. Implement and maintain security measures to protect sensitive data and ensure compliance with data protection regulations. Assist in planning and executing database upgrades and migrations. What lands you in the role 1- 3 years of relevant work experience as a SQL Developer or in a similar role. Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Proficiency in SQL, including T-SQL for Microsoft SQL Server or PL/SQL for Oracle. Strong knowledge of database design principles, normalization, and indexing. Experience with database performance tuning and optimization techniques. Familiarity with data modeling tools and techniques. Understanding of data warehousing concepts is a plus. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Ability to work independently and manage multiple tasks simultaneously. Certifications in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) is a plus. What we offer An opportunity to be part of some of the best enterprise SaaS products to be built out of India. Opportunities to quench your thirst for problem-solving, experimenting, learning, and implementing innovative solutions. A flat, collegial work environment, with a work hard, play hard attitude. A platform for rapid growth if you are willing to try new things without fear of failure. Remuneration with best-in-class industry standards with generous health insurance cover About Impact Analytics Impact Analytics™ ( Series D Funded ) delivers AI-native SaaS solutions and consulting services that help companies maximize profitability and customer satisfaction through deeper data insights and predictive analytics. With a fully integrated, end-to-end platform for planning, forecasting, merchandising, pricing, and promotions, Impact Analytics empowers companies to make smarter decisions based on real-time insights rather than relying on last year’s inputs to forecast and plan this year’s business. Powered by over one million machine learning models, Impact Analytics has been leading AI innovation for a decade, setting new benchmarks in forecasting, planning, and operational excellence across the retail, grocery, manufacturing, and CPG sectors. In 2025, Impact Analytics is at the forefront of the Agentic AI revolution, delivering autonomous solutions that enable businesses to adapt in real time, optimize operations, and drive profitability without manual intervention . 
Here's a link to our website: www.impactanalytics.co. Some of our accolades include:
Ranked as one of America's Fastest-Growing Companies by Financial Times for five consecutive years: 2020-2024.
Ranked as one of America's Fastest-Growing Private Companies by Inc. 5000 for seven consecutive years: 2018-2024.
Voted #1 by more than 300 retailers worldwide in the RIS Software LeaderBoard 2024 report.
Ranked #72 in America's Most Innovative Companies list in 2023 by Fortune, alongside companies like Microsoft, Tesla, Apple, IBM, etc.
Forged a strategic partnership with Google to equip retailers with cutting-edge generative AI tools.
Recognized in multiple Gartner reports, including Market Guides and Hype Cycle, spanning assortments, merchandising, forecasting, algorithmic retailing, and Unified Price, Promotion, and Markdown Optimization Applications.
Posted 3 weeks ago
0.0 - 2.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Position: Web Developer
We are looking for a highly skilled Web Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop and maintain high-quality, efficient, and well-documented code.
Troubleshoot and resolve technical issues.
Implement social network integrations, payment gateway integrations, and Web 2.0 features in web-based projects.
Work with RDBMS design, normalization, data modelling, transactions, and distributed databases (see the schema sketch at the end of this listing).
Develop and maintain database PL/SQL, stored procedures, and triggers.

Requirements:
2+ years of experience in web-based project development using PHP.
Experience with open-source frameworks such as Laravel, WordPress, Drupal, Joomla, osCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana.
Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies.
Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases.
Well-versed in RDBMS MySQL (can work with other SQL flavors too).

Job Type: Full-time
Pay: ₹30,000.00 - ₹40,000.00 per month
Education: Bachelor's (Preferred)
Experience: Core PHP: 2 years (Required); Laravel: 2 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
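To illustrate the RDBMS design and normalization work mentioned above, here is a minimal MySQL sketch (the tables and columns are hypothetical, not a schema from any actual project): a denormalized orders table that repeats customer details on every row can be split into customer and order tables linked by a foreign key.

-- Denormalized starting point (customer name/email repeated on every order row):
-- orders_flat(order_id, customer_name, customer_email, order_total)

-- Normalized design: customer attributes stored once, referenced by key
CREATE TABLE customers (
    customer_id INT AUTO_INCREMENT PRIMARY KEY,
    name  VARCHAR(100) NOT NULL,
    email VARCHAR(255) NOT NULL UNIQUE
) ENGINE=InnoDB;

CREATE TABLE orders (
    order_id    INT AUTO_INCREMENT PRIMARY KEY,
    customer_id INT NOT NULL,
    order_total DECIMAL(10,2) NOT NULL,
    created_at  DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT fk_orders_customer
        FOREIGN KEY (customer_id) REFERENCES customers (customer_id)
) ENGINE=InnoDB;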
Posted 3 weeks ago
0.0 - 2.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Position: Sr. Developer
We are looking for a highly skilled Sr. Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop and maintain high-quality, efficient, and well-documented code.
Troubleshoot and resolve technical issues.
Implement social network integrations, payment gateway integrations, and Web 2.0 features in web-based projects.
Work with RDBMS design, normalization, data modelling, transactions, and distributed databases.
Develop and maintain database PL/SQL, stored procedures, and triggers (see the trigger sketch at the end of this listing).

Requirements:
2+ years of experience in web-based project development using PHP.
Experience with open-source frameworks such as Laravel, WordPress, Drupal, Joomla, osCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana.
Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies.
Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases.
Well-versed in RDBMS MySQL (can work with other SQL flavors too).
Experience with social network integrations, payment gateway integrations, and Web 2.0 features in web-based projects.

Job Type: Full-time
Pay: ₹25,000.00 - ₹40,000.00 per month
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Day shift / Morning shift
Education: Bachelor's (Required)
Experience: PHP: 1 year (Required); Laravel: 1 year (Required); Total: 2 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
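Since the role calls for stored procedures and triggers, here is a minimal MySQL sketch (the orders and order_audit tables and their columns are hypothetical) of a trigger that records an audit row whenever an order's total changes:

-- Hypothetical audit table
CREATE TABLE order_audit (
    audit_id   INT AUTO_INCREMENT PRIMARY KEY,
    order_id   INT NOT NULL,
    old_total  DECIMAL(10,2),
    new_total  DECIMAL(10,2),
    changed_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

DELIMITER //
-- Fires after an UPDATE on the hypothetical orders table
CREATE TRIGGER trg_orders_total_audit
AFTER UPDATE ON orders
FOR EACH ROW
BEGIN
    IF NEW.order_total <> OLD.order_total THEN
        INSERT INTO order_audit (order_id, old_total, new_total)
        VALUES (NEW.order_id, OLD.order_total, NEW.order_total);
    END IF;
END//
DELIMITER ;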
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking an experienced MySQL Database Administrator (DBA) to join our tech team in Hyderabad. The ideal candidate will be responsible for managing high-performance MySQL environments, with a strong focus on schema design, clustering, replication, and high availability. Expertise in MySQL schema design is critical to ensure data integrity, optimal performance, and future scalability. The role requires a deep understanding of indexing strategies, normalization principles, and advanced query optimization techniques to maintain fast and efficient database responses. Additional responsibilities include implementing partitioning strategies and collaborating closely with our development and DevOps teams to ensure our systems are scalable, secure, and highly available.

Key Responsibilities
• Design, deploy, and maintain MySQL databases with clustering and replication (a brief replication sketch appears at the end of this listing).
• Configure and manage MySQL clusters (e.g., InnoDB Cluster, Galera Cluster) for high availability.
• Monitor performance and proactively resolve bottlenecks or issues.
• Set up and manage automated backups, failover, and disaster recovery plans.
• Optimize SQL queries, indexes, and schema designs for performance.
• Collaborate with developers to support efficient data structures and queries.
• Ensure database security, data integrity, and compliance with best practices.
• Maintain clear documentation of configurations, procedures, and topologies.

Required Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 3+ years of hands-on experience as a MySQL DBA, including clustering (InnoDB or Galera).
• Deep understanding of MySQL replication (master-slave, master-master).
• Strong SQL skills and experience with stored procedures, triggers, and indexing.
• Experience with performance tuning and query optimization.
• Familiarity with Linux environments and scripting (Bash or Python).
• Experience with cloud-hosted MySQL (AWS RDS, Azure Database for MySQL, etc.).

Preferred Skills
• Experience with monitoring tools (e.g., Percona Toolkit, PMM, Nagios, Zabbix).
• Familiarity with Laravel or Node.js backend integrations.
• Exposure to DevOps tools like Ansible, Docker, or Kubernetes.
• Understanding of NoSQL or caching solutions (e.g., Redis, Memcached).

Why Join Us in Hyderabad?
• Work on exciting projects with modern, scalable infrastructure.
• Join a collaborative and innovative tech culture.
• Enjoy a flexible, growth-focused work environment.
• Competitive salary and growth opportunities in a fast-growing organization.
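To make the replication side of this role concrete, here is a minimal sketch of bringing up a GTID-based asynchronous replica. It assumes MySQL 8.0.23 or later, gtid_mode=ON and enforce_gtid_consistency=ON on both servers, and uses placeholder hostnames and credentials; a production setup would also cover TLS, backups, and monitoring.

-- On the source: create a dedicated replication account
CREATE USER 'repl'@'%' IDENTIFIED BY 'change_me';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- On the replica: point it at the source and start replicating
CHANGE REPLICATION SOURCE TO
    SOURCE_HOST = 'db-primary.internal',
    SOURCE_USER = 'repl',
    SOURCE_PASSWORD = 'change_me',
    SOURCE_AUTO_POSITION = 1;
START REPLICA;

-- Check replication health (I/O and SQL threads, lag)
SHOW REPLICA STATUS\G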
Posted 3 weeks ago
5.0 years
0 Lacs
Jaipur, Rajasthan, India
Remote
Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and for configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate) troubleshooting.
Must-have skills: Kubernetes
Good-to-have skills: Microsoft Azure Container Infrastructure, DevOps
Minimum 5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Infra Tech Support Practitioner, you will provide ongoing technical support and maintenance for production and development systems and software products. You will be responsible for configuring services on various platforms and implementing technology at the operating system level. Your role will involve troubleshooting at both basic and intermediate levels, ensuring smooth operations and resolving any issues that arise.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure smooth operations and maintenance of production and development systems.
- Configure services on various platforms.
- Implement technology at the operating system level.
- Troubleshoot issues at both basic and intermediate levels.

Professional & Technical Skills:
- Must-have skills: proficiency in Kubernetes.
- Good-to-have skills: experience with DevOps, Microsoft Azure Container Infrastructure.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Kubernetes.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
Posted 3 weeks ago
2.0 - 4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description
The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and willingness to take up responsibility beyond the assigned work area is a plus.

Senior Analyst Roles & Responsibilities
Responsible for managing multiple Middle Office and Regulatory support processes.
Independently managing middle/back-office operations for investment banks.
Relevant work experience in a similar domain (Swap Data Repository reconciliation) and team management skills are preferable.
Build domain expertise across processes supported, assist in speedy remediation of issues, implement process improvements, and build/enhance controls to prevent future escalations.
Participate in Middle Office/group-level initiatives.
Responsible for ensuring change management and process documentation are maintained in an updated fashion at all times.
Review and analyze trade data between risk and finance systems.
Investigate genuine breaks for root cause and facilitate resolution and decision support, wherever necessary (see the illustrative break query at the end of this listing).
Provide support on change and new business requests received from various RFDAR/non-RFDAR teams by assessing the business requirements, performing testing, and providing SME support.
Apply data normalization methods such as filtering, standardization, enrichment, and aggregation.
Create reports/metrics/analysis to cover the daily/weekly/monthly requests.
Mailbox management/queue management.
Build domain expertise.

Functional & Technical Skills
Bachelor's degree in B.Com/BBM or master's degree in M.Com/MBA/PGDM.
2 to 4 years of experience in confirmations, portfolio management, settlements, or equity.
Should have basic knowledge of finance, the trade life cycle, investment banking, and derivatives.
High levels of energy, enthusiasm, commitment, and productivity; proactive, an effective influencer, and result-oriented.
Should have good logical and quantitative abilities to derive information from data.
Time management and ability to resolve issues speedily.
Above average in planning, organizing, and time management.

About Us
At eClerx, we serve some of the largest global companies – 50 of the Fortune 500 clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About The Team
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights.
At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
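As a rough illustration of the reconciliation work described in the responsibilities above (a sketch only; the risk_trades and finance_trades tables and their columns are hypothetical and not drawn from eClerx or any client system; FULL OUTER JOIN assumes a dialect such as SQL Server, Oracle, or PostgreSQL), a break report between two systems might be built like this:

-- Trades present in one system but not the other, or with mismatched notionals
SELECT COALESCE(r.trade_id, f.trade_id) AS trade_id,
       r.notional AS risk_notional,
       f.notional AS finance_notional,
       CASE
           WHEN r.trade_id IS NULL THEN 'Missing in risk system'
           WHEN f.trade_id IS NULL THEN 'Missing in finance system'
           ELSE 'Notional mismatch'
       END AS break_type
FROM risk_trades r
FULL OUTER JOIN finance_trades f
       ON r.trade_id = f.trade_id
WHERE r.trade_id IS NULL
   OR f.trade_id IS NULL
   OR r.notional <> f.notional;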
Posted 3 weeks ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
As a Staff Software Engineer - Business Intelligence with expertise in the business intelligence field, utilizing Tableau Desktop, Tableau Server, Tableau Prep, and Tableau Cloud, you will be responsible for overseeing the development and utilization of data systems. You will report to the Sr. Manager – Data Engineering, joining our dynamic team in the foreign exchange payments processing industry. This role focuses on data analytics and business intelligence and is responsible for architecting, designing, developing, and maintaining interactive and insightful data visualizations using Tableau. This role requires a deep understanding of business processes, technology, data management, and regulatory compliance. The successful candidate will work closely with business and IT leaders to ensure that the enterprise data architecture supports business goals and that data governance policies and standards are adhered to across the organization. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. They also include building a new data platform, integrating data from various sources, and ensuring data availability for various application and reporting needs. Additionally, the candidate should have experience working with AI/ML technologies and collaborating with data scientists to meet their data requirements.

In your role as a Staff Software Engineer - Business Intelligence, you will:
Define and evolve the technical vision and architecture for analytics, BI, and reporting across different data models and visualizations that support use cases across different products or domains.
Drive the roadmap for BI modernization with a focus on scalability, performance, and self-service capabilities.
Collaborate with business stakeholders to understand their data visualization needs and business requirements and translate them into effective dashboards and reports.
Architect, design, develop, and maintain visually appealing, interactive dashboards, ensuring they provide meaningful and actionable insights.
Design, build, and launch collections of sophisticated data models and visualizations that support use cases across different products or domains.
Manage workbook publishing and data refresh schedules to ensure data is updated and relevant for stakeholders.
Optimize Tableau dashboards for performance, including improvements in load times and responsiveness, as well as effective handling of large datasets.
Develop, monitor, and maintain ETL processes for automated, scheduled data refreshes, collaborating with team members to resolve issues related to source system extracts, mapping, and data loading in a timely and efficient manner.
Cleanse and prepare data for analysis and reporting to ensure data accuracy, integrity, and relevance.
Conduct training sessions and support end-users in the use and interpretation of data visualizations.
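To make the dashboard data-preparation side of this role concrete, here is a minimal sketch (the fx_payments table and its columns are hypothetical and not Convera's schema; CREATE OR ALTER VIEW assumes SQL Server 2016 SP1 or later) of a view that a Tableau workbook could use as a pre-aggregated data source:

-- Hypothetical daily volume view feeding a Tableau dashboard
CREATE OR ALTER VIEW dbo.vw_DailyFxVolume
AS
SELECT CAST(payment_date AS date) AS payment_day,
       currency_pair,
       COUNT(*)        AS payment_count,
       SUM(amount_usd) AS total_volume_usd
FROM dbo.fx_payments
GROUP BY CAST(payment_date AS date), currency_pair;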
A successful candidate for this position should have:
Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, Business Analytics, or a related IT discipline; or equivalent experience.
12+ years of experience in the business intelligence field, utilizing Tableau Desktop, Tableau Server, Tableau Prep, and Tableau Cloud.
Hands-on experience in data engineering, BI, or analytics, including at least 5+ years in a lead or principal role.
Extensive experience as a Tableau developer.
Extensive experience with MS SQL Server and relational database management systems, including data modeling and normalization.
Proficiency in writing complex statements, scripts, stored procedures, triggers, and views to implement business intelligence and data warehousing solutions.
Experience working with ETL.
Ability to take ownership of projects and independently drive them from concept to completion, delivering high-quality results on time.
Strong understanding of data visualization principles.
Excellent analytical and problem-solving skills.
Ability to communicate complex data insights to non-technical stakeholders.
Experience in performance tuning and optimization of Tableau solutions.
Expertise in cloud platforms (e.g., Google Cloud Platform, AWS, or Azure).
Experience with visualization tools such as Power BI.
Professional certification in Tableau.
Candidates with banking domain experience are highly preferred.

About Convera
Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers, helping them capture more value with every transaction. Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits including:
Competitive salary
Opportunity to earn an annual bonus
Great career growth and development opportunities in a global organization
A flexible approach to work

There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform Business to Business payments. Apply now if you're ready to unleash your potential.
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
Experience in designing and implementing ELT architecture to build data warehouses, including source-to-staging and staging-to-target mapping design.
Experience in configuring Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Mappings, Scenarios, Load Plans, and Metadata.
Experience in creating database connections and physical and logical schemas using the Topology Manager.
Experience in creation of packages, construction of data warehouses and data marts, and synchronization using ODI.
Experience in architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, performance tuning, and identifying bottlenecks in the process flow.
Experience using dimensional data modeling, star schema modeling, and snowflake modeling.
Experience using normalization, fact and dimension tables, and physical and logical data modeling (see the short load sketch at the end of this listing).
Good knowledge of Oracle Cloud services and database options.
Strong Oracle SQL expertise using tools such as SQL Developer.
Understanding of ERP modules is good to have.

Mandatory Skill Sets: ODI, OAC
Preferred Skill Sets: ODI, OAC
Years of Experience Required: 8-10
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study Required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Required Skills: Oracle Data Integrator (ODI)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 11 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
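For a concrete picture of the staging-to-target and star-schema work listed above, here is a minimal Oracle SQL sketch (the staging table, dimension, and fact are hypothetical, not a PwC or client model; the identity column assumes Oracle 12c or later) of a simple fact load against a conformed dimension:

-- Hypothetical conformed dimension and fact in the target schema
CREATE TABLE dim_customer (
    customer_key  NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    customer_id   VARCHAR2(30) NOT NULL,
    customer_name VARCHAR2(200)
);

CREATE TABLE fact_sales (
    sale_date    DATE NOT NULL,
    customer_key NUMBER NOT NULL REFERENCES dim_customer (customer_key),
    amount       NUMBER(12,2) NOT NULL
);

-- Staging-to-target mapping: look up the surrogate key, then load the fact
INSERT INTO fact_sales (sale_date, customer_key, amount)
SELECT s.sale_date,
       d.customer_key,
       s.amount
FROM   stg_sales s
JOIN   dim_customer d
       ON d.customer_id = s.customer_id;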
Posted 3 weeks ago
4.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
We are seeking a skilled SQL Developer with a strong background in writing and optimizing stored procedures (SPs) to join our data team. The ideal candidate will be responsible for developing, maintaining, and fine-tuning SQL queries and procedures to support business applications and analytics functions.

Key Responsibilities
Design, develop, and maintain complex SQL queries, stored procedures, functions, and triggers.
Optimize existing SQL code for performance, scalability, and reliability (see the tuning sketch at the end of this listing).
Collaborate with application developers, data analysts, and business stakeholders to gather and understand data requirements.
Ensure data integrity, consistency, and security across various systems.
Perform ETL (Extract, Transform, Load) tasks to support data integration and migration efforts.
Troubleshoot and resolve database-related issues, including performance bottlenecks.
Document database structures, procedures, and development processes.

Required Qualifications
4+ years of hands-on experience in SQL development.
Strong expertise in writing and optimizing stored procedures in SQL Server, Oracle, or MySQL.
Proficient in database design, normalization, and indexing strategies.
Experience in performance tuning and query optimization techniques.
Familiarity with ETL tools and data integration best practices.

Skills: SQL, stored procedures
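As a small illustration of the tuning work this role involves (a sketch only; the invoices table, columns, and index are hypothetical, and YEAR() assumes MySQL or SQL Server), replacing a non-sargable filter with a range predicate plus a supporting index lets the optimizer seek rather than scan:

-- Slow: wrapping the column in a function prevents index use
-- SELECT customer_id, SUM(total) FROM invoices
-- WHERE YEAR(invoice_date) = 2024 GROUP BY customer_id;

-- Composite index covering the rewritten query
CREATE INDEX ix_invoices_date ON invoices (invoice_date, customer_id, total);

-- Faster: sargable date-range predicate that can use the index
SELECT customer_id, SUM(total) AS total_billed
FROM invoices
WHERE invoice_date >= '2024-01-01'
  AND invoice_date <  '2025-01-01'
GROUP BY customer_id;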
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Work Level: Individual | Core: Responsible | Leadership: Team Alignment | Industry Type: Information Technology | Function: Database Administrator | Key Skills: SQL, PL/SQL, SQL Writing, mSQL | Education: Graduate
Note: This is a requirement for one of the Workassist Hiring Partners.

Job Description:
We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. You will collect, clean, and analyze data from various sources and assist in creating dashboards, reports, and visualizations. This is a remote position and a great opportunity for someone eager to build a strong foundation in database management and data analysis.

Responsibilities
Write, optimize, and maintain SQL queries, stored procedures, and functions.
Assist in designing and managing relational databases.
Perform data extraction, transformation, and loading (ETL) tasks.
Ensure database integrity, security, and performance.
Work with developers to integrate databases into applications.
Support data analysis and reporting by writing complex queries (see the sample reporting query at the end of this listing).
Document database structures, processes, and best practices.

Requirements
Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
Strong understanding of SQL and relational database concepts.
Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
Ability to write efficient and optimized SQL queries.
Basic knowledge of indexing, stored procedures, and triggers.
Understanding of database normalization and design principles.
Good analytical and problem-solving skills.
Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have)
Experience with ETL processes and data warehousing.
Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
Familiarity with database performance tuning and indexing strategies.
Exposure to Python or other scripting languages for database automation.
Experience with business intelligence (BI) tools like Power BI or Tableau.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
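As a taste of the reporting queries this internship involves, here is a minimal sketch (the sales table and its columns are hypothetical; window functions assume MySQL 8.0+, PostgreSQL, or SQL Server) that finds each region's top product by revenue:

-- Rank products by revenue within each region, then keep the top one
WITH product_revenue AS (
    SELECT region,
           product_id,
           SUM(amount) AS revenue,
           RANK() OVER (PARTITION BY region ORDER BY SUM(amount) DESC) AS revenue_rank
    FROM sales
    GROUP BY region, product_id
)
SELECT region, product_id, revenue
FROM product_revenue
WHERE revenue_rank = 1
ORDER BY region;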
Posted 3 weeks ago
The job market for normalization roles in India is growing rapidly as more companies recognize the importance of data quality and consistency. Normalization jobs involve organizing and structuring data to eliminate redundancy and improve efficiency in database management. If you are considering a career in normalization, this article will provide you with valuable insights into the job market in India.
India's major IT hubs are known for their thriving technology sectors and have a high demand for normalization professionals.
The average salary range for normalization professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.
A typical career path in normalization may involve starting as a Data Analyst, progressing to a Database Administrator, and eventually becoming a Data Architect or Database Manager. With experience and additional certifications, professionals can move into roles such as Data Scientist or Business Intelligence Analyst.
In addition to normalization skills, professionals in this field are often expected to have knowledge of database management systems, SQL, data modeling, data warehousing, and data analysis.
As you prepare for interviews and explore job opportunities in the field of normalization, remember to showcase your expertise in database management and data structuring. With the right skills and knowledge, you can excel in this dynamic and growing field in India. Good luck with your job search!