Established in 1999, Jigya is an international Minority Business Enterprise (MBE) consulting company with offices in Georgia, North Carolina, California and India. Jigya’s founders know the industry inside and out – each has over 10 years of experience working with large corporations on both the consulting and buying end. Our vast experience comes from working across industries – retail, banking, manufacturing, pharmaceutical, healthcare, and more. With over 200 IT Consultants on staff and $25 million in annual revenue, Jigya has the capacity to meet any of your business needs.
Not specified
INR 9.0 - 14.0 Lacs P.A.
Work from Office
Full Time
Title: Cloud Native Security Consultant
Mode of working: Full Time
Work Location: Bangalore, India
Work Experience: 8-12 years

Description of Role:
The Security Consultant should have a strong understanding of emerging security practices and standards; be able to consult on, engineer, and apply security best practices while designing and proposing solutions to our enterprise customers; and be able to conduct system security, vulnerability analysis, and risk assessments, identify security gaps and integration issues, study the architecture/platform, and design the security architecture. A Cloud Native Security Consultant undertakes complex, high-risk work, often across several projects.

In this role, you will:
- Interact with senior stakeholders across departments
- Reach and influence a wide range of people across larger teams and communities
- Research and apply innovative security architecture solutions to new or existing problems, and justify and communicate design decisions
- Develop the vision, principles, and strategy for security architects for one project or technology
- Work out subtle security needs
- Understand the impact of decisions, balancing requirements and deciding between approaches
- Produce patterns and support quality assurance
- Be the point of escalation for architects in lower-grade roles
- Lead the technical design of systems and services

Qualifications/Experience:
- Bachelor's degree in any stream
- Minimum 3 years of working experience in Cyber Security Consulting or Advisory
- Successfully delivered at least two Cyber Security consulting and implementation projects as a consultant in the last 2 years

Preferred Certifications:
- GIAC Cloud Security Automation (GCSA)
- Certified Kubernetes Security Specialist (CKS)
- Certified DevSecOps Professional (CDP)
- Kubernetes and Cloud Native Associate (KCNA)
- OEM certification on CNAPP security products (e.g., Palo Alto Prisma, Check Point CloudGuard, Aqua Security)
- Cloud service provider security certifications (e.g., SC-100, AWS Certified Security - Specialty, GCP Professional Cloud Security Engineer)
- Pen tester certification (LPT/OSCP/GPEN)
(All certifications should be valid.)

Responsibilities (scope of the role):
- Collaborate with teams to build and deliver solutions implementing serverless, microservices-based, IaaS, PaaS, and containerized architectures in multi-cloud environments
- Develop rule-based, parameterized IaC templates for automated deployment using Terraform
- Build CI/CD pipelines using AWS (CodeBuild, CodeDeploy, CodePipeline), Google (Cloud Build), and Azure (DevOps, Pipelines)
- Integrate third-party tools with the CI/CD process (e.g., SonarQube, Checkmarx, Embold)
- Manage environment configuration using industry-standard DevOps tools (Ansible)
- Implement scripting to extend build/deployment/monitoring processes (PowerShell, Bash, Python)
- Develop IaC with Terraform
- Strong understanding of cloud networking and of container, microservices, Docker, and Kubernetes security
- Network security orchestration in microservices environments
- Secure microservice communication; secure authentication to common databases without sharing APIs/passwords/keys
- Technical documentation, product evaluation, POCs
- Implementation, migration, and architecture of security technologies and solutions
- DevOps, DevSecOps, and SRE (Site Reliability Engineering) mindset

Knowledge and Skills (candidates should have experience in the domains below):
- Hands-on experience with Cloud Native Application Protection Platform (CNAPP) tools (Prisma Cloud by Palo Alto, Check Point CloudGuard, Aqua Security)
- Hands-on experience with automation tools (e.g., Ansible, Chef, Puppet)
- Experienced in migrating applications from monolithic to microservices architectures
- Web Application Firewall implementation experience at the Kubernetes and API gateway layers
- Experience implementing vulnerability scanners and hardening container image repositories
- In-depth understanding of the AWS, Azure, and GCP managed Kubernetes services (EKS, AKS, GKE)
- Understanding and review of Infrastructure as Code (IaC) and Compliance as Code (CaC)
- Up to date with trends and participating in industry-recognized forums (e.g., Cloud Native Computing Foundation)
- Experienced with deliverables on Cloud Security Posture Management, Cloud Workload Protection, Cloud Infrastructure Entitlement Management, and serverless security
- Application security testing for web and mobile using SAST/DAST/IAST approaches (Fortify, Veracode, Burp Suite)
- Secure code review and open-source validation (GitLab, Coverity, SonarQube, Black Duck)
- Well versed in the OWASP Top 10 and SANS Top 25 vulnerabilities and their remediation
- Good understanding of the Penetration Testing Execution Standard (PTES) and testing
- Good understanding of software security frameworks (e.g., BSIMM, SAMM)
- Good written and verbal communication, analytical, documentation, and problem-solving skills
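To make the container-hardening side of this role concrete, here is a minimal policy-as-code style sketch: scanning a Kubernetes pod spec (already parsed into a plain dict, e.g. from a JSON manifest) for common misconfigurations. The rule set and field choices are illustrative assumptions, not tied to any specific CNAPP product.

```python
def find_violations(pod_spec: dict) -> list[str]:
    """Flag common container hardening gaps in a Kubernetes pod spec."""
    violations = []
    for container in pod_spec.get("spec", {}).get("containers", []):
        name = container.get("name", "<unnamed>")
        ctx = container.get("securityContext", {})
        image = container.get("image", "")
        if ctx.get("privileged"):
            violations.append(f"{name}: runs privileged")
        if ctx.get("runAsNonRoot") is not True:
            violations.append(f"{name}: not enforced to run as non-root")
        if ":latest" in image or ":" not in image:
            violations.append(f"{name}: unpinned image tag")
    return violations

# Hypothetical pod spec for demonstration.
pod = {
    "spec": {
        "containers": [
            {"name": "web", "image": "nginx:latest",
             "securityContext": {"privileged": True}},
            {"name": "sidecar", "image": "envoy:v1.29",
             "securityContext": {"runAsNonRoot": True}},
        ]
    }
}
print(find_violations(pod))
```

In practice these checks would be enforced by an admission controller or a CNAPP scanner rather than a hand-rolled script; the sketch only shows the shape of the logic.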
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Title: IAM Security Consultant
Mode of working: Full Time
Work Location: Bangalore, India
Work Experience: 8-12 years

Description of Role:
The Security Consultant should have a strong understanding of emerging security practices and standards, and be able to consult on, engineer, and apply security best practices while designing and proposing solutions to our enterprise customers. The IAM Okta Engineer is responsible for the design, installation, integration, and deployment of Okta products and modules (i.e., MFA, SSO, UD, LM, API AM, microservices, etc.). The IAM Okta Engineer collaborates with customers/partners, both internal and external, on a CIAM solution; works with the teams on implementation and support of the Identity and Access Management solution; supports implementation of the Identity Governance solution (IGA); creates cloud-based profiles for Ping accounts and supports microservice account syncing; produces architecture artifacts, including best practices and playbooks; contributes to repeatable processes for Access Management using Ping and Okta; and assists application teams through the SDLC process (including requirements gathering, configuration, and testing to integrate applications with Okta). The IAM Engineer should be able to implement the Oracle Cloud IAM stack. An Infrastructure Security Consultant undertakes complex, high-risk work, often across several projects.

In this role, you will:
- Interact with senior stakeholders across departments
- Understand the impact of decisions, balancing requirements and deciding between approaches
- Produce patterns and support quality assurance
- Be the point of escalation for architects

Experience:
- Minimum 5 years of working experience in Cyber Security Consulting or Advisory
- Successfully delivered at least two Cyber Security consulting and implementation projects as an IAM service manager in the last 2 years
Certification (Preferred):
- Valid OEM certification on IAM (e.g., CIAM, or related to the relevant required skills)

Knowledge and Skills (candidates should have experience in the domains below):
- Identity Access Management (IAM), Privileged Access Management (PIM/PAM), Database Activity Monitoring (DAM) (e.g., BeyondTrust, CyberArk, SailPoint, Iraje, Imperva)
- Certificate Lifecycle Management (CLM/CLMS), Public Key Infrastructure (PKI), Hardware Security Module (HSM) (e.g., Thales, Venafi)
- Well versed in email security solutions (Proofpoint, Barracuda)
- Experienced in Nexus Digital Access
- Expertise in Okta, Access Gateway, Single Sign-On, Adaptive MFA, Universal Directory, Advanced Server Access, API Access Management, secure authentication, access management systems, Identity as a Service (IDaaS), WS-Federation, OAuth, and OpenID Connect
- 5+ years of experience in software development, implementing, integrating, and supporting Okta's cloud technologies
- Experience with Okta planning, implementation, and operations
- Experience integrating Okta with on-premises directories and the cloud
- Configure Okta to provide enterprise Single Sign-On services and enable Multi-Factor Authentication (MFA) platform features for internal and external applications
- Understanding of cloud computing architecture, technical design, and implementations, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) delivery models
- Experience with authentication standards such as SAML, OAuth, and OpenID Connect
- Experience with SCIM and knowledge of various API authentication standards
- Proficient in access control mechanisms: Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC)
- Knowledge of web services (REST/SOAP)
- Development experience with ReactJS, NodeJS, Hapi, Addison, Joi validation, Java, Spring Boot, Entmon monitoring, Python, Docker, Docker Hub for storing images, ECP and Kubernetes, CI/CD pipelines, HashiCorp Vault, Visual Studio Code, Eclipse, S3, Lambda, CloudFront
- Experience with various LDAP products, including AD

Priority 1: Oracle Cloud IAM skills required
- Oracle WebLogic on Kubernetes, Oracle Unified Directory
- Oracle Access Management, Oracle Identity Governance
- Oracle HTTP Server (OHS), Oracle Unified Directory Services Manager
- Oracle Redis cluster, Oracle UI components
- Oracle Identity Cloud Service (IDCS)
- Automated deployment with Terraform
- Authentication schemes; SSL certificates for WLS domains; Oracle Kubernetes
- Oracle Kubernetes pod management
- Application release management and testing
- Patch management for applications running on Kubernetes pods
- Understanding of Oracle WAF and Oracle load balancers
- Oracle IAM application security; service provider and identity provider integration management for the Oracle Cloud IAM application
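Much of the OAuth/OpenID Connect work above revolves around ID tokens (JWTs). As a small, standard-library-only sketch, here is how the claims of a token can be decoded for inspection. Note this does not verify the signature; a real integration must validate the token against the IdP's JWKS keys with a proper JOSE library. The claim values below are made-up examples.

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Return the (unverified) payload claims of a JWT."""
    payload_b64 = token.split(".")[1]
    # JWT segments are base64url without padding; restore padding first.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a throwaway demo token (header.payload.signature).
claims = {"sub": "user@example.com", "iss": "https://idp.example.com", "aud": "my-app"}
segments = [
    base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("="),
    "",  # empty signature segment for the demo
]
token = ".".join(segments)
print(decode_jwt_payload(token)["sub"])
```

The same decode step is useful when debugging SSO integrations, e.g. checking that `aud` matches the application's client ID before blaming the access policy.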
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Title: Monitoring and Observability Consultant
Mode of working: Full Time
Work Location: Bangalore, India
Work Experience: 5-8 years

ITOps (Monitoring and Observability) Consultant - Lateral Hire (Minimum Relevant Experience 7 Years)

Overview:
We are seeking a skilled IT Operations Consultant specializing in Monitoring and Observability to design, implement, and optimize monitoring solutions for our customers. The ideal candidate will have a minimum of 7 years of relevant experience, a strong background in monitoring, observability, and IT service management, and will be responsible for ensuring system reliability, performance, and availability by creating robust observability architectures and leveraging modern monitoring tools.

Primary Responsibilities:
- Design end-to-end monitoring and observability solutions that provide comprehensive visibility into infrastructure, applications, and networks
- Implement monitoring tools and frameworks (e.g., Prometheus, Grafana, OpsRamp, Dynatrace, New Relic) to track key performance indicators and system health metrics
- Integrate monitoring and observability solutions with IT Service Management tools
- Develop and deploy dashboards, alerts, and reports to proactively identify and address system performance issues
- Architect scalable observability solutions to support hybrid and multi-cloud environments
- Collaborate with infrastructure, development, and DevOps teams to ensure seamless integration of monitoring systems into CI/CD pipelines
- Continuously optimize monitoring configurations and thresholds to minimize noise and improve incident detection accuracy
- Automate alerting, remediation, and reporting processes to enhance operational efficiency
- Utilize AIOps and machine learning capabilities for intelligent incident management and predictive analytics
- Work closely with business stakeholders to define monitoring requirements and success metrics
- Document monitoring architectures, configurations, and operational procedures

Required Skills:
- Strong understanding of infrastructure and platform development principles, and experience with languages such as Python and Ansible for developing custom scripts
- Strong knowledge of monitoring frameworks, logging systems (ELK stack, Fluentd), and tracing tools (Jaeger, Zipkin), along with open-source solutions such as Prometheus and Grafana
- Extensive experience with monitoring and observability solutions such as OpsRamp, Dynatrace, and New Relic; must have worked with ITSM integration (e.g., ServiceNow, BMC Remedy)
- Working experience with RESTful APIs and understanding of API integration with monitoring tools
- Familiarity with AIOps and machine learning techniques for anomaly detection and incident prediction
- Knowledge of ITIL processes and service management frameworks
- Familiarity with security monitoring and compliance requirements
- Excellent analytical and problem-solving skills; ability to debug and troubleshoot complex automation issues
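The anomaly-detection techniques this role calls for can be as simple as a rolling z-score over a metric series, which is the kind of baseline logic AIOps platforms apply at scale before layering on machine learning. The window size, threshold, and latency numbers below are illustrative assumptions.

```python
from statistics import mean, stdev

def zscore_anomalies(series, window=5, threshold=3.0):
    """Return indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical latency samples with one obvious spike.
latency_ms = [100, 102, 99, 101, 100, 98, 103, 450, 101, 99]
print(zscore_anomalies(latency_ms))  # the 450 ms sample at index 7
```

A dynamic baseline like this is one way to "minimize noise and improve incident detection accuracy" compared to static thresholds, since quiet metrics get tight bounds and noisy ones get loose bounds automatically.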
Not specified
INR 6.0 - 10.0 Lacs P.A.
Work from Office
Full Time
Title: PostgreSQL DBA - L2
Mode of working: Full Time
Work Location: Mumbai, India
Work Experience: 2-5 years

Job Description: PostgreSQL Database Management - PostgreSQL Database Administrator (L2)
- Customer Name: BSE Ltd
- Location: Mumbai, Fort
- Domain: PostgreSQL Database
- Level (L1/L2/L3): L2
- Required Relevant Domain Experience: 4-6 years
- Job Type (Onsite/Remote): Onsite
- Shift details (General/24x7): Onsite support, 24x7
- Qualification (full time, no education gap):
  1. B.E./Diploma in Computer Science, IT, or Electronics & Telecommunication (Civil and Mechanical not preferred)
  2. B.Sc. IT
  3. BCA or MCA (full-time course from a reputed university)
  4. No gap in education (passing years for S.S.C., H.S.C., and Degree/Diploma to be mentioned)
- Certifications Required: relevant certification preferred
- Specific Remarks/Requirement by customer: working experience with PostgreSQL database troubleshooting and database cluster setup is mandatory

Role Purpose:
We are seeking an experienced PostgreSQL Database Administrator to join our team. The ideal candidate will have strong technical skills and experience in managing and supporting PostgreSQL databases, including PCS cluster setup, database migration, new database server provisioning, replication, link servers, hardening, and patching in day-to-day operations.

Job Responsibilities:
- Experience with database technologies, designing data architecture, and monitoring and enforcing database rules
- Manage and maintain PostgreSQL databases
- Ensure database performance, data integrity, and security
- Design and implement database systems
- Monitor and optimize database performance
- Develop and implement backup and recovery processes
- Plan and execute disaster recovery strategies
- Collaborate with development and operations teams
- Automate database tasks and processes
- Perform database tuning and optimization
- Troubleshoot database issues and provide solutions
- Maintain database documentation and standards
- Implement database upgrades and patches
- Migrate databases to the latest version
- Monitor database health and performance metrics
- Ensure compliance with data protection regulations
- Provide support for database-related issues
- Develop and maintain database scripts and tools
- Conduct database capacity planning
- Stay updated with the latest database technologies and trends

Technical Skills/Knowledge Requirements:
1) Database troubleshooting, hardening, patching, and DB cluster setup (must)
2) Replication, link server (add-on)
3) Database migration
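The "automate database tasks" responsibility often starts with scripted backups. As a hedged sketch, here is a small Python helper that composes a `pg_dump` command line per database; the host and backup directory are made-up examples, and in production the credentials would come from a `.pgpass` file or a secrets manager, never the command line.

```python
from datetime import date

def pg_dump_cmd(dbname: str,
                host: str = "db01.internal",      # hypothetical host
                backup_dir: str = "/backups") -> list[str]:
    """Return the argv list for a compressed custom-format pg_dump."""
    outfile = f"{backup_dir}/{dbname}_{date.today():%Y%m%d}.dump"
    return [
        "pg_dump",
        "--host", host,
        "--format", "custom",   # custom format is compressed and pg_restore-friendly
        "--file", outfile,
        dbname,
    ]

print(pg_dump_cmd("sales"))
```

The list form is deliberate: passing it to `subprocess.run` avoids shell quoting issues, and building it in one tested function keeps cron jobs for different databases consistent.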
Not specified
INR 9.0 - 14.0 Lacs P.A.
Work from Office
Full Time
Title: Software Development Engineer in Test (Lead)
Mode of working: Full Time
Work Location: Hyderabad, India
Work Experience: 8-12 years

Description of Role:
As a Software Development Engineer in Test (SDET) Lead, you will play a pivotal role in ensuring the quality and performance of our products through test automation and non-functional testing. You will collaborate closely with global teams to design, develop, and implement automated testing solutions, focusing on both API and web automation using Playwright, as well as performance testing with JMeter. Your expertise in data-driven testing and non-functional testing will be critical in enhancing our CI/CD pipeline and overall software quality.

Key Responsibilities:
- Design, create, and execute automated tests using Playwright and TypeScript to verify the functionality of our software features and user interfaces, including developing smoke tests and regression test suites and contributing to the overall quality assurance process
- Develop and execute data-driven testing strategies to ensure comprehensive test coverage
- Perform non-functional testing, including performance testing with JMeter and Google Lighthouse, to assess system stability and scalability
- Participate in ongoing product specification and code reviews to align testing efforts with product goals
- Collaborate with cross-functional teams to translate customer requirements into effective automated tests
- Contribute to CI/CD pipelines by integrating automated tests and supporting continuous integration and deployment processes
- Support various testing efforts, including functional, security, stress, and failure-injection testing, with a focus on continuous improvement and innovation
- Actively participate in story grooming sessions, collaborating with the development team to clarify requirements, identify potential risks, and ensure that all test scenarios are accounted for
As the owner of the product's quality, you will oversee the entire testing process, from planning to execution, ensuring that all features meet the highest standards before release. You will also play a key role in making critical decisions during go/no-go calls, where you will assess the readiness of the product for release by evaluating test results, identifying any unresolved issues, and determining whether the product is fit for deployment.

Qualifications:
- Experience: minimum of 5 years of development and testing experience, with a strong emphasis on test automation and non-functional testing
- Education: Bachelor's degree in Engineering, Computer Science, or a related field

Technical Skills:
- Proficient in using Playwright for API and web automation
- Experience with data-driven testing methodologies
- Strong understanding of performance testing tools, particularly JMeter
- Familiarity with continuous integration/deployment (CI/CD) environments
- Knowledge of testing methodologies such as TDD, BDD, and ATDD in an agile setting
- Strong coding skills in scripting languages such as Python, JavaScript, C#, or Java
- Experience with DevOps tools such as Jenkins, Git, and Octopus Deploy
- Solid understanding of object-oriented programming (OOP) principles and coding standards

Soft Skills:
- Excellent problem-solving abilities, with a keen eye for detail
- Strong verbal and written communication skills
- Ability to work collaboratively in a global team environment

Preferred Qualifications:
- Experience with scanning tools such as SonarQube, NDepend, or Fortify
- Familiarity with exploratory and usability testing
- Understanding of RDBMS, SQL, and database testing
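The data-driven testing pattern this role emphasizes can be sketched in plain Python so the example stays self-contained; in practice the same shape is usually expressed with `pytest.mark.parametrize` or Playwright fixtures. The unit under test and its rules here are purely hypothetical.

```python
def is_valid_username(name: str) -> bool:
    """Hypothetical unit under test: 3-20 chars, alphanumeric plus underscore."""
    return 3 <= len(name) <= 20 and name.replace("_", "").isalnum()

# A single table of (input, expected) pairs drives every case, so adding
# coverage means adding a row, not writing a new test function.
CASES = [
    ("alice", True),
    ("ab", False),            # too short
    ("a" * 21, False),        # too long
    ("bob_smith", True),
    ("bad name!", False),     # disallowed characters
]

def run_cases(cases):
    """Return the (input, expected) pairs whose actual result disagrees."""
    return [(arg, exp) for arg, exp in cases if is_valid_username(arg) != exp]

print(run_cases(CASES))  # empty list means every case passed
```

Keeping the case table separate from the runner is what makes the approach "data-driven": the same table can later be loaded from CSV or JSON without touching the test logic.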
Not specified
INR 3.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Title: Cloud Ops Engineer
Mode of working: Full Time, work from office
Work Location: client site in Madhapur, Hyderabad

Key Responsibilities:
- Automation and CI/CD: design, implement, and maintain Continuous Integration and Continuous Deployment (CI/CD) pipelines to automate software builds, tests, and releases
- Infrastructure as Code: utilize tools like Terraform and Ansible to manage infrastructure through code, ensuring consistency and repeatability
- Monitoring and Performance Tuning: set up monitoring solutions to track system performance and reliability, and identify and resolve performance bottlenecks
- Collaboration: work closely with development teams to ensure smooth transitions from development to production, fostering a culture of collaboration and shared responsibility for deployments
- Cloud Management: manage cloud infrastructure (AWS, Azure, GCP), including resource provisioning, scaling, and cost management; Azure is mandatory
- Security Practices: implement and advocate for security best practices throughout the development lifecycle
- Documentation: maintain clear and comprehensive documentation for systems, processes, and procedures
- Troubleshooting and Support: provide support for system outages and performance issues, including root cause analysis and resolution

Technical Skills:
- Proficient in scripting languages: Python, PowerShell, Bash, etc.
- Experience with CI/CD tools: Azure DevOps
- Familiarity with containerization technologies: Docker, Kubernetes
- Strong knowledge of version control systems: Git, Azure DevOps (ADO) Git
- Understanding of networking: Azure networking, Fortinet SD-WAN, Zscaler, and WAF
- Understanding of databases: Azure SQL, Azure MI, MSSQL Server, Azure PostgreSQL, Cosmos DB, etc.
- Understanding of server management: Windows, Linux

Preferred Qualifications:
- Experience with Agile methodologies
- Knowledge of microservices architecture
- Familiarity with monitoring tools: Azure Monitor, Prometheus, Grafana
Not specified
INR 2.0 - 6.0 Lacs P.A.
Work from Office
Full Time
Title: Azure DevOps Engineer
Mode of working: Full Time, work from office
Work Location: client site in Madhapur, Hyderabad

Key Responsibilities:
- Automation and CI/CD: design, implement, and maintain Continuous Integration and Continuous Deployment (CI/CD) pipelines to automate software builds, tests, and releases
- Infrastructure as Code: utilize tools like Terraform and Ansible to manage infrastructure through code, ensuring consistency and repeatability
- Monitoring and Performance Tuning: set up monitoring solutions to track system performance and reliability, and identify and resolve performance bottlenecks
- Collaboration: work closely with development teams to ensure smooth transitions from development to production, fostering a culture of collaboration and shared responsibility for deployments
- Cloud Management: manage cloud infrastructure (AWS, Azure, GCP), including resource provisioning, scaling, and cost management; Azure is mandatory
- Security Practices: implement and advocate for security best practices throughout the development lifecycle
- Documentation: maintain clear and comprehensive documentation for systems, processes, and procedures
- Troubleshooting and Support: provide support for system outages and performance issues, including root cause analysis and resolution

Technical Skills:
- Proficient in scripting languages: Python, PowerShell, Bash, etc.
- Experience with CI/CD tools: Azure DevOps
- Familiarity with containerization technologies: Docker, Kubernetes
- Strong knowledge of version control systems: Git, Azure DevOps (ADO) Git
- Understanding of networking: Azure networking, Fortinet SD-WAN, Zscaler, and WAF
- Understanding of databases: Azure SQL, Azure MI, MSSQL Server, Azure PostgreSQL, Cosmos DB, etc.
- Understanding of server management: Windows, Linux

Preferred Qualifications:
- 6 to 8 years of experience
- Must-have skills: Azure, Azure DevOps, Azure Kubernetes Service, Docker, Helm, CI/CD concepts
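A recurring piece of the CI/CD work described above is the quality gate: a pipeline stage that fails the build unless test results clear agreed thresholds. Here is a hedged Python sketch of that decision logic; the result fields and the 80% coverage threshold are assumptions for the example, not a standard schema.

```python
def gate(results: dict, min_coverage: float = 80.0) -> tuple[bool, str]:
    """Return (passed, reason) for a pipeline quality-gate decision."""
    if results.get("failed", 0) > 0:
        return False, f"{results['failed']} test(s) failed"
    if results.get("coverage", 0.0) < min_coverage:
        return False, f"coverage {results['coverage']}% below {min_coverage}%"
    return True, "all checks passed"

print(gate({"failed": 0, "coverage": 92.5}))
print(gate({"failed": 2, "coverage": 95.0}))
```

In an Azure DevOps pipeline the same logic would run as a script step whose nonzero exit code stops the release stage, which is what makes the gate enforceable rather than advisory.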
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Title: Data Modeler
Mode of working: Full Time
Work Location: Chennai, India
Work Experience: 8-12 years

Key Skills Required for the Data Modeler Role:
- Data Modelling Expertise: ability to analyse and translate business needs into long-term data models
- Metadata Management: strong knowledge of metadata management and related tools
- Machine Learning Experience: 5-8+ years of experience with machine learning in production
- Statistical Analysis: knowledge of mathematical foundations and statistical methods
- Database Systems: evaluating and optimizing existing data systems
- Data Flow Design: creating conceptual data models and data flows
- Coding Best Practices: developing best practices for data coding to ensure consistency
- System Optimization: updating and troubleshooting data systems for efficiency
- Collaboration Skills: working with cross-functional teams (Product Owners, Data Scientists, Engineers, Analysts, Developers, Architects)
- Technical Documentation: preparing training materials, SOPs, and knowledge base articles
- Communication & Presentation: strong interpersonal, communication, and presentation skills
- Multi-Stakeholder Engagement: ability to work with multiple stakeholders in a multicultural environment
- Data Modelling Certification: desirable but not mandatory
Not specified
INR 14.0 - 18.0 Lacs P.A.
Work from Office
Full Time
Location: Hyderabad
We are looking for good-quality resources with a product company background and strong education (B.Tech/M.Tech/MCA only).
Experience Level: 8 to 12 years
Posting Title: Architect, Data Services

Job Summary:
The Maps team is developing tools to analyse, visualize, process, manage, and curate data on a large scale. We are looking for engineering leads who can be part of the team building massive, scalable, distributed systems that process geospatial data at scale to power our products.

Key Qualifications:
- 10+ years of experience working in a software product development organization building modern and scalable big data applications
- Expert-level knowledge of functional programming, preferably Scala
- Experience developing REST-based microservices, preferably with Play as an MVC framework
- In-depth knowledge of developing applications using Kafka messaging
- Knowledge of Solr and Elasticsearch queries is a plus
- Experience working with GraphQL and complex APIs to build data-driven applications
- Experience with NoSQL and SQL databases and familiarity with the Cassandra database
- Experience building low-latency, high-transaction services
- Fluency in Hadoop and big data processing technologies, e.g., Spark, YARN, HDFS, Oozie, Hive
- Knowledge of information retrieval and machine learning
- Ability to effectively communicate with and manage stakeholders who may be located remotely
- Gather and understand functional requirements, providing clear documentation and feedback
- Experience overseeing the progress of a development team: track project milestones, provide estimates, and ensure timely delivery of high-quality software and services
- Offer technical direction and support to team members; review code, provide feedback, and ensure adherence to best practices
- Contribute to the design process by providing insights on user experience and interface design
- Excellent troubleshooting skills
Description:
The Maps team is developing tools to analyse, visualize, process, manage, and curate data on a large scale. Our team combines disparate signals such as data analytics, community engagement, and user feedback to improve the Maps Platform. On any given day you may be asked to analyse large data sets to identify errors in the map, design and implement a complex algorithm for resolving the issue, review the solution with a team of engineers and analysts, and integrate the resulting solution into the data processing pipeline. We are looking for engineers who can be part of the team building a massive, scalable, distributed system for enabling the maps data platform. Successful candidates will have exceptional engineering and communication skills and believe that data-driven feedback leads to great products.

Education: Bachelor's/Master's degree in CS or a related field
Additional Requirements: familiarity with geospatial concepts is a plus
Not specified
INR 10.0 - 14.0 Lacs P.A.
Work from Office
Full Time
ROLE OVERVIEW:
We are seeking an Analytics Developer with deep expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. The ideal candidate will focus on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports that drive strategic decision-making. This role involves close collaboration with both technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives.

KEY RESPONSIBILITIES:
- Leverage Databricks to develop and optimize scalable data pipelines for real-time and batch data processing
- Design and implement Databricks notebooks for exploratory data analysis, ETL workflows, and machine learning models
- Manage and optimize Databricks clusters for performance, cost efficiency, and scalability
- Use Databricks SQL for advanced query development, data aggregation, and transformation
- Incorporate Python and/or Scala within Databricks workflows to automate and enhance data engineering processes
- Develop solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration
- Create interactive and visually compelling Power BI dashboards and reports to enable self-service analytics
- Leverage DAX (Data Analysis Expressions) for building calculated columns, measures, and complex aggregations
- Design effective data models in Power BI using star-schema and snowflake-schema principles for optimal performance
- Configure and manage Power BI workspaces, gateways, and permissions for secure data access
- Implement row-level security (RLS) and data-masking strategies in Power BI to ensure compliance with governance policies
- Build real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources
- Provide end-user training and support for Power BI adoption across the organization
- Develop and maintain ETL/ELT workflows, ensuring high data quality and reliability
- Implement data governance frameworks to maintain data lineage, security, and compliance with organizational policies
- Optimize data flow across multiple environments, including data lakes, warehouses, and real-time processing systems
- Collaborate with data governance teams to enforce standards for metadata management and audit trails
- Work closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems
- Troubleshoot and resolve technical challenges related to data integration, analytics performance, and reporting accuracy
- Stay updated on the latest advancements in Databricks, Power BI, and data analytics technologies
- Drive innovation by integrating AI/ML capabilities into analytics solutions using Databricks
- Contribute to the enhancement of organizational analytics maturity through scalable and reusable architectures

REQUIRED SKILLS:
- Self-Management: you need to possess the drive and ability to deliver on projects without constant supervision
- Technical: this role has a heavy emphasis on thinking and working outside the box; you need a thirst for learning new technologies and should be receptive to adopting new approaches and ways of thinking
- Logic: you need the ability to work through and make logical sense of complicated and often abstract solutions and processes
- Language: the client has a global footprint, with offices and clients around the globe; the ability to read, write, and speak fluently in English is a must, and other languages could prove useful
- Communication: your daily job will regularly require communication with client team members; the ability to communicate clearly on a technical level, both verbally and in writing, is essential

ESSENTIAL SKILLS AND QUALIFICATIONS:
- Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred)
Certifications (Preferred):
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Professional
Microsoft Certified: Power BI Data Analyst Associate
8+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including:
Proficiency in Databricks Notebooks for development and testing.
Advanced skills in Databricks SQL, Python, and/or Scala for data engineering.
Expertise in cluster management, auto-scaling, and cost optimization.
4+ years of expertise with Power BI, including:
Advanced DAX for building measures and calculated fields.
Proficiency in Power Query for data transformation.
Deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to collaborate effectively with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.

ENVIRONMENT:
100% performed in a climate-controlled internal office environment under normal office conditions. While performing the duties of this job, the employee is regularly required to sit; stand; walk; use hands and fingers to feel and handle; reach with arms and hands; and talk and hear. The employee is frequently required to stoop, kneel, and crouch, and to lift weight or exert a force up to a maximum of 13 kg.
ADDITIONAL:
Follow the Company HR Policy, the Code of Business Conduct, and department policies and procedures, including protecting confidential company information, attending work punctually and regularly, and following good safety practices in all activities. The responsibilities associated with this job will change from time to time in accordance with the Company's business needs. More specifically, the incumbent may be required to perform additional and/or different responsibilities from those set forth above. The above declarations are not intended to be an all-inclusive list of the duties and responsibilities of the job described, nor are they intended to be such a listing of the skills and abilities required to do the job. Rather, they are intended to describe the general nature of the job.
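The Databricks and ETL responsibilities in this posting revolve around batch pipelines with data-quality gates and aggregation steps. Purely as an illustrative sketch in plain Python (the field names are invented; a real Databricks pipeline would express the same steps in PySpark or Databricks SQL), a validated batch transform might look like:

```python
# Minimal batch-ETL sketch with hypothetical "sales" records: a data-quality
# gate followed by a per-region aggregation (what a GROUP BY or DAX measure
# would compute on the Databricks/Power BI side).

def clean_and_aggregate(rows):
    """Validate raw sales rows, then aggregate revenue per region."""
    valid, rejected = [], []
    for row in rows:
        # Quality gate: require a region and a non-negative numeric amount.
        try:
            amount = float(row["amount"])
            if amount < 0 or not row.get("region"):
                raise ValueError
        except (KeyError, TypeError, ValueError):
            rejected.append(row)
            continue
        valid.append({"region": row["region"], "amount": amount})

    # Aggregation step: total amount per region.
    totals = {}
    for row in valid:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals, rejected
```

Rejected rows are returned rather than silently dropped, mirroring the data-quality and audit-trail expectations listed above.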
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
ROLE OVERVIEW:
We are looking for a skilled Boomi Senior Integration Developer to design, develop, and maintain integrations on the Dell Boomi AtomSphere platform. The ideal candidate will build integration processes, configure connectors, and troubleshoot data flows to ensure smooth system interoperability. This role requires a strong understanding of integration concepts, Boomi platform components (Integrations, API Management, EDI, Event Streams), and working knowledge of cloud architecture.

KEY RESPONSIBILITIES:
Design, implement, and maintain integration processes using the Boomi AtomSphere platform.
Develop and configure integrations using Boomi's connectors (e.g., REST, SOAP, databases, ERP, CRM systems).
Participate in requirements gathering sessions and translate business requirements into technical integration designs.
Collaborate with stakeholders to understand data flows between systems and ensure data integrity across platforms.
Build integration components, including maps, processes, profiles, and connectors.
Monitor and troubleshoot integration issues, performance bottlenecks, and data synchronization problems.
Conduct unit testing, ensure integration quality, and perform ongoing optimizations for performance and scalability.
Provide detailed documentation for integration processes, components, and troubleshooting steps.
Stay informed on Boomi product updates, new features, and emerging integration technologies to continuously enhance integration capabilities.
Contribute to the Integrations Center of Excellence (CoE), fostering knowledge sharing and best practices.
Provide technical leadership and mentorship to Boomi developers and cross-functional teams, ensuring adherence to integration best practices, performance standards, and security policies.
Ensure proper data governance, integrity, and compliance across integrations, especially in hybrid cloud environments.
Identify, evaluate, and select appropriate Boomi connectors and components based on system integration requirements.

REQUIRED SKILLS:
Self-Management - You need to possess the drive and ability to deliver on projects without constant supervision.
Technical - This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
Logic - You need the ability to work through and make logical sense of complicated and often abstract solutions and processes.
Language - Client has a global footprint, with offices and clients around the globe. The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
Communication - Your daily job will regularly require communication with Client team members. The ability to communicate clearly, on a technical level, is essential to your job. This includes both verbal and written communication.

ESSENTIAL SKILLS AND QUALIFICATIONS:
Bachelor's degree in technology or a related field required.
8+ years of experience in the middleware integration space, including 6+ years specifically with the Boomi integration platform.
Mandatory Boomi Certifications:
Boomi Associate Integration Developer
Boomi Professional Integration Developer
Boomi Associate Event Streams
Boomi API Design
Boomi Professional API Management
Boomi Associate EDI for X12
Boomi Integration Administrator
Boomi Development and Application Architecture
Strong understanding of integration requirements and the ability to validate them effectively.
Experience with programming languages such as Java and Groovy scripting is a plus.
Strong knowledge of API formats (JSON, XML) and Web Service Protocols (SOAP, REST).
Expertise in API security practices and methods for authentication and authorization of access.
Proficient in leveraging API testing methodologies, including unit testing, integration testing, and contract testing.
Familiarity with tools like Postman, SoapUI, and automated testing frameworks.
Proven ability to establish and maintain strong relationships, prioritize effectively, and adapt to changing priorities.
Proficiency in managing technical and administrative resources while effectively communicating with non-technical stakeholders.
Strong analytical and problem-solving skills, with attention to detail.
Ability to function well in a fast-paced and, at times, high-stress environment.
Excellent communication skills, both verbal and written, are essential.

ENVIRONMENT:
100% performed in a climate-controlled internal office environment under normal office conditions. While performing the duties of this job, the employee is regularly required to sit; stand; walk; use hands and fingers to feel and handle; reach with arms and hands; and talk and hear. The employee is frequently required to stoop, kneel, and crouch, and to lift weight or exert a force up to a maximum of 13 kg.

ADDITIONAL:
Follow the Company HR Policy, the Code of Business Conduct, and department policies and procedures, including protecting confidential company information, attending work punctually and regularly, and following good safety practices in all activities. The responsibilities associated with this job will change from time to time in accordance with the Company's business needs. More specifically, the incumbent may be required to perform additional and/or different responsibilities from those set forth above. The above declarations are not intended to be an all-inclusive list of the duties and responsibilities of the job described, nor are they intended to be such a listing of the skills and abilities required to do the job. Rather, they are intended to describe the general nature of the job.
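A core responsibility above is building Boomi maps and profiles that translate payloads between systems (e.g., CRM to ERP). Purely as an illustration in plain Python (the field names and systems are hypothetical; Boomi expresses this declaratively in its map shape), the idea of a field map applied to a JSON payload can be sketched as:

```python
import json

# Hypothetical field map from a source CRM payload to a target ERP profile --
# a plain-Python stand-in for what a Boomi map configures graphically.
FIELD_MAP = {
    "customer_id": "AccountNumber",
    "full_name": "AccountName",
    "email": "ContactEmail",
}

def map_payload(source_json: str) -> str:
    """Apply the field map, dropping source fields the target profile lacks."""
    source = json.loads(source_json)
    target = {dst: source[src] for src, dst in FIELD_MAP.items() if src in source}
    # sort_keys gives a stable serialization for comparison and logging.
    return json.dumps(target, sort_keys=True)
```

Keeping the mapping as data (rather than code) mirrors how Boomi separates map configuration from process logic, which keeps individual field changes low-risk.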
Not specified
INR 4.0 - 8.0 Lacs P.A.
Work from Office
Full Time
We are seeking a skilled and dedicated Boomi Administrator to manage and optimize our Boomi Integration Platform. The ideal candidate will be responsible for the administration, configuration, and maintenance of Boomi environments.

Complete Description:
The Boomi Administrator will work to ensure the seamless integration of applications and efficient data flow across the organization.

Key Responsibilities:
Configure, deploy, and maintain Boomi integration processes, including workflows, connectors, and mappings.
Manage and maintain the infrastructure, OS, and the deployed software.
Monitor and manage Boomi environments to ensure high availability, performance, and reliability.
Troubleshoot and resolve issues related to Boomi integrations and data flows.
Collaborate with business and technical teams to understand integration requirements and design solutions.
Develop and implement integration processes, including data transformation, routing, and error handling.
Ensure adherence to integration best practices and standards.
Set up and manage monitoring tools to track the performance and health of integrations.
Analyze and optimize integration processes for performance and scalability.
Generate and review reports on integration metrics and system performance.
Document integration processes, configurations, and procedures for future reference and training.
Provide technical support and guidance to users and other team members.
Maintain up-to-date knowledge of Boomi features and updates.
Ensure compliance with security policies and data protection regulations in all integrations.
Manage user access and permissions within the Boomi platform.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Boomi certifications.
Experience with cloud platforms (e.g., AWS, Azure) and other integration tools.
Familiarity with enterprise applications (e.g., CRM, ERP) and their integration requirements.
Strong understanding of integration concepts, data transformation, and API management.
Experience with Boomi connectors, process design, and error handling.
Proficiency in scripting and programming languages (e.g., JavaScript, SQL) is a plus.

Skills (all Required):
Bachelor's degree in Computer Science, Information Technology, or a related field.
Boomi certifications.
Experience with cloud platforms (e.g., AWS, Azure) and other integration tools.
Familiarity with enterprise applications (e.g., CRM, ERP) and their integration requirements.
Strong understanding of integration concepts, data transformation, and API management.
Proficiency in scripting and programming languages (e.g., JavaScript, SQL) is a plus.
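The administrator role above includes monitoring environments and reporting on integration metrics. As a minimal sketch in plain Python (the process names and "COMPLETE"/"ERROR" statuses are illustrative assumptions, not a Boomi API), the kind of health metric such a report summarizes can be computed like this:

```python
from collections import Counter

def success_rates(executions):
    """Per-process success rate from (process_name, status) execution records --
    the sort of figure a Boomi admin would pull from process reporting."""
    totals, successes = Counter(), Counter()
    for process, status in executions:
        totals[process] += 1
        if status == "COMPLETE":  # assumed success status label
            successes[process] += 1
    # Fraction of successful executions per process.
    return {p: successes[p] / totals[p] for p in totals}
```

A rate dipping below an agreed threshold for any process would be the trigger for the troubleshooting duties listed above.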