12 - 22 years
35 - 65 Lacs
Chennai
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 8 - 24 Yrs
Location: Pan India
Job Description: Candidates should have a minimum of 2 years of hands-on experience as an Azure Databricks Architect.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.
With regards,
Sankar G
Sr. Executive - IT Recruitment
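For context on the Azure Databricks Architect role above, here is a minimal PySpark sketch of the kind of pipeline such an architect typically designs or reviews. It is an illustration only; the table and column names (raw.sales, curated.sales_daily, amount, order_id) are assumptions, not taken from the posting.

```python
# Minimal PySpark sketch: aggregate a raw sales table into a curated daily rollup.
# On Databricks the `spark` session already exists; it is created here to stay self-contained.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_daily_rollup").getOrCreate()

sales = spark.read.table("raw.sales")  # hypothetical source table
daily = (
    sales
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"),
         F.countDistinct("order_id").alias("orders"))
)
daily.write.mode("overwrite").saveAsTable("curated.sales_daily")  # hypothetical target table
```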
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Chennai
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 8 - 24 Yrs
Location: Pan India
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Responsibilities include:
• Build dimensional data models applying best practices and providing business insights.
• Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
• Identify business needs and translate business requirements into conceptual, logical, physical, and semantic models: multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
• Create and maintain the Source-to-Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
• Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
• Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
• Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
• Optimize and update logical and physical data models to support new and existing projects.
• Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
• Handle data design and performance optimization for large data warehouse solutions.
• Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
• Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
• Good verbal and written communication skills are required.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.
With regards,
Sankar G
Sr. Executive - IT Recruitment
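To illustrate the dimensional (star) modeling mentioned in the posting above, here is a small runnable sketch using Python's built-in sqlite3 module. The fact and dimension tables (dim_customer, dim_date, fact_sales) are hypothetical examples, not part of any client's actual model.

```python
# Illustrative star schema: one fact table keyed to two dimension tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT NOT NULL,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
""")
conn.commit()
```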
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Kolkata
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 8 - 24 Yrs
Location: Pan India
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Responsibilities include:
• Build dimensional data models applying best practices and providing business insights.
• Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
• Identify business needs and translate business requirements into conceptual, logical, physical, and semantic models: multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
• Create and maintain the Source-to-Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
• Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
• Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
• Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
• Optimize and update logical and physical data models to support new and existing projects.
• Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
• Handle data design and performance optimization for large data warehouse solutions.
• Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
• Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
• Good verbal and written communication skills are required.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.
With regards,
Sankar G
Sr. Executive - IT Recruitment
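One deliverable named above is a Source-to-Target Data Mapping document. Below is a small Python sketch that emits such a mapping as a CSV file; the source systems, column names, and transformation rules are invented placeholders, not the client's actual mappings.

```python
# Hypothetical source-to-target mapping written out as a CSV document.
import csv

mappings = [
    # (source table, source column, target table, target column, transformation rule)
    ("crm.customers", "cust_id",   "dim_customer", "customer_key",  "direct copy"),
    ("crm.customers", "cust_name", "dim_customer", "customer_name", "trim + title case"),
    ("erp.orders",    "order_amt", "fact_sales",   "amount",        "cast to decimal(18,2)"),
]

with open("source_to_target_mapping.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["source_table", "source_column", "target_table",
                     "target_column", "transformation_rule"])
    writer.writerows(mappings)
```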
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Noida
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 8 - 24 Yrs
Location: Pan India
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Responsibilities include:
• Build dimensional data models applying best practices and providing business insights.
• Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
• Identify business needs and translate business requirements into conceptual, logical, physical, and semantic models: multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
• Create and maintain the Source-to-Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
• Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
• Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
• Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
• Optimize and update logical and physical data models to support new and existing projects.
• Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
• Handle data design and performance optimization for large data warehouse solutions.
• Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
• Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
• Good verbal and written communication skills are required.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.
With regards,
Sankar G
Sr. Executive - IT Recruitment
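The role above also calls for data profiling (formats, valid values, boundaries, null checks). A minimal profiling pass with pandas might look like the sketch below; the input file name and its columns are assumptions used only for illustration.

```python
# Minimal data-profiling pass: per-column type, null count, distinct count, and min/max.
import pandas as pd

df = pd.read_csv("orders.csv")  # hypothetical input file

profile = pd.DataFrame({
    "dtype":    df.dtypes.astype(str),
    "nulls":    df.isna().sum(),
    "distinct": df.nunique(),
    "min":      df.min(numeric_only=True),   # non-numeric columns show as NaN here
    "max":      df.max(numeric_only=True),
})
print(profile)
```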
Posted 1 month ago
9 - 12 years
6 - 16 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Job Title: AWS Architect
Location: Mumbai
Job Type: Full-Time
About Us: Capgemini is a leading IT company dedicated to delivering innovative solutions and services. We are seeking a skilled AWS Architect to join our dynamic team and help us design and implement scalable, reliable, and secure cloud solutions.
Job Description:
Responsibilities:
• Design and implement AWS cloud solutions that meet business requirements.
• Collaborate with cross-functional teams to define, design, and deliver new features.
• Develop and maintain cloud infrastructure using AWS services such as EC2, S3, RDS, Lambda, and VPC.
• Ensure the security, scalability, and reliability of cloud solutions.
• Provide technical leadership and guidance to development teams.
• Conduct performance tuning, monitoring, and optimization of cloud infrastructure.
• Stay up to date with the latest AWS services and features to recommend improvements.
• Troubleshoot and resolve issues related to cloud infrastructure and services.
• Create and maintain documentation for cloud architecture and best practices.
Requirements:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as an AWS Architect or in a similar role.
• In-depth knowledge of AWS services including EC2, S3, RDS, Lambda, VPC, CloudFormation, and IAM.
• Experience with infrastructure as code (IaC) tools such as Terraform or AWS CloudFormation.
• Strong understanding of networking, security, and database management in the cloud.
• Excellent problem-solving skills and the ability to work in a fast-paced environment.
• Strong communication and collaboration skills.
• AWS Certified Solutions Architect (Associate or Professional) is a plus.
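The requirements above emphasize hands-on work with core AWS services and infrastructure automation. As a small, hedged illustration (not Capgemini's stack), here is a boto3 sketch that provisions a versioned, encrypted S3 bucket; the bucket name and region are placeholder assumptions.

```python
# Sketch: create a versioned, encrypted S3 bucket with boto3.
import boto3

region = "ap-south-1"                                  # assumption
bucket = "example-architecture-artifacts-bucket"       # hypothetical name; must be globally unique

s3 = boto3.client("s3", region_name=region)

s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": region},
)
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)
```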
Posted 2 months ago
10 - 15 years
15 - 25 Lacs
Kolar
Hybrid
Required Qualifications:
• Looking for an AWS Architect or equivalent profile with a Java/Spring Boot background (must have).
• Minimum 10+ years of experience.
• Good hands-on experience with core AWS services.
• Understand the current application infrastructure and suggest changes to it.
• Enhance cloud capability by creating and implementing cloud application patterns.
• Develop and implement ways to move apps and workloads to the cloud.
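"Understand the current application infrastructure" typically starts with an inventory of what is already running. Here is a minimal boto3 sketch (the region is an assumption) that lists existing EC2 instances and RDS databases as a first assessment step.

```python
# Sketch: inventory existing EC2 instances and RDS databases before proposing changes.
import boto3

region = "ap-south-1"  # assumption
ec2 = boto3.client("ec2", region_name=region)
rds = boto3.client("rds", region_name=region)

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print("EC2:", instance["InstanceId"], instance["InstanceType"], instance["State"]["Name"])

for db in rds.describe_db_instances()["DBInstances"]:
    print("RDS:", db["DBInstanceIdentifier"], db["Engine"], db["DBInstanceClass"])
```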
Posted 2 months ago
10 - 15 years
15 - 25 Lacs
Mandya
Hybrid
Required Qualifications:
• Looking for an AWS Architect or equivalent profile with a Java/Spring Boot background (must have).
• Minimum 10+ years of experience.
• Good hands-on experience with core AWS services.
• Understand the current application infrastructure and suggest changes to it.
• Enhance cloud capability by creating and implementing cloud application patterns.
• Develop and implement ways to move apps and workloads to the cloud.
Posted 2 months ago
10 - 15 years
15 - 25 Lacs
Hosur
Hybrid
Required Qualifications:
• Looking for an AWS Architect or equivalent profile with a Java/Spring Boot background (must have).
• Minimum 10+ years of experience.
• Good hands-on experience with core AWS services.
• Understand the current application infrastructure and suggest changes to it.
• Enhance cloud capability by creating and implementing cloud application patterns.
• Develop and implement ways to move apps and workloads to the cloud.
Posted 2 months ago
10 - 15 years
15 - 25 Lacs
Pune
Hybrid
Required Qualifications:
• Looking for an AWS Architect or equivalent profile with a Java/Spring Boot background (must have).
• Minimum 10+ years of experience.
• Good hands-on experience with core AWS services.
• Understand the current application infrastructure and suggest changes to it.
• Enhance cloud capability by creating and implementing cloud application patterns.
• Develop and implement ways to move apps and workloads to the cloud.
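"Develop and implement ways to move apps and workloads to the cloud" often begins with staging application artifacts and data in cloud storage. Below is a small boto3 sketch that uploads a local directory to S3; the local path and bucket name are assumptions for illustration only.

```python
# Sketch: copy a local directory of application artifacts into S3 as a first migration step.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
bucket = "example-migration-staging"   # hypothetical bucket
source_dir = Path("build/artifacts")   # hypothetical local path

for path in source_dir.rglob("*"):
    if path.is_file():
        key = path.relative_to(source_dir).as_posix()
        s3.upload_file(str(path), bucket, key)
        print("uploaded", key)
```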
Posted 2 months ago
12 - 20 years
20 - 35 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Urgent requirement for AWS Architect
Location: Pan India
Experience: 12+ Yrs
Skillsets:
• Strong experience in architectural design on AWS Cloud
• Strong IT infrastructure experience
• Experience with the CloudFormation automation tool
• Experience with DevSecOps
• Experience with transit gateways
• AWS network and security practices
Share your updated resume with namrata@kksoftwareassociates.com
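Since the posting above asks for CloudFormation automation experience, here is a minimal boto3 sketch that creates a stack from an inline template. The template (a single VPC), stack name, and region are illustrative assumptions, not the client's actual infrastructure.

```python
# Sketch: launch a CloudFormation stack from an inline YAML template and wait for completion.
import boto3

template = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  AppVpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
      EnableDnsSupport: true
      EnableDnsHostnames: true
"""

cfn = boto3.client("cloudformation", region_name="ap-south-1")  # region is an assumption
cfn.create_stack(StackName="example-network-stack", TemplateBody=template)

waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="example-network-stack")
print("stack created")
```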
Posted 2 months ago
10 - 18 years
18 - 25 Lacs
Bengaluru
Work from Office
SA Hybrid Cloud (AWS and Azure)
We are seeking an experienced Cloud Engineer to design, implement, and manage scalable cloud solutions. The ideal candidate should have a deep understanding of cloud services, cloud security, automation, and DevOps practices.
Key Responsibilities:
• Design and implement AWS/Azure cloud infrastructure following best practices for scalability, security, and cost optimization.
• Deploy and manage cloud resources using Infrastructure as Code (IaC) tools such as Terraform, AWS CloudFormation, or CDK.
• Ensure high availability and disaster recovery strategies using AWS services like Auto Scaling, Route 53, CloudFront, and RDS.
• Implement cloud security best practices, including IAM, security groups, encryption, and compliance frameworks.
• Monitor cloud performance and optimize costs using AWS CloudWatch, AWS Config, and AWS Cost Explorer.
• Automate cloud deployments, provisioning, and configuration management using CI/CD pipelines, Ansible, and AWS CodePipeline.
• Manage and troubleshoot AWS networking (VPC, VPN, Transit Gateway, Direct Connect, etc.).
• Work with containerized applications using Docker, Kubernetes (EKS), and AWS Fargate.
• Support serverless architectures using AWS Lambda, API Gateway, and DynamoDB.
• Collaborate with cross-functional teams to drive cloud adoption and migration strategies.
Required Skills & Qualifications:
• Implement a hybrid AWS infrastructure and regional public cloud covering key domains such as network configuration, resource deployment management, security and compliance, operational integration and automation, tagging strategy, and management groups, delivered through Infrastructure as Code via AWS CloudFormation.
• Develop and maintain comprehensive network configurations to ensure seamless integration between on-premises and cloud environments.
• Experienced in on-premises-to-cloud migrations.
• Strong cloud-native experience, including PaaS services.
• Strong background in hybrid cloud solutions, including modern networking and interconnect technologies.
• Hands-on experience with major hypervisor, server, storage, and network technologies.
• Knowledge of cloud automation/orchestration software.
• Hands-on experience in the design and implementation of private/public/hybrid cloud environments using the VMware product suite.
• Hands-on/design experience with VMware ESXi, vCenter, vRealize Suite, vRA, and vRO.
• Expertise in designing and implementing private/public/hybrid clouds.
• 5+ years of experience working with AWS cloud technologies.
• Strong expertise in core AWS services like EC2, S3, RDS, Lambda, IAM, VPC, and CloudFormation.
• Experience with Infrastructure as Code (IaC) tools like Terraform, AWS CloudFormation, or CDK.
• Proficiency in scripting/programming languages such as Python, Bash, or PowerShell.
• Hands-on experience with AWS security and compliance frameworks.
• Strong knowledge of AWS networking (VPC, Route 53, ALB, NLB, VPN, Direct Connect, etc.).
• Experience with CI/CD tools like Jenkins, GitHub Actions, AWS CodeDeploy, and CodePipeline.
• Familiarity with container orchestration tools like Kubernetes (EKS) and AWS Fargate.
• Strong troubleshooting and problem-solving skills in AWS environments.
Preferred Qualifications:
• AWS certifications such as AWS Certified Solutions Architect, AWS Certified DevOps Engineer, or AWS Certified Security - Specialty.
• Experience with multi-cloud environments (Azure, Google Cloud) is a plus.
• Knowledge of serverless computing and event-driven architectures.
• Experience with observability and monitoring tools (AWS CloudWatch, ELK, Prometheus, Grafana).
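Monitoring with CloudWatch is listed among the responsibilities above. As a small, hedged illustration, the boto3 sketch below creates a CPU alarm for a single EC2 instance; the instance ID, SNS topic ARN, threshold, and region are placeholder assumptions.

```python
# Sketch: CloudWatch alarm on EC2 CPU utilisation (IDs, topic ARN, and threshold are placeholders).
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # region is an assumption

cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # hypothetical instance
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:111122223333:example-alerts"],  # hypothetical SNS topic
)
```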
Posted 3 months ago
8 - 10 years
20 - 25 Lacs
Pune
Hybrid
• Hands-on experience with AWS Glue or Databricks, PySpark, and Python.
• Minimum of 2 years of hands-on expertise in PySpark, including Spark job performance optimization techniques.
• Minimum of 2 years of hands-on involvement with AWS Cloud.
• Hands-on experience with Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch.
• Proficiency in crafting low-level designs for data warehousing solutions on AWS cloud.
• Proven track record of implementing big-data solutions within the AWS ecosystem, including data lakes.
• Familiarity with data warehousing, data quality assurance, and monitoring practices.
• Demonstrated capability in constructing scalable data pipelines and ETL processes.
• Proficiency in testing methodologies and validating data pipelines.
• Experience with or working knowledge of DevOps environments.
• Practical experience with data security services.
• Understanding of data modeling, integration, and design principles.
• Strong communication and analytical skills.
• A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail.
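The posting above highlights Spark job performance optimization. A common technique is broadcasting the small dimension side of a join and partitioning the output by date, sketched below in PySpark; the S3 paths, table, and column names are illustrative assumptions only.

```python
# Sketch: broadcast the small dimension side of a join, then write partitioned output.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("orders_enriched").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")        # large fact data (hypothetical path)
customers = spark.read.parquet("s3://example-bucket/raw/customers/")  # small dimension (hypothetical path)

enriched = (
    orders.join(broadcast(customers), on="customer_id", how="left")
          .withColumn("order_date", F.to_date("order_ts"))
)

(enriched.repartition("order_date")
         .write.mode("overwrite")
         .partitionBy("order_date")
         .parquet("s3://example-bucket/curated/orders_enriched/"))
```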
Posted 3 months ago