Experience: 8 - 13 years
Salary: 40 - 45 Lacs P.A.
Posted: 1 month ago
Work from Office
Full Time
Responsibilities:
- Design and articulate enterprise-scale data architectures incorporating multiple platforms, including open-source and proprietary data platform solutions (Databricks, Snowflake, and Microsoft Fabric), to address customer requirements in data engineering, data science, and machine learning use cases
- Conduct technical discovery sessions with clients to understand their data architecture, analytics needs, and business objectives
- Design and deliver proofs of concept (POCs) and technical demonstrations that showcase modern data platforms solving real-world problems
- Create comprehensive architectural diagrams and implementation roadmaps for complex data ecosystems spanning cloud and on-premises environments
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on specific customer requirements
- Lead technical responses to RFPs (Requests for Proposal), crafting detailed solution architectures, technical approaches, and implementation methodologies
- Create and review techno-commercial proposals, including solution scoping, effort estimation, and technology selection justifications
- Collaborate with sales and delivery teams to develop competitive, technically sound proposals with appropriate pricing models for data solutions
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field
- 8+ years of experience in data architecture, data engineering, or solution architecture roles
- Proven experience responding to RFPs and developing techno-commercial proposals for data solutions
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines
- Hands-on experience with multiple data platforms, including Databricks, Snowflake, and Microsoft Fabric
- Strong understanding of big data technologies, including the Hadoop ecosystem, Apache Spark, and Delta Lake
- Experience with modern data processing frameworks such as Apache Kafka and Airflow
- Proficiency in cloud platforms (AWS, Azure, GCP) and their respective data services
- Knowledge of system monitoring and observability tools
- Experience implementing automated testing frameworks for data platforms and pipelines
- Expertise in both relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB)
- Understanding of AI/ML technologies and their integration with data platforms
- Familiarity with data integration patterns, ETL/ELT processes, and data governance practices
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL)
- Strong problem-solving skills and the ability to think creatively to address customer challenges
- Relevant certifications such as Databricks, Snowflake, Azure Data Engineer, or AWS Data Analytics are a plus
- Willingness to travel as required to meet with customers and attend industry events

If interested, please contact Ramya: 9513487487, 9342164917
INTELLI SEARCH
Location: Noida, Gurugram