You will play a strategic and technical role requiring analytics and development knowledge and skills. This role requires collaboration within a team of technologists to produce enterprise-scale solutions for our clients' needs. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design decisions to ensure the overall health of the solution.
You can grow your career along a number of paths, such as NoSQL expert, cloud computing expert, or big data architect.
- Lead the design, development, and delivery of large-scale Azure data systems, data processing, and data transformation projects.
- Execute technical feasibility assessments and project estimates for moving databases and data processing to Azure.
- Design and advocate solutions using modern cloud technologies, design principles, integration points, and automation methods.
- Mentor and share knowledge with customers as well as provide architecture reviews, discussions, and prototypes.
- Participate in the overall engagement, from strategy and assessment through migration and implementation.
- Work with customers to deploy, manage, and audit best practices for cloud products.
- 3+ years of experience in big data application development
- Hands-on experience with the Hadoop stack (Hadoop MapReduce, HDFS, Pig, Hive, Sqoop)
- Experience with computation frameworks such as Spark, Storm, and Flink using Java/Scala
- Well versed in streaming data processing using Kafka, Spark Streaming, Storm, etc.
- Experience building big data applications in a cloud environment using Azure
- Excellent knowledge of NoSQL platforms such as HBase, MongoDB, Cassandra, etc.
- Good knowledge of database technology, with hands-on experience with databases such as Oracle and SQL Server
- Self-starter with a keen interest in technology and high motivation to succeed
- Excellent oral and written communication, presentation, and analytical skills
- Demonstrated experience designing, implementing, and supporting enterprise-grade technical solutions in the cloud for meeting complex business data requirements.
- Experience with Databricks and using Spark for data processing.
- Experience with Azure Data Factory (ADF).
- Advanced experience with different query languages (e.g., T-SQL, PL/SQL, PostgreSQL).
- Experience designing and building data marts, warehouses, customer profile databases, etc.
- Experience with data modeling, table design, and mapping business needs to data structures.
- Experience with Azure Data Lake, Azure SQL Data Warehouse, and Cosmos DB is a plus.
- Experience with Data Management Gateway, Azure Storage Options, Stream Analytics, and Event Hubs is a plus.
- Experience with other cloud-based big data architectures is a plus.
Your role focuses on the design, development, and delivery of solutions involving:
- Data Integration, Governance & Wrangling
- Data Storage and Computation Frameworks, Performance Optimizations
- Analytics & Visualizations
- Infrastructure & Cloud Computing
- Data Management Platforms