Testingxperts

Big Data/ Hadoop with AWS Cloud and Spark Experience

Role

Big Data/ Hadoop with AWS Cloud and Spark Experience

Job type

Contract

Posted

106 months ago

Salary

Not disclosed by employer

Job description

Hi,

Hope you are doing great.

This is Siva Maddineni. We are currently looking for a Big Data consultant in Madison, WI (GC or USC candidates only).

At the time of submission, I need visa and ID proof.

Check with your consultants before submitting. Forward profiles only if they are comfortable with this requirement.

Client: Infosys

Title: Big Data

Location: Madison, WI (GC or USC candidates only)

Duration: 6 Months

Experience Needed: Minimum 9+ years

Rate: $55-60/hr on W2

Job Description:

Must-Have Skills:

1. AWS cloud

2. Big Data Hadoop

3. Spark

4. HBase

5. Azure

Detailed Job Description: Responsible for architecting, designing, and developing cloud infrastructure and applications. Provide operational and technical expertise to initiatives for the Big Data analytics platform, data, and applications, addressing a broad range of technologies including Big Data Hadoop, Spark, Kafka, HBase, Cassandra, and Elasticsearch on Linux hosted on Azure or AWS cloud.

Top 3 responsibilities you would expect the subcontractor to shoulder and execute:

1. Responsible for architecting, designing, and developing cloud infrastructure and applications.

2. Provide operational and technical expertise to initiatives for the Big Data analytics platform, data, and applications, addressing a broad range of technologies including Big Data Hadoop, Spark, Kafka, HBase, Cassandra, and Elasticsearch on Linux hosted on Azure or AWS cloud.

3. Research, architect, design, and deploy new tools, frameworks, and patterns to build a sustainable big data platform.

All your information will be kept confidential according to EEO guidelines.
