Google Cloud Platform Data Architect - Phoenix, AZ at Geebo

Google Cloud Platform Data Architect

We have a job opportunity for you as a Google Cloud Platform Data Architect; our client is based in Phoenix, AZ. Please find the job description below. If you are available and interested, please send us a Word copy of your resume with the following details:

Position: Google Cloud Platform Data Architect
Location: Phoenix, AZ
Contract: 12 Months
Job Description:
  • Bachelor's degree in Engineering or Computer Science (or equivalent), or a Master's in Computer Applications (or equivalent).
  • Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform is a must.
  • Create detailed target-state technical, security, data, and operational architecture and design blueprints incorporating modern data technologies and cloud data services, demonstrating the modernization value proposition.
  • Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more Google Cloud Platform data and analytics services in combination with third-party tools: Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam/Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
  • Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to Google Cloud Platform.
  • Experience with data lake and data warehouse ETL build and design.
  • Experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Dataproc, Cloud Functions, BigQuery, and Bigtable.
  • Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.

Recommended Skills: Apache Hive, Apache Spark, Architecture, Assessments, BigQuery, Bigtable, Implementations.
Estimated Salary: $20 to $28 per hour based on qualifications.
