You open a link to a private JupyterLab instance in the cloud, and it authenticates you with your GCP account. From JupyterLab you load data from a private file on Google Cloud Storage into BigQuery, right from the Lab. Then you run several queries against BigQuery and visualize the results, still inside JupyterLab. Finally, you train a TensorFlow model on the BigQuery data using a GPU attached to the JupyterLab instance, and upload the trained model back to GCS. All of this without ever leaving the managed JupyterLab. Sounds like magic, doesn't it? Now let me show you a screenshot of what you will have by the time you finish reading this article: