
Keep getting asked for credentials in CLI? #1


Open
nimsim opened this issue Apr 23, 2018 · 9 comments

Comments

@nimsim

nimsim commented Apr 23, 2018

Heya,

This was just what I was looking for, but it seems I'm having major difficulties getting it to actually run.

I've done exactly what you've written, and tried both with an API key and without. I keep getting this error:

File "/env/local/lib/python2.7/site-packages/pandas_gbq/gbq.py", line 194, in get_credentials
    credentials = self.get_user_account_credentials()
File "/env/local/lib/python2.7/site-packages/pandas_gbq/gbq.py", line 370, in get_user_account_credentials
    credentials = app_flow.run_console()
File "/env/local/lib/python2.7/site-packages/google_auth_oauthlib/flow.py", line 362, in run_console
    code = input(authorization_code_message)
EOFError: EOF when reading a line

Do you have any idea why?
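(For context on the traceback above: `run_console()` ultimately calls the built-in `input()`, which raises exactly this `EOFError` whenever no interactive stdin is attached, as in an App Engine or other non-interactive environment. A minimal stdlib-only reproduction, with a made-up stand-in for the OAuth prompt:)

```python
# Reproduce the failure mode: input() raises EOFError when stdin has no data,
# which is what happens when run_console() prompts in a non-interactive env.
import io
import sys

def prompt_for_code():
    # hypothetical stand-in for the authorization-code prompt
    # inside google_auth_oauthlib's run_console()
    return input("Enter the authorization code: ")

sys.stdin = io.StringIO("")  # simulate an environment with no usable stdin
try:
    prompt_for_code()
except EOFError as exc:
    print("EOFError:", exc)
```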

@sungchun12
Owner

Hey nimsim!

I recommend updating requirements.txt with the latest pandas-gbq package: https://pandas-gbq.readthedocs.io/en/latest/

Also, I'd double-check that all your APIs are enabled and that the project and dataset IDs in the Python script match the ones in your BigQuery interface.
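(One possible workaround, not from the original thread: newer pandas-gbq releases accept a `credentials` argument, so a service-account key sidesteps the interactive `run_console()` prompt entirely. A sketch under those assumptions; the key path and table names are placeholders, and the GCP imports are deferred so the sketch can be defined without the libraries installed:)

```python
def append_with_service_account(df, destination_table, project_id, key_path):
    """Append a DataFrame to BigQuery without an interactive OAuth prompt.

    Sketch only: assumes google-auth and a recent pandas-gbq are installed,
    and that key_path points at a service-account JSON key that has
    BigQuery permissions on the target project.
    """
    from google.oauth2 import service_account  # provided by google-auth
    creds = service_account.Credentials.from_service_account_file(key_path)
    # pandas-gbq >= 0.8 accepts credentials= instead of prompting the user
    df.to_gbq(destination_table, project_id=project_id,
              if_exists="append", credentials=creds)
```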

@nimsim
Author

nimsim commented Apr 24, 2018 via email

@nimsim
Author

nimsim commented Apr 24, 2018

Updated requirements.txt with the latest package, but no go :(

@sungchun12
Owner

The script calls the API using the token from Socrata.

Looking at the error in more detail, it definitely has to do with the pandas-gbq package and access issues with BigQuery.

Try opening the "append_data" script in a Datalab notebook and running it manually to see where it breaks.

Send me a screenshot of your append_data code. I have a hunch the parameters are entered incorrectly.

@nimsim
Author

nimsim commented Apr 24, 2018

Thanks for looking into it, append_data below.

Trying datalab later tonight

#this script appends live Chicago traffic data into BigQuery, there will be duplicates
#but that's accounted for with a saved view removing duplicates using SQL

from __future__ import print_function, absolute_import #package to smooth over python 2 and 3 differences
import pandas as pd #package for dataframes
from sodapy import Socrata #package for open source api
from google.datalab import Context #package for datalab
import time
from datetime import datetime, timedelta
import logging #package for error logging

#tracks error messaging
logging.basicConfig(level=logging.INFO)

# Unauthenticated client only works with public data sets. Note 'None'
# in place of application token, and no username or password:
#client = Socrata("data.cityofchicago.org", None)

#indent the run function by 1 tab
def run():
# Example authenticated client (needed for non-public datasets):
	client = Socrata("data.cityofchicago.org", "tokenremoved")
	# First 2000 results, returned as JSON from API / converted to Python list of
	# dictionaries by sodapy.
	results = client.get("8v9j-bter", limit=2000)
	
	# Convert to pandas DataFrame
	results_df = pd.DataFrame.from_records(results)
	
	
	#load the dataframe directly into BigQuery; syntax: https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.to_gbq.html
	results_df.to_gbq('chicago_traffic.demo_data', "demo-nim", chunksize=2000, verbose=True, if_exists='append')
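(As a side note, the DataFrame step can be sanity-checked locally without hitting the Socrata API by faking the JSON records as plain dicts; the field names below are invented for illustration, and `drop_duplicates()` previews the de-duplication the saved SQL view performs:)

```python
import pandas as pd

# Fake stand-ins for the JSON records sodapy returns (field names invented
# for illustration; the real dataset's columns will differ).
records = [
    {"segment_id": "1", "speed": "25", "updated": "2018-04-24T10:00:00"},
    {"segment_id": "1", "speed": "25", "updated": "2018-04-24T10:00:00"},
    {"segment_id": "2", "speed": "30", "updated": "2018-04-24T10:00:00"},
]

# Same conversion the script uses before loading into BigQuery
results_df = pd.DataFrame.from_records(records)

# Preview the duplicate removal the saved view handles in SQL
deduped = results_df.drop_duplicates()
print(len(results_df), "->", len(deduped))  # 3 -> 2
```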

@sungchun12
Owner

Did you create a dataset ID named "chicago_traffic" and an empty table named "demo_data"?
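(For reference, the dataset can also be created programmatically rather than in the console. A sketch assuming the google-cloud-bigquery client library is installed and credentials are configured; the project and dataset names below echo the thread but are placeholders. The import is deferred so the sketch is readable without the library:)

```python
def ensure_dataset(project_id, dataset_id):
    """Create a BigQuery dataset if it doesn't exist yet.

    Sketch only: requires the google-cloud-bigquery package and working
    GCP credentials when actually called.
    """
    from google.cloud import bigquery  # deferred: GCP library optional here
    client = bigquery.Client(project=project_id)
    client.create_dataset(dataset_id, exists_ok=True)  # no-op if present

# e.g. ensure_dataset("demo-nim", "chicago_traffic")
```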

@nimsim
Author

nimsim commented Apr 25, 2018

I didn't create demo_data; I read it as if the script would create one itself if it didn't exist. Also, the instructions only mention creating the dataset ID.

Do you need at least one field in an empty table? Did you put in all the fields expected from the stream?
Edit: Ignore what I wrote above, that's not needed :)

@nimsim
Author

nimsim commented Apr 25, 2018

Tried setting up a new project and going through the setup again, this time with demo_data.
No go. I'm actually going to focus on using a Cloud Functions GET method to publish to Pub/Sub, and from Pub/Sub into BigQuery. Then I won't have to deal with App Engine at all. Hopefully it'll work :)

Thanks for the help, and sorry for asking so many questions.

@sungchun12
Owner

My mistake on creating the demo_data table: you shouldn't have to. Some weird access issue must be going on, or the latest packages aren't pinned correctly in the requirements.txt file. Strange that it's not working, since mine runs just fine.

If you need more help, feel free to ask! No need to apologize 😃

Would love to see your cloud functions demo when it's finished! I saw a couple medium blogs about it.
