Download the following files from Oracle:

- instantclient-basic-$VERSION-macosx-x64.zip
- instantclient-sdk-$VERSION-macosx-x64.zip

Edit ~/.zshrc or ~/.bashrc and add the following:
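The note does not include the exact lines to add; a typical Instant Client setup looks something like this (the unzip location `$HOME/instantclient` is an assumption — point it at wherever you extracted the two archives):

```shell
# Assumed unzip location -- adjust to match your install path
export ORACLE_HOME=$HOME/instantclient
# macOS uses DYLD_LIBRARY_PATH to locate the Instant Client shared libraries
export DYLD_LIBRARY_PATH=$ORACLE_HOME:$DYLD_LIBRARY_PATH
export PATH=$ORACLE_HOME:$PATH
```

Run `source ~/.zshrc` (or open a new terminal) for the changes to take effect.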
```python
import signal, functools  # the two libraries used below

class TimeoutError(Exception):  # raised when the call exceeds the time limit
    pass

def timeout(seconds, error_message='Function call timed out'):
    def decorated(func):
        def _handle_timeout(signum, frame):
            raise TimeoutError(error_message)
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            signal.signal(signal.SIGALRM, _handle_timeout)
            signal.alarm(seconds)  # deliver SIGALRM after `seconds`
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)  # cancel the pending alarm
        return wrapper
    return decorated
```
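A quick usage sketch of the decorator; its body is repeated in condensed form here so the snippet runs standalone (note this relies on `SIGALRM`, so it is Unix-only):

```python
import signal, time

class TimeoutError(Exception):
    pass

def timeout(seconds, error_message='Function call timed out'):
    # condensed copy of the decorator above, so this example is self-contained
    def decorated(func):
        def wrapper(*args, **kwargs):
            def _handle(signum, frame):
                raise TimeoutError(error_message)
            signal.signal(signal.SIGALRM, _handle)
            signal.alarm(seconds)
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)
        return wrapper
    return decorated

@timeout(1)
def slow_call():
    time.sleep(3)  # longer than the 1-second limit, so SIGALRM fires first

try:
    slow_call()
    result = "finished"
except TimeoutError as exc:
    result = str(exc)  # 'Function call timed out'
```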
http://geekgirl.io/concurrent-http-requests-with-python3-and-asyncio/
My friend, a data scientist, had whipped up a script that made a large number of queries (over 27K) to the Google Places API. The problem was that it ran synchronously, and thus took over 2.5 hours to complete.
Given that I'm currently attending Hacker School and get to spend all day working on any coding problem that interests me, I decided to try to optimise it.
I'm new to Python, so I had to do a bit of groundwork first to determine which approach was best.
```python
"""Comparison of fetching web pages sequentially vs. asynchronously

Requirements: Python 3.5+, Requests, aiohttp, cchardet

For a walkthrough see this blog post:
http://mahugh.com/2017/05/23/http-requests-asyncio-aiohttp-vs-requests/
"""
import asyncio
from timeit import default_timer
```
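The core of the asynchronous approach is scheduling all requests at once and awaiting them together with `asyncio.gather`. The sketch below illustrates that pattern; `asyncio.sleep` stands in for a real HTTP call (e.g. an `aiohttp` request) so the example runs without network access, and the URLs are placeholders:

```python
import asyncio
from timeit import default_timer

async def fetch(url, delay=0.1):
    # stand-in for an HTTP request; asyncio.sleep yields control
    # just like awaiting a network response would
    await asyncio.sleep(delay)
    return url

async def fetch_all(urls):
    # schedule every request at once and await them concurrently
    return await asyncio.gather(*(fetch(u) for u in urls))

urls = [f"https://example.com/{i}" for i in range(10)]
start = default_timer()
results = asyncio.run(fetch_all(urls))
elapsed = default_timer() - start
# ten 0.1 s "requests" overlap, finishing in roughly 0.1 s rather than 1 s
```

Run sequentially, the same ten calls would take the sum of the delays; run concurrently, they take roughly the longest single delay — the same effect that cut the 2.5-hour script down to size.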