@kmike
Created January 21, 2016 09:51
Using worker: worker-linux-docker-4b82c5b2.prod.travis-ci.org:travis-linux-4
travis_fold:start:system_info
Build system information
Build language: python
Build group: stable
Build dist: precise
Build image provisioning date and time
Thu Feb 5 15:09:33 UTC 2015
Operating System Details
Distributor ID: Ubuntu
Description: Ubuntu 12.04.5 LTS
Release: 12.04
Codename: precise
Linux Version
3.13.0-29-generic
Cookbooks Version
a68419e https://github.com/travis-ci/travis-cookbooks/tree/a68419e
GCC version
gcc (Ubuntu/Linaro 4.6.3-1ubuntu5) 4.6.3
Copyright (C) 2011 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
LLVM version
clang version 3.4 (tags/RELEASE_34/final)
Target: x86_64-unknown-linux-gnu
Thread model: posix
Pre-installed Ruby versions
ruby-1.9.3-p551
Pre-installed Node.js versions
v0.10.36
Pre-installed Go versions
1.4.1
Redis version
redis-server 2.8.19
riak version
2.0.2
MongoDB version
MongoDB 2.4.12
CouchDB version
couchdb 1.6.1
Neo4j version
1.9.4
RabbitMQ Version
3.4.3
ElasticSearch version
1.4.0
Installed Sphinx versions
2.0.10
2.1.9
2.2.6
Default Sphinx version
2.2.6
Installed Firefox version
firefox 31.0esr
PhantomJS version
1.9.8
ant -version
Apache Ant(TM) version 1.8.2 compiled on December 3 2011
mvn -version
Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T17:29:23+00:00)
Maven home: /usr/local/maven
Java version: 1.7.0_76, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-7-oracle/jre
Default locale: en_US, platform encoding: ANSI_X3.4-1968
OS name: "linux", version: "3.13.0-29-generic", arch: "amd64", family: "unix"
travis_fold:end:system_info

3.5 is not installed; attempting download
travis_fold:start:git.checkout
travis_time:start:0a7e65bd
$ git clone --depth=50 https://github.com/scrapy/scrapy.git scrapy/scrapy
Cloning into 'scrapy/scrapy'...
remote: Counting objects: 5773, done.
remote: Compressing objects: 100% (2775/2775), done.
remote: Total 5773 (delta 4054), reused 4421 (delta 2969), pack-reused 0
Receiving objects: 100% (5773/5773), 1.86 MiB | 472.00 KiB/s, done.
Resolving deltas: 100% (4054/4054), done.
Checking connectivity... done.
travis_time:end:0a7e65bd:start=1453320326315793092,finish=1453320332092274126,duration=5776481034
$ cd scrapy/scrapy
travis_time:start:03635bc2
$ git fetch origin +refs/pull/1692/merge:
remote: Counting objects: 26, done.
remote: Compressing objects: 100% (9/9), done.
remote: Total 26 (delta 19), reused 23 (delta 17), pack-reused 0
Unpacking objects: 100% (26/26), done.
From https://github.com/scrapy/scrapy
* branch refs/pull/1692/merge -> FETCH_HEAD
travis_time:end:03635bc2:start=1453320332096001633,finish=1453320332481749107,duration=385747474
$ git checkout -qf FETCH_HEAD
travis_fold:end:git.checkout

This job is running on container-based infrastructure, which does not allow use of 'sudo', setuid and setgid executables.
If you require sudo, add 'sudo: required' to your .travis.yml
See https://docs.travis-ci.com/user/workers/container-based-infrastructure/ for details.
Setting environment variables from .travis.yml
$ export TOXENV=py33
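The TOXENV variable above comes from the job's build matrix. A minimal .travis.yml that would produce a job like this one might look as follows; this is a hypothetical reconstruction from the log, not Scrapy's actual file:

```yaml
# Hypothetical sketch inferred from this log; the real .travis.yml differs.
language: python
sudo: false            # container-based infrastructure, as the log notes
python:
  - "3.5"
env:
  - TOXENV=py33        # tox reads $TOXENV to pick which env to run
install:
  - pip install -U tox twine wheel codecov
script:
  - tox
```

With `sudo: false`, jobs run on the faster container-based workers but cannot escalate privileges; `sudo: required` would move the job to VM-based infrastructure.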
travis_fold:start:cache.1
Setting up build cache
$ export CASHER_DIR=$HOME/.casher
travis_time:start:0e003a44
$ Installing caching utilities
travis_time:end:0e003a44:start=1453320334804450822,finish=1453320335241143539,duration=436692717
travis_time:start:02027484
travis_time:end:02027484:start=1453320335245490279,finish=1453320335248782998,duration=3292719
travis_time:start:12b74a0c
attempting to download cache archive
fetching PR.1692/cache--python-3.5.tgz
fetching PR.1692/cache--python-3.5.tbz
fetching master/cache--python-3.5.tgz
found cache
travis_time:end:12b74a0c:start=1453320335252329331,finish=1453320336220100398,duration=967771067
travis_time:start:054d2754
travis_time:end:054d2754:start=1453320336224341353,finish=1453320336227488248,duration=3146895
travis_time:start:2021afaf
adding /home/travis/.cache/pip to cache
travis_time:end:2021afaf:start=1453320336231070209,finish=1453320337608960266,duration=1377890057
travis_fold:end:cache.1
travis_time:start:0925da7b
$ source ~/virtualenv/python3.5/bin/activate
travis_time:end:0925da7b:start=1453320337612899804,finish=1453320337616566521,duration=3666717
$ python --version
Python 3.5.0
$ pip --version
pip 7.1.2 from /home/travis/virtualenv/python3.5.0/lib/python3.5/site-packages (python 3.5)
travis_fold:start:install
travis_time:start:11cf0d9c
$ pip install -U tox twine wheel codecov
Collecting tox
Using cached tox-2.3.1-py2.py3-none-any.whl
Collecting twine
Using cached twine-1.6.5-py2.py3-none-any.whl
Requirement already up-to-date: wheel in /home/travis/virtualenv/python3.5.0/lib/python3.5/site-packages
Collecting codecov
Using cached codecov-1.6.3-py2.py3-none-any.whl
Requirement already up-to-date: py>=1.4.17 in /home/travis/virtualenv/python3.5.0/lib/python3.5/site-packages (from tox)
Collecting pluggy<0.4.0,>=0.3.0 (from tox)
Using cached pluggy-0.3.1-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2 (from tox)
Using cached virtualenv-14.0.0-py2.py3-none-any.whl
Collecting requests-toolbelt>=0.4.0 (from twine)
Using cached requests_toolbelt-0.5.1-py2.py3-none-any.whl
Collecting setuptools>=0.7.0 (from twine)
Using cached setuptools-19.4-py2.py3-none-any.whl
Collecting requests>=2.3.0 (from twine)
Using cached requests-2.9.1-py2.py3-none-any.whl
Collecting pkginfo>=1.0 (from twine)
Collecting coverage (from codecov)
Installing collected packages: pluggy, virtualenv, tox, requests, requests-toolbelt, setuptools, pkginfo, twine, coverage, codecov
Found existing installation: setuptools 12.0.5
Uninstalling setuptools-12.0.5:
Successfully uninstalled setuptools-12.0.5
Successfully installed codecov-1.6.3 coverage-4.0.3 pkginfo-1.2.1 pluggy-0.3.1 requests-2.9.1 requests-toolbelt-0.5.1 setuptools-19.4 tox-2.3.1 twine-1.6.5 virtualenv-14.0.0
You are using pip version 7.1.2, however version 8.0.0 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
travis_time:end:11cf0d9c:start=1453320337922135400,finish=1453320341939759166,duration=4017623766
travis_fold:end:install
travis_time:start:0038ca5e
$ tox
GLOB sdist-make: /home/travis/build/scrapy/scrapy/setup.py
py33 create: /home/travis/build/scrapy/scrapy/.tox/py33
py33 installdeps: -rrequirements-py3.txt, Pillow, -rtests/requirements-py3.txt
py33 inst: /home/travis/build/scrapy/scrapy/.tox/dist/Scrapy-1.1.0.dev1.zip
py33 installed: blessings==1.6,bpython==0.15,cffi==1.5.0,characteristic==14.3.0,coverage==4.0.3,cryptography==1.2.1,cssselect==0.9.1,curtsies==0.2.6,decorator==4.0.6,enum34==1.1.2,greenlet==0.4.9,idna==2.0,ipython==4.0.3,ipython-genutils==0.1.0,jmespath==0.9.0,leveldb==0.193,lxml==3.5.0,parsel==1.0.1,path.py==8.1.2,pexpect==4.0.1,pickleshare==0.6,Pillow==3.1.0,ptyprocess==0.5,py==1.4.31,pyasn1==0.1.9,pyasn1-modules==0.0.8,pycparser==2.14,PyDispatcher==2.0.5,Pygments==2.1,pyOpenSSL==0.15.1,pytest==2.7.3,pytest-cov==2.2.0,pytest-twisted==1.5,queuelib==1.4.2,requests==2.9.1,Scrapy==1.1.0.dev1,service-identity==14.0.0,simplegeneric==0.8.1,six==1.10.0,testfixtures==4.7.0,traitlets==4.1.0,Twisted==15.5.0,w3lib==1.13.0,wcwidth==0.1.6,wheel==0.26.0,zope.interface==4.1.3
py33 runtests: PYTHONHASHSEED='3697819909'
py33 runtests: commands[0] | py.test --cov=scrapy --cov-report= scrapy tests
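The `installdeps` and `commands` lines above correspond to a tox environment roughly like the following; this is a sketch inferred from the log, not Scrapy's actual tox.ini:

```ini
; Hypothetical tox.ini fragment matching the py33 lines in this log.
[testenv:py33]
deps =
    -rrequirements-py3.txt
    Pillow
    -rtests/requirements-py3.txt
commands =
    py.test --cov=scrapy --cov-report= scrapy tests
```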
============================= test session starts ==============================
platform linux -- Python 3.3.5 -- py-1.4.31 -- pytest-2.7.3
rootdir: /home/travis/build/scrapy/scrapy, inifile: pytest.ini
plugins: twisted, cov
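The PYTHONHASHSEED value printed a few lines above is set by tox so that Python's per-process string-hash randomization is fixed for the test run, making hash-order-dependent behavior reproducible. A minimal sketch of the effect (the string and seed here are arbitrary):

```python
# Sketch: with PYTHONHASHSEED pinned, str hashes are identical across
# separate interpreter processes; left unset, they generally differ per run.
import subprocess
import sys


def hash_with_seed(seed):
    """Hash a string in a fresh interpreter with a fixed PYTHONHASHSEED."""
    result = subprocess.run(
        [sys.executable, "-c", "print(hash('scrapy'))"],
        env={"PYTHONHASHSEED": seed},
        capture_output=True,
        text=True,
    )
    return result.stdout.strip()


# Same seed in two separate processes yields the same hash value.
print(hash_with_seed("0") == hash_with_seed("0"))
```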

collected 1159 items

scrapy/downloadermiddlewares/ajaxcrawl.py .
scrapy/extensions/httpcache.py .
scrapy/http/cookies.py .
scrapy/utils/datatypes.py .
scrapy/utils/misc.py .
scrapy/utils/python.py ...
scrapy/utils/response.py .
scrapy/utils/template.py .
scrapy/utils/url.py .
tests/test_closespider.py ....
tests/test_command_version.py ..
tests/test_commands.py ...............sssF
tests/test_contracts.py ...
tests/test_crawl.py FEFEFEFEFE.FEFEFE.FE.....F..
tests/test_crawler.py ............
tests/test_dependencies.py .
tests/test_downloader_handlers.py ....................................s.........................s.s...........sssssssssssss
tests/test_downloadermiddleware.py ....
tests/test_downloadermiddleware_ajaxcrawlable.py .....
tests/test_downloadermiddleware_cookies.py ............
tests/test_downloadermiddleware_decompression.py ...
tests/test_downloadermiddleware_defaultheaders.py ..
tests/test_downloadermiddleware_downloadtimeout.py ....
tests/test_downloadermiddleware_httpauth.py ..
tests/test_downloadermiddleware_httpcache.py ...................................
tests/test_downloadermiddleware_httpcompression.py .........
tests/test_downloadermiddleware_redirect.py ...................
tests/test_downloadermiddleware_robotstxt.py ........
tests/test_downloadermiddleware_stats.py ...
tests/test_downloadermiddleware_useragent.py .....
tests/test_dupefilters.py ...
tests/test_feedexport.py .....ss.ssss
tests/test_http_cookies.py .............
tests/test_http_headers.py .................
tests/test_http_request.py ...........s.......s......................................................s..........s...s...
tests/test_http_response.py .................................................................
tests/test_item.py ................
tests/test_link.py ...
tests/test_linkextractors.py .....................X....
tests/test_loader.py ........................................................
tests/test_logformatter.py ...
tests/test_middleware.py ....
tests/test_pipeline_media.py ....................
tests/test_responsetypes.py .......
tests/test_selector.py .............
tests/test_selector_csstranslator.py .
tests/test_spider.py ................................................................................
tests/test_spidermiddleware_depth.py .
tests/test_spidermiddleware_offsite.py ....
tests/test_spidermiddleware_referer.py .
tests/test_spidermiddleware_urllength.py .
tests/test_spiderstate.py ..
tests/test_squeues.py ......x............x............x............x............x............x...............x...............x...............x...............x...............x............x.........
tests/test_stats.py ..
tests/test_toplevel.py ......
tests/test_urlparse_monkeypatches.py .
tests/test_utils_conf.py .........
tests/test_utils_console.py ...
tests/test_utils_datatypes.py ...........
tests/test_utils_defer.py ........
tests/test_utils_deprecate.py ................
tests/test_utils_gz.py ....
tests/test_utils_http.py .
tests/test_utils_httpobj.py .
tests/test_utils_iterators.py ............................
tests/test_utils_log.py ..........
tests/test_utils_python.py .................sss..
tests/test_utils_reqser.py .......
tests/test_utils_request.py ....
tests/test_utils_response.py ....
tests/test_utils_serialize.py ....
tests/test_utils_signal.py ....
tests/test_utils_sitemap.py ..........
tests/test_utils_spider.py ..
tests/test_utils_trackref.py .....
tests/test_utils_url.py ............ss...s..............................
tests/test_webclient.py .s............
tests/test_cmdline/__init__.py .....
tests/test_settings/__init__.py .................................
tests/test_spiderloader/__init__.py .......
tests/test_utils_misc/__init__.py ....
==================================== ERRORS ====================================
____________ ERROR at teardown of CrawlTestCase.test_crawl_multiple ____________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cd63810 [4.782226800918579s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd77050 [4.798213958740234s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd77090 [59.797582387924194s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0cd75590>,), **{})()>
<DelayedCall 0x7f4b0cdef310 [59.79576277732849s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd639d0 [59.802011489868164s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cdef450 [59.80085206031799s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cd76ad0 [59.77935767173767s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd63250 [59.78141760826111s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0e64c550>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
____ ERROR at teardown of CrawlTestCase.test_crawlerrunner_accepts_crawler _____
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0e68dc10 [4.7963316440582275s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0e62f610 [59.793455839157104s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd17fd0 [59.79858446121216s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0e6c7990 [59.79543375968933s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0cd16690>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
________________ ERROR at teardown of CrawlTestCase.test_delay _________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cd27710 [4.797883033752441s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd26d10 [59.79499101638794s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd261d0 [59.8002450466156s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cd27590 [59.79709434509277s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<FollowAllSpider 'follow' at 0x7f4b0cd22410>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
____________ ERROR at teardown of CrawlTestCase.test_engine_status _____________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cd33dd0 [4.795856952667236s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd2ddd0 [59.79304075241089s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd33250 [59.79828715324402s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cd33110 [59.79508185386658s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SingleRequestSpider 'meta' at 0x7f4b0cd32e90>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
______________ ERROR at teardown of CrawlTestCase.test_follow_all ______________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cd44a50 [4.797483921051025s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd43d50 [59.794649600982666s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd43e50 [59.79982376098633s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cd448d0 [59.79669737815857s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<FollowAllSpider 'follow' at 0x7f4b0cd3f590>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
____________ ERROR at teardown of CrawlTestCase.test_referer_header ____________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0ccf5250 [4.796038627624512s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0ccef450 [59.793062686920166s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0ccef610 [59.79822301864624s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0ccf50d0 [59.79523587226868s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SingleRequestSpider 'meta' at 0x7f4b0ccebbd0>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
______________ ERROR at teardown of CrawlTestCase.test_retry_503 _______________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cd0e150 [4.798213958740234s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cd08d10 [59.79548096656799s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cd0e190 [59.79745411872864s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0cd01b10>,), **{})()>
<DelayedCall 0x7f4b0cd0e290 [59.801377058029175s] called=0 cancelled=0 Deferred.cancel()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
__________ ERROR at teardown of CrawlTestCase.test_retry_conn_aborted __________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cca9a90 [4.797102689743042s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0cca8ed0 [59.79436731338501s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cca8fd0 [59.79943084716797s] called=0 cancelled=0 Deferred.cancel()>
<DelayedCall 0x7f4b0cca99d0 [59.79634952545166s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0cca1690>,), **{})()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
___________ ERROR at teardown of CrawlTestCase.test_retry_conn_lost ____________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7f4b0cc67590 [4.798602104187012s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7f4b0ccbd610 [59.79586625099182s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7f4b0cc67f90 [59.79785466194153s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7f4b0cccf6d0>,), **{})()>
<DelayedCall 0x7f4b0cc67b90 [59.80086016654968s] called=0 cancelled=0 Deferred.cancel()>
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
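[editor's note] Every teardown error above has the same root cause: tests/mockserver.py binds a fixed port (reactor.listenTCP(8998, factory)), so a leftover listener from a previous test run makes each subsequent bind fail with errno 98 (EADDRINUSE). A minimal sketch of the failure mode, using plain sockets rather than Twisted (hypothetical illustration, not the project's code) — binding the OS-chosen port 0 for the first socket also shows the usual way to avoid fixed-port collisions in tests:

```python
import socket

# First listener: port 0 asks the OS for any free port, avoiding
# collisions with earlier runs (the fix mockservers typically use).
a = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
a.bind(("127.0.0.1", 0))
a.listen(1)
port = a.getsockname()[1]

# Second bind to the *same* port reproduces the log's failure:
# OSError: [Errno 98] Address already in use (EADDRINUSE on Linux).
b = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    b.bind(("127.0.0.1", port))
    raised = False
except OSError:
    raised = True
finally:
    b.close()
    a.close()

print(raised)
```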
=================================== FAILURES ===================================
__________________________ BenchCommandTest.test_run ___________________________
self = <tests.test_commands.BenchCommandTest testMethod=test_run>
 def test_run(self):
 p = self.proc('bench', '-s', 'LOGSTATS_INTERVAL=0.001',
> '-s', 'CLOSESPIDER_TIMEOUT=0.01')
/home/travis/build/scrapy/scrapy/tests/test_commands.py:276:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <tests.test_commands.BenchCommandTest testMethod=test_run>
new_args = ('bench', '-s', 'LOGSTATS_INTERVAL=0.001', '-s', 'CLOSESPIDER_TIMEOUT=0.01')
kwargs = {}
args = ('/home/travis/build/scrapy/scrapy/.tox/py33/bin/python3.3', '-m', 'scrapy.cmdline', 'bench', '-s', 'LOGSTATS_INTERVAL=0.001', ...)
p = <subprocess.Popen object at 0x7f4b0cdea210>, waited = 15.199999999999978
interval = 0.2
 def proc(self, *new_args, **kwargs):
 args = (sys.executable, '-m', 'scrapy.cmdline') + new_args
 p = subprocess.Popen(args, cwd=self.cwd, env=self.env,
 stdout=subprocess.PIPE, stderr=subprocess.PIPE,
 **kwargs)
 
 waited = 0
 interval = 0.2
 while p.poll() is None:
 sleep(interval)
 waited += interval
 if waited > 15:
 p.kill()
> assert False, 'Command took too much time to complete'
E AssertionError: Command took too much time to complete
/home/travis/build/scrapy/scrapy/tests/test_commands.py:54: AssertionError
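[editor's note] The proc() helper shown in the failure above enforces a 15 s limit with a manual poll()/sleep() loop. The same behavior can be expressed with subprocess's built-in timeout, available since Python 3.3 (a sketch under that assumption, not the project's code):

```python
import subprocess
import sys

# Spawn a short-lived child and enforce a wall-clock timeout,
# mirroring the poll()/sleep()/kill() loop from the failing test.
p = subprocess.Popen([sys.executable, "-c", "print('ok')"],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
try:
    out, err = p.communicate(timeout=15)  # raises TimeoutExpired when exceeded
    timed_out = False
except subprocess.TimeoutExpired:
    p.kill()                              # same cleanup as the test's p.kill()
    out, err = p.communicate()
    timed_out = True

print(timed_out, out.decode().strip())
```

Here the child exits quickly, so no timeout fires; a hung command would take the TimeoutExpired branch instead of tripping the test's assert.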
______________________ CrawlTestCase.test_crawl_multiple _______________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_crawl_multiple> (test_crawl_multiple) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
_______________ CrawlTestCase.test_crawlerrunner_accepts_crawler _______________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_crawlerrunner_accepts_crawler> (test_crawlerrunner_accepts_crawler) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
___________________________ CrawlTestCase.test_delay ___________________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_delay> (test_delay) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
_______________________ CrawlTestCase.test_engine_status _______________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_engine_status> (test_engine_status) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
________________________ CrawlTestCase.test_follow_all _________________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_follow_all> (test_follow_all) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
______________________ CrawlTestCase.test_referer_header _______________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_referer_header> (test_referer_header) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
_________________________ CrawlTestCase.test_retry_503 _________________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_503> (test_retry_503) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
____________________ CrawlTestCase.test_retry_conn_aborted _____________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_conn_aborted> (test_retry_conn_aborted) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
______________________ CrawlTestCase.test_retry_conn_lost ______________________
NOTE: Incompatible Exception Representation, displaying natively:
twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_conn_lost> (test_retry_conn_lost) still running at 120.0 secs
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
______________________ CrawlTestCase.test_timeout_failure ______________________
self = <tests.test_crawl.CrawlTestCase testMethod=test_timeout_failure>
 @defer.inlineCallbacks
 def test_timeout_failure(self):
 crawler = CrawlerRunner({"DOWNLOAD_TIMEOUT": 0.35}).create_crawler(DelaySpider)
 yield crawler.crawl(n=0.5)
 self.assertTrue(crawler.spider.t1 > 0)
> self.assertTrue(crawler.spider.t2 == 0)
/home/travis/build/scrapy/scrapy/tests/test_crawl.py:68:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/trial/_synctest.py:400: in assertTrue
 super(_Assertions, self).assertTrue(condition, msg)
E twisted.trial.unittest.FailTest: False is not true
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 982, in startListening
skt.bind(addr)
OSError: [Errno 98] Address already in use
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 160, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/opt/python/3.3.5/lib/python3.3/runpy.py", line 73, in _run_code
exec(code, run_globals)
File "/home/travis/build/scrapy/scrapy/tests/mockserver.py", line 217, in <module>
httpPort = reactor.listenTCP(8998, factory)
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/posixbase.py", line 478, in listenTCP
p.startListening()
File "/home/travis/build/scrapy/scrapy/.tox/py33/lib/python3.3/site-packages/twisted/internet/tcp.py", line 984, in startListening
raise CannotListenError(self.interface, self.port, le)
twisted.internet.error.CannotListenError: Couldn't listen on any:8998: [Errno 98] Address already in use.
 11 failed, 1098 passed, 37 skipped, 12 xfailed, 1 xpassed, 9 error in 1236.26 seconds 
ERROR: InvocationError: '/home/travis/build/scrapy/scrapy/.tox/py33/bin/py.test --cov=scrapy --cov-report= scrapy tests'
___________________________________ summary ____________________________________
ERROR: py33: commands failed
travis_time:end:0038ca5e:start=1453320341943596217,finish=1453321593796370741,duration=1251852774524

The command "tox" exited with 1.
travis_fold:start:cache.2
store build cache
travis_time:start:125909b8
travis_time:end:125909b8:start=1453321593800489016,finish=1453321593803724762,duration=3235746
travis_time:start:112c58f8
change detected:
/home/travis/.cache/pip/http/0/4/6/f/9/046f985ef0a92e4e2efc37c8ac47893253db65ee45ba1ad49e48091f
/home/travis/.cache/pip/http/0/8/d/c/e/08dcef3be0bb9cce46e42c11d622ed3fea0797e37de9ffac3f5d70d7
/home/travis/.cache/pip/http/0/d/e/2/d/0de2def333d81c5bf5211ffa60584b36f6d465a6a3eff4c58e8a1bd5
/home/travis/.cache/pip/http/1/6/6/b/1/166b1009731eb4aa4f196ee1e6012b0d1a320e0773fc5b8fbaa9ba22
/home/travis/.cache/pip/http/1/d/4/f/9/1d4f95bd7275300d9266d91f0d0e840f30b39dd784f2550cb4bc8a3d
/home/travis/.cache/pip/http/2/a/e/c/a/2aeca392866bf3345843918b645c0f8d7c3525a880d24e94cd10b90c
/home/travis/.cache/pip/http/2/f/6/0/e/2f60ee2e6b2f8d9d74debb0760385405e98f6d110bddba8ef79ec89c
/home/travis/.cache/pip/http/3/9/f/9/d/39f9d946878582a5680805f7f54f0f366515e7b6cb1703fd535e9efe
/home/travis/.cache/pip/http/3/a/f/3/a/3af3addf06e983a6c02f46e7bea70c221d3ff95bf1418fa6da354e14
/home/travis/.cache/pip/http/3/a/f/d/a/3afda669abfb869928f19ceb88754777348a8406f52194a3da978380
/home/travis/.cache/pip/http/3/d/0/7/9/3d

...

changes detected, packing new archive
uploading archive
travis_time:end:112c58f8:start=1453321593807407207,finish=1453321597853975302,duration=4046568095
travis_fold:end:cache.2

Done. Your build exited with 1.