
Hacker News - Complete Archive

Every Hacker News item since 2006, live-updated every 5 minutes

What is it?

This dataset contains the complete Hacker News archive: every story, comment, Ask HN, Show HN, job posting, and poll ever submitted to the site. Hacker News is one of the longest-running and most influential technology communities on the internet, operated by Y Combinator since 2007. It has become the de facto gathering place for founders, engineers, researchers, and technologists to share and discuss what matters in technology.

The archive currently spans from 2006-10 to 2026-03-21 10:35 UTC, with 47,394,748 items committed. New items are fetched every 5 minutes and committed directly as individual Parquet files through an automated live pipeline, so the dataset stays current with the site itself.

We believe this is one of the most complete and regularly updated mirrors of Hacker News data available on Hugging Face. The data is stored as monthly Parquet files sorted by item ID, making it straightforward to query with DuckDB, load with the datasets library, or process with any tool that reads Parquet.

What is being released?

The dataset is organized as one Parquet file per calendar month, plus 5-minute live files for today's activity. Every 5 minutes, new items are fetched from the source and committed directly as a single Parquet block. At midnight UTC, the entire current month is refetched from the source as a single authoritative Parquet file, and today's individual 5-minute blocks are removed from the today/ directory.

data/
  2006/2006-10.parquet       first month with HN data
  2006/2006-12.parquet
  2007/2007-01.parquet
  ...
  2026/2026-02.parquet   most recent complete month
  2026/2026-03.parquet   current month, complete through 2026-03-20
today/
  2026/03/21/00/00.parquet  5-min live blocks (YYYY/MM/DD/HH/MM.parquet)
  2026/03/21/00/05.parquet
  ...
  2026/03/21/10/35.parquet  most recent committed block
stats.csv                    one row per committed month
stats_today.csv              one row per committed 5-min block
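
Given this fixed layout, the live-block path for any UTC timestamp can be computed directly. A small helper, hypothetical and for illustration only:

```python
from datetime import datetime, timezone

def live_block_path(ts: datetime) -> str:
    """Return the today/ path of the 5-minute block containing ts (UTC)."""
    ts = ts.astimezone(timezone.utc)
    minute = ts.minute - ts.minute % 5  # floor to the 5-minute window
    return f"today/{ts:%Y/%m/%d/%H}/{minute:02d}.parquet"

print(live_block_path(datetime(2026, 3, 21, 10, 37, tzinfo=timezone.utc)))
# today/2026/03/21/10/35.parquet
```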

Along with the Parquet files, we include stats.csv which tracks every committed month with its item count, ID range, file size, fetch duration, and commit timestamp. This makes it easy to verify completeness and track the pipeline's progress.
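
Because each row records the ID range its monthly file covers, a contiguity check takes only a few lines of Python. The helper and sample rows below are illustrative, not part of the dataset:

```python
import csv
import io

# Sketch of a completeness check over stats.csv: consecutive monthly
# files should cover a contiguous ID range. Column names follow the
# stats.csv schema; the two sample rows are made up.
sample = """year,month,lowest_id,highest_id,count
2006,10,1,62,62
2006,12,63,100,38
"""

def id_gaps(csv_text: str) -> list:
    """Missing-ID count between each pair of consecutive months."""
    rows = sorted(csv.DictReader(io.StringIO(csv_text)),
                  key=lambda r: (int(r["year"]), int(r["month"])))
    return [int(cur["lowest_id"]) - int(prev["highest_id"]) - 1
            for prev, cur in zip(rows, rows[1:])]

print(id_gaps(sample))  # [0] means no IDs are missing between months
```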

Breakdown by hour today

The chart below shows items committed to this dataset by hour today (2026-03-21, 2,715 items across 11 hours, last updated 2026-03-21 10:40 UTC).

  00:00  ██████████████████████████████  339
  01:00  ██████████████████████████░░░░  304
  02:00  ████████████████████░░░░░░░░░░  232
  03:00  ██████████████████░░░░░░░░░░░░  212
  04:00  ███████████████████░░░░░░░░░░░  219
  05:00  ██████████████████░░░░░░░░░░░░  206
  06:00  ██████████████████░░░░░░░░░░░░  212
  07:00  ████████████████████████░░░░░░  282
  08:00  █████████████████████████░░░░░  291
  09:00  ████████████████████████░░░░░░  278
  10:00  ████████████░░░░░░░░░░░░░░░░░░  140

Breakdown by year

The chart below shows items committed to this dataset by year.

  2006  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  62
  2007  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  93.8K
  2008  ██░░░░░░░░░░░░░░░░░░░░░░░░░░░░  320.9K
  2009  ███░░░░░░░░░░░░░░░░░░░░░░░░░░░  608.4K
  2010  ██████░░░░░░░░░░░░░░░░░░░░░░░░  1.0M
  2011  ████████░░░░░░░░░░░░░░░░░░░░░░  1.4M
  2012  ██████████░░░░░░░░░░░░░░░░░░░░  1.6M
  2013  █████████████░░░░░░░░░░░░░░░░░  2.0M
  2014  ███████████░░░░░░░░░░░░░░░░░░░  1.8M
  2015  █████████████░░░░░░░░░░░░░░░░░  2.0M
  2016  ████████████████░░░░░░░░░░░░░░  2.5M
  2017  █████████████████░░░░░░░░░░░░░  2.7M
  2018  ██████████████████░░░░░░░░░░░░  2.8M
  2019  ████████████████████░░░░░░░░░░  3.1M
  2020  ████████████████████████░░░░░░  3.7M
  2021  ███████████████████████████░░░  4.2M
  2022  █████████████████████████████░  4.4M
  2023  ██████████████████████████████  4.6M
  2024  ████████████████████████░░░░░░  3.7M
  2025  █████████████████████████░░░░░  3.9M
  2026  ██████░░░░░░░░░░░░░░░░░░░░░░░░  1.0M

How to download and use this dataset

You can load the full dataset, a specific year, or even a single month. The dataset uses the standard Hugging Face Parquet layout, so it works out of the box with DuckDB, the datasets library, pandas, and huggingface_hub.

Using DuckDB

DuckDB can read Parquet files directly from Hugging Face without downloading anything first. This is the fastest way to explore the data:

The type column is stored as a small integer: 1 = story, 2 = comment, 3 = poll, 4 = pollopt, 5 = job. The "by" column (author username) must be quoted in DuckDB because by is a reserved keyword.
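
When post-processing query results in Python, the same codes can be decoded with a plain mapping. This is a convenience helper, not something shipped with the dataset:

```python
# Integer type codes used throughout this dataset.
HN_TYPE = {1: "story", 2: "comment", 3: "poll", 4: "pollopt", 5: "job"}

def decode_type(code: int) -> str:
    """Map a stored type code to its HN item-type name."""
    return HN_TYPE.get(code, "unknown")

print(decode_type(2))  # comment
```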

-- Top 20 highest-scored stories of all time
SELECT id, title, "by", score, url, time
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND title != ''
ORDER BY score DESC
LIMIT 20;

-- Monthly submission volume for a specific year
SELECT
    strftime(time, '%Y-%m') AS month,
    count(*) AS items,
    count(*) FILTER (WHERE type = 1) AS stories,
    count(*) FILTER (WHERE type = 2) AS comments
FROM read_parquet('hf://datasets/open-index/hacker-news/data/2024/*.parquet')
GROUP BY month
ORDER BY month;

-- Most discussed stories by total comment count
SELECT id, title, "by", score, descendants AS comments, url
FROM read_parquet('hf://datasets/open-index/hacker-news/data/2025/*.parquet')
WHERE type = 1 AND descendants > 0
ORDER BY descendants DESC
LIMIT 20;

-- Who posts the most Ask HN questions?
SELECT "by", count(*) AS posts
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND title LIKE 'Ask HN:%'
GROUP BY "by"
ORDER BY posts DESC
LIMIT 20;

-- Track how often a topic appears on HN over time
SELECT
    extract(year FROM time) AS year,
    count(*) AS mentions
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND lower(title) LIKE '%rust%'
GROUP BY year
ORDER BY year;

-- Top linked domains, year over year
SELECT
    extract(year FROM time) AS year,
    regexp_extract(url, 'https?://([^/]+)', 1) AS domain,
    count(*) AS stories
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND url != ''
GROUP BY year, domain
QUALIFY row_number() OVER (PARTITION BY year ORDER BY stories DESC) <= 5
ORDER BY year, stories DESC;

Using datasets

from datasets import load_dataset

# Stream the full history without downloading everything first
ds = load_dataset("open-index/hacker-news", split="train", streaming=True)
for item in ds:
    print(item["id"], item["type"], item["title"])

# Load a specific year into memory
ds = load_dataset(
    "open-index/hacker-news",
    data_files="data/2024/*.parquet",
    split="train",
)
print(f"{len(ds):,} items in 2024")

# Load today's live blocks (updated every 5 minutes)
ds = load_dataset(
    "open-index/hacker-news",
    name="today",
    split="train",
    streaming=True,
)

Using huggingface_hub

from huggingface_hub import snapshot_download

# Download only 2024 data (about 1.5 GB)
snapshot_download(
    "open-index/hacker-news",
    repo_type="dataset",
    local_dir="./hn/",
    allow_patterns="data/2024/*",
)

For faster downloads, install the hf_transfer extra with pip install "huggingface_hub[hf_transfer]" and set HF_HUB_ENABLE_HF_TRANSFER=1.

Using the CLI

# Download a single month
huggingface-cli download open-index/hacker-news \
    data/2024/2024-01.parquet \
    --repo-type dataset --local-dir ./hn/

Using pandas + DuckDB

import duckdb

conn = duckdb.connect()

# Score distribution: what does a "typical" HN story look like?
# type=1 is story (stored as integer: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job)
df = conn.sql("""
    SELECT
        percentile_disc(0.50) WITHIN GROUP (ORDER BY score) AS p50,
        percentile_disc(0.90) WITHIN GROUP (ORDER BY score) AS p90,
        percentile_disc(0.99) WITHIN GROUP (ORDER BY score) AS p99,
        percentile_disc(0.999) WITHIN GROUP (ORDER BY score) AS p999
    FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
    WHERE type = 1
""").df()
print(df)

Dataset statistics

You can query the per-month statistics directly from the stats.csv file included in the dataset:

SELECT * FROM read_csv_auto('hf://datasets/open-index/hacker-news/stats.csv')
ORDER BY year, month;

The stats.csv file tracks each committed month with the following columns:

Column Description
year, month Calendar month
lowest_id, highest_id Item ID range covered by this file
count Number of items in the file
dur_fetch_s Seconds to fetch from the data source
dur_commit_s Seconds to commit to Hugging Face
size_bytes Parquet file size on disk
committed_at ISO 8601 timestamp of when this month was committed

Content breakdown

Hacker News has five item types. The vast majority of content is comments, followed by stories (which include Ask HN, Show HN, and regular link submissions). Jobs, polls, and poll options make up a small fraction.

Type Count Share
comment 41,317,357 87.2%
story 6,040,377 12.7%
job 18,071 0.0%
poll 2,240 0.0%
pollopt 15,449 0.0%

Of all stories submitted to Hacker News, 84.8% link to an external URL. The rest are text-only posts: Ask HN questions, Show HN launches, and other self-posts where the discussion itself is the content.

The average story generates 23.9 comments in its discussion thread. The most-discussed story of all time received 9,275 comments, which gives a sense of how deep conversations can go on particularly controversial or interesting topics.

Story scores

Scores on Hacker News follow a steep power law. Most stories receive only a few points, but a small number break out and reach the front page with hundreds or thousands of upvotes.

Metric Value
Average score 1.5
Median score 0
Highest score ever 6,015
Stories with 100+ points 175,903
Stories with 1,000+ points 2,169

The median score of 0 reflects the fact that many stories are submitted but never gain traction. The long tail is where things get interesting: of the 6,040,377 stories ever submitted, the top 0.04% (those with 1,000+ points) represent the content that defined conversations across the technology industry.

Most-shared domains

The domains most frequently linked from Hacker News stories tell a clear story about what the community values. GitHub dominates, reflecting HN's deep roots in open source and software development. Major publications like the New York Times and Ars Technica show the community's interest in journalism and long-form analysis.

# Domain Stories
1 github.com 197,669
2 www.youtube.com 134,831
3 medium.com 124,544
4 www.nytimes.com 77,678
5 en.wikipedia.org 54,401
6 techcrunch.com 54,185
7 twitter.com 50,542
8 arstechnica.com 47,066
9 www.theguardian.com 44,304
10 www.bloomberg.com 37,798

Most active story submitters

These are the users who have submitted the most stories over the lifetime of Hacker News. Many of them have been active for over a decade, consistently curating and sharing content with the community.

# User Stories
1 rbanffy 36,778
2 Tomte 26,183
3 tosh 24,062
4 bookofjoe 20,588
5 mooreds 20,368
6 pseudolus 19,909
7 PaulHoule 19,025
8 todsacerdoti 18,880
9 ingve 17,056
10 thunderbong 15,978
11 jonbaer 14,167
12 rntn 13,410
13 doener 12,806
14 Brajeshwar 12,346
15 LinuxBender 11,058

How it works

The pipeline is built in Go and uses DuckDB for Parquet conversion. Historical data is sourced from ClickHouse; live data is fetched directly from the HN Firebase API.

Historical backfill. The pipeline iterates through every month from October 2006 to the most recent complete month. For each month, it queries the ClickHouse source with a time-bounded SQL query, exports the result as a Parquet file sorted by id using DuckDB with Zstandard compression at level 22, and commits it to this repository along with an updated stats.csv and README.md. Months already tracked in stats.csv are skipped, making the process fully resumable.

Live polling. Every 5 minutes, the pipeline calls the HN Firebase API to fetch new items by ID range. Items are grouped into their 5-minute time windows, written as individual Parquet files at today/YYYY/MM/DD/HH/MM.parquet using DuckDB, and committed to Hugging Face immediately. Using the HN API directly means live blocks reflect real-time data with no indexing lag.
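
The polling step can be sketched in a few lines of Python. The Firebase endpoints below are the official HN API; the helper functions are illustrative, and the production pipeline is written in Go:

```python
import json
import urllib.request

# Official Hacker News Firebase API base URL.
API = "https://hacker-news.firebaseio.com/v0"

def fetch_json(path: str):
    """Fetch one JSON document from the HN Firebase API."""
    with urllib.request.urlopen(f"{API}/{path}.json", timeout=10) as resp:
        return json.load(resp)

def pending_ids(last_seen: int, max_id: int) -> list:
    """IDs published since the previous poll, oldest first."""
    return list(range(last_seen + 1, max_id + 1))

def poll_once(last_seen: int) -> list:
    """Fetch every item newer than last_seen from the live API."""
    return [fetch_json(f"item/{i}")
            for i in pending_ids(last_seen, fetch_json("maxitem"))]
```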

Day rollover. At midnight UTC, the entire current month is refetched from the ClickHouse source in a single query and written as an authoritative Parquet file. Today's individual 5-minute blocks are deleted from the repository in the same atomic commit. Refetching instead of merging ensures the monthly file is always complete and deduplicated, regardless of any local state.

Thanks

The data in this dataset comes from the ClickHouse Playground, a free public SQL endpoint maintained by ClickHouse, Inc. that mirrors the official Hacker News Firebase API. ClickHouse uses Hacker News as one of their canonical demo datasets. Without their public endpoint, building and maintaining a complete, regularly updated archive like this would not be practical.

The original content is created by the Hacker News community; the site itself is operated by Y Combinator. This is an independent mirror and is not affiliated with or endorsed by Y Combinator or ClickHouse, Inc.

Dataset card for Hacker News - Complete Archive

Dataset summary

This dataset is a complete mirror of the Hacker News archive, sourced from the ClickHouse Playground which itself mirrors the official HN Firebase API. The data covers every item ever posted to the site, from the earliest submissions in October 2006 through today.

The dataset is intended for research, analysis, and training. Common use cases include:

  • Language model pretraining and fine-tuning on high-quality technical discussions
  • Sentiment and trend analysis across two decades of technology discourse
  • Community dynamics research on one of the internet's most influential forums
  • Information retrieval benchmarks using real-world questions and answers
  • Content recommendation and ranking model development

Dataset structure

Data instances

Here is an example item from the dataset. This is a story submission with a link to an external URL:

{
  "id": 1,
  "deleted": 0,
  "type": 1,
  "by": "pg",
  "time": "2006-10-09T18:21:51+00:00",
  "text": "",
  "dead": 0,
  "parent": 0,
  "poll": 0,
  "kids": [15, 234509, 487171],
  "url": "http://ycombinator.com",
  "score": 57,
  "title": "Y Combinator",
  "parts": [],
  "descendants": 0,
  "words": ["y", "combinator"]
}

And here is a comment, showing how discussion threads are connected via the parent field:

{
  "id": 15,
  "deleted": 0,
  "type": 2,
  "by": "sama",
  "time": "2006-10-09T19:51:01+00:00",
  "text": "\"the way to get good software is to find ...",
  "dead": 0,
  "parent": 1,
  "poll": 0,
  "kids": [17],
  "url": "",
  "score": 0,
  "title": "",
  "parts": [],
  "descendants": 0,
  "words": []
}

Data fields

Every Parquet file shares the same schema, matching the HN API item format:

Column Type Description
id uint32 Unique item ID, monotonically increasing across the entire site
deleted uint8 1 if the item was soft-deleted by its author or by moderators, 0 otherwise
type int8 Item type as an integer: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job
by string Username of the author who created this item. Note: by is a reserved word in DuckDB and must be quoted as "by"
time timestamp When the item was created, in UTC
text string HTML body text. Used for comments, Ask HN posts, job listings, and polls
dead uint8 1 if the item was flagged or killed by moderators, 0 otherwise
parent uint32 The ID of the parent item. For comments, this points to either a story or another comment
poll uint32 For poll options (pollopt), the ID of the associated poll
kids list<uint32> Ordered list of direct child item IDs (typically comments)
url string The external URL for link stories. Empty for text posts and comments
score int32 The item's score: the point total for stories, or the vote count for poll options
title string Title text for stories, jobs, and polls. Empty for comments
parts list<uint32> For polls, the list of associated poll option item IDs
descendants int32 Total number of comments in the entire discussion tree below this item
words list<string> Tokenized words extracted from the title and text fields
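Because type is stored as an integer rather than the string used by the HN API, a small lookup table is handy when exploring the data (this mapping follows the schema above; the helper is just a convenience):

```python
# Integer codes from the schema; names match the HN API "type" strings.
TYPE_NAMES = {1: "story", 2: "comment", 3: "poll", 4: "pollopt", 5: "job"}

def type_name(code: int) -> str:
    """Translate the integer type column back to the HN API name."""
    return TYPE_NAMES.get(code, "unknown")
```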

Data splits

The default configuration includes all historical monthly Parquet files. If you only need today's latest items, use the today configuration, which includes only the 5-minute live blocks for the current day.

You can also load individual years or months by specifying data_files:

# Load just January 2024
ds = load_dataset("open-index/hacker-news", data_files="data/2024/2024-01.parquet", split="train")

# Load all of 2024
ds = load_dataset("open-index/hacker-news", data_files="data/2024/*.parquet", split="train")

Dataset creation

Curation rationale

Hacker News is one of the richest sources of technical discussion on the internet, but accessing the full archive programmatically has historically required either scraping the Firebase API item-by-item or working with incomplete third-party dumps. This dataset provides the complete archive in a standard, efficient format that anyone can query without setting up infrastructure.

By publishing on Hugging Face with Parquet files, the data becomes immediately queryable with DuckDB (via hf:// paths), streamable with the datasets library, and downloadable in bulk. The 5-minute live update pipeline means researchers always have access to near-real-time data.

Source data

All data is sourced from the ClickHouse Playground, a public SQL endpoint maintained by ClickHouse that mirrors the official Hacker News Firebase API. The ClickHouse mirror is widely used for analytics demonstrations and contains the complete dataset.

The pipeline queries the ClickHouse endpoint month-by-month, exports each month as a Parquet file using DuckDB with Zstandard compression at level 22, and commits it to this Hugging Face repository. Already-committed months are tracked in stats.csv and skipped on subsequent runs, making the process fully resumable.

Data processing steps

The pipeline runs in three modes:

  1. Historical backfill. Iterates through every month from October 2006 to the most recent complete month. For each month, it runs a SQL query against the ClickHouse source, writes the result as a Parquet file sorted by id, and commits it to Hugging Face along with an updated stats.csv and README.md.

  2. Live polling. After the historical backfill completes, the pipeline polls the HN Firebase API every 5 minutes for new items. It fetches all items with IDs greater than the last committed watermark, groups them into 5-minute time windows by item timestamp, and writes each window as a today/YYYY/MM/DD/HH/MM.parquet file committed to Hugging Face immediately. The HN API provides real-time data with no indexing lag.

  3. Day rollover. At midnight UTC, the entire current month is refetched from the ClickHouse source in a single query and written as a fresh, authoritative Parquet file. Today's individual 5-minute blocks are deleted from the repository in the same atomic commit. This approach is more reliable than merging local blocks — the result is always complete and deduplicated, sourced directly from the origin.
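The rollover step can be sketched as a pure planning function: given the finished day, produce the authoritative monthly file path (the `data/YYYY/YYYY-MM.parquet` layout shown in the loading examples above) and the list of that day's live blocks to delete. The commit mechanics, which upload and delete in a single atomic commit, are elided:

```python
from datetime import date

def rollover_plan(day: date, live_blocks: list[str]) -> tuple[str, list[str]]:
    """Plan the midnight-UTC rollover: which monthly file to (re)write,
    and which of the day's 5-minute blocks to delete."""
    month_file = f"data/{day:%Y}/{day:%Y-%m}.parquet"
    prefix = f"today/{day:%Y/%m/%d}/"
    to_delete = [p for p in live_blocks if p.startswith(prefix)]
    return month_file, to_delete
```

Note that only blocks belonging to the finished day are deleted; any blocks already written for the new day are left in place.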

All Parquet files use Zstandard compression at level 22 and are sorted by id for efficient range scans. No filtering, deduplication, or transformation is applied to the data beyond what the source provides.

Personal and sensitive information

This dataset contains usernames (by field) and user-generated text content (text, title fields) as they appear on the public Hacker News website. No additional PII processing has been applied. The data reflects what is publicly visible on news.ycombinator.com.

If you find content in this dataset that you believe should be removed, please open a discussion on the Community tab.

Considerations for using the data

Social impact

By providing the complete Hacker News archive in an accessible format, we hope to enable research into online community dynamics, technology trends, and the evolution of technical discourse. The dataset can serve as training data for language models that need to understand technical discussions, or as a benchmark for information retrieval and recommendation systems.

Discussion of biases

Hacker News has a well-documented set of community biases. The user base skews heavily toward software engineers, startup founders, and technology enthusiasts based in the United States. Topics related to Silicon Valley, programming languages, startups, and certain political viewpoints tend to receive disproportionate attention and engagement.

The moderation system (flagging, vouching, and moderator intervention) shapes what content survives and what gets killed. Stories and comments that violate community norms are flagged as dead, but this moderation reflects the values of the existing community rather than any objective standard.

We have not applied any additional filtering or quality scoring to the data. All items, including deleted and dead items, are preserved exactly as they appear in the source.

Known limitations

  • type is an integer. The item type is stored as a TINYINT enum: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job. When writing DuckDB queries, use WHERE type = 1 for stories rather than WHERE type = 'story'.
  • by is a reserved keyword in DuckDB. Always quote it with double quotes: "by".
  • deleted and dead are integers. They are stored as 0/1 rather than booleans.
  • Comment text is HTML. The text field contains raw HTML as stored by HN, not plain text. You may need to strip tags depending on your use case.
  • Deleted items have sparse fields. When an item is deleted, most fields become empty, but the id and deleted flag are preserved.
  • Scores are point-in-time snapshots. The score reflects the value at the time the ClickHouse mirror last synced, not necessarily the final score.
  • No user profiles. This dataset contains items only, not user profiles (karma, bio, etc.).
  • Code content is HTML-escaped. Code snippets in comments use HTML entities and <code> tags (not Markdown fences), so they must be unescaped before use as source code.
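Since the text field is raw HTML, a minimal tag-stripping pass using only the standard library looks like this (a sketch, not a full HTML-to-text converter; it turns <p> into paragraph breaks and relies on html.parser's built-in entity decoding):

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect character data; render <p> as a paragraph break."""
    def __init__(self):
        super().__init__()  # convert_charrefs=True decodes entities for us
        self.chunks: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.chunks.append("\n\n")

    def handle_data(self, data):
        self.chunks.append(data)

def strip_html(text: str) -> str:
    """Convert an HN comment's HTML body to plain text."""
    parser = _TextExtractor()
    parser.feed(text)
    return "".join(parser.chunks).strip()
```

For heavier cleanup (links, <pre><code> blocks, italics) a dedicated HTML-to-text library may be a better fit.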

Additional information

Licensing

The dataset is released under the Open Data Commons Attribution License (ODC-By) v1.0. The original content is subject to the rights of its respective authors. Hacker News data is provided by Y Combinator.

This is an independent community mirror. It is not affiliated with or endorsed by Y Combinator.

Contact

For questions, feedback, or issues, please open a discussion on the Community tab.

Last updated: 2026-03-21 10:40 UTC
