wow, the site is so slow. getting "This resource could not be found" errors on reloads.

anyway, here's the text:

> Today was the first day that I could definitively say that #GPT4 has saved me a significant amount of tedious work. As part of my responsibilities as chair of the ICM Structure Committee, I needed to gather various statistics on the speakers at the previous ICM (for instance, how many speakers there were for each section, taking into account that some speakers were jointly assigned to multiple sections). The raw data (involving about 200 speakers) was not available to me in spreadsheet form, but instead in a number of tables in web pages and PDFs. In the past I would have resigned myself to the tedious task of first manually entering the data into a spreadsheet and then looking up various spreadsheet functions to work out how to calculate exactly what I needed; but both tasks were easily accomplished in a few minutes by GPT4, and the process was even somewhat enjoyable (with the only tedious aspect being the cut-and-paste between the raw data, GPT4, and the spreadsheet).

> Am now looking forward to native integration of AI into the various software tools that I use, so that even the cut-and-paste step can be omitted. (Just being able to resolve >90% of LaTeX compilation issues automatically would be wonderful...)

Ironically, Tao's post convinces me that AI, though amazing, isn't really the solution. Better UX and data quality are. Why was the data so scattered to begin with? Why is LaTeX so hard to work with?

In this case, GPT-4 is used to solve a problem that shouldn't have existed in the first place. The administrators of the ICM could have simply exported the raw data as a Google Sheet (for example), and his problem would have been trivially solvable even without GPT-4.
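To make that concrete: once the data is in structured form, the count Tao describes (speakers per section, with jointly assigned speakers counted toward each of their sections) is a few lines in any scripting language. A minimal sketch with made-up placeholder data, not the real ICM records:

```python
from collections import Counter

# Hypothetical data shape: each speaker maps to one or more sections.
# (Placeholder names; the actual ICM data is not reproduced here.)
speakers = {
    "Speaker A": ["Number Theory"],
    "Speaker B": ["Combinatorics", "Probability"],  # jointly assigned
    "Speaker C": ["Probability"],
}

# Count speakers per section; a joint assignment counts toward each section.
per_section = Counter(
    section for sections in speakers.values() for section in sections
)

print(per_section)
```

With a clean export, the "tedious" part of the job disappears regardless of whether an AI is in the loop.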

HackerNews' hug of death hitting a site that normally has ~4k active users. Not that surprising...

it's plain text. it can be cached. it's embarrassing.

Feel free to share your know-how https://github.com/mastodon/mastodon