Are we losing our ability to remember? (st.im)
243 points by scotthtaylor on Oct 28, 2020 | 162 comments



I once took a class in college, “Cars and Culture”. I never forgot a line from that class - that “cars provided an extension of our legs”.

Ever since then, I have told many people, and thought to myself many times, that tools such as Google (search and indexing) and knowledge management systems (wikis and similar techniques) are all extensions of our brains.

We evolve with technology, and it evolves with us. We might be losing our ability to remember, but if it is because we don’t “need” to remember because technology has augmented us... Well, this is why I also am fond of telling people that I have a difficult time separating technology from nature. Even though the two don’t seem like the same thing, technology too becomes part of the natural ecosystem as organisms invent and rely on it.

Another way of thinking about this: maybe the ability to recall small, detailed facts was evolutionarily less important than building models in our brains. So we offloaded recording small facts, while, I think, we still ingest and build/train our neural nets in our brains just fine.

Then the only problem I see is: if life becomes all about mental models, what happens when our ability to form new mental models degrades with age? Especially given the rate of technological change, I see a real likelihood that old mental models get left behind and that, without the ability to adapt, organisms (i.e. us) could be hosed.

Edit: ..and the last sentence could be why the big push for AI and machine learning too - to ensure the models get encoded into the technology too... and be discovered faster, changed more fluidly, etc. Another evolutionary tool.


I always thought that the most terrifying thing in the book 1984 (and the most relevant to the pending future) was that people weren't allowed tools to write anything down, they had to use the "speakwrite," which would monitor the user and refuse to write things that weren't allowed to be written.

Orwell wasn't prescient enough to imagine the type-remember, where you kept your entire perception of the world on machines that weren't controlled by institutions with your best interests at heart.

That had to wait for Fahrenheit 451 to create an entire administration and police force to get rid of your old encyclopedias.


I always thought that Brave New World was more prescient than 1984: you won't be forbidden to write; we'll just give you something more fun and distracting to do instead.


So many parallels between Brave New World and our current world. It's honestly disturbing. The class divide between rich and poor is in there. The intellectual divide between knowledge workers and laborers is in there. The dopamine consumption cycle that stymies creative thought of the masses is in there. The willful ignorance of there being any problems at all is in there. That book is disturbingly prophetic.

It's honestly worse in real life, because in the book the savages and intellectuals on islands live peacefully outside the regime. There is no option to not participate in our world, given that climate change will affect the entire planet; there is no safe refuge from the regime because it directly affects even those not participating in it.


It's almost as if these things have always existed.


And yet not.


"It's almost as if" -

I want to say something ideological, interpreting a given piece of evidence in a simplistic way to assert that it can be reduced to my own assumptions, but I also want to sound clever about it.

I half feel like making a bot that goes through reddit and just says "it's almost as if" whenever it detects anyone has made an assertion of fact of any kind, in increasingly implausible and unconnected ways.


Why not both? Brave New World for the masses and 1984 for the few that rock the boat.


Brave New World Revisited [0] has several references and comparisons to 1984. BNWR was written by Huxley in 1958, 26 years after BNW was published (1932) and 9 years after 1984 (1949).

It has been a few years for me, but there were several prescient chapters when I last read it during the rise of populism around the globe (2015), namely sections IV, V, VI, and VII:

  Foreword
  I Over-Population
  II Quantity, Quality, Morality
  III Over-Organization
  IV Propaganda in a Democratic Society
  V Propaganda Under a Dictatorship
  VI The Arts of Selling
  VII Brainwashing
  VIII Chemical Persuasion
  IX Subconscious Persuasion
  X Hypnopaedia
  XI Education for Freedom
  XII What Can Be Done?
Would recommend the read to anyone who found 1984, BNW, F451, etc. interesting.

[0] https://www.huxley.net/bnw-revisited/ (full text)

Edit: formatting


Huxley wrote a letter to Orwell arguing the same.

http://www.openculture.com/2018/08/aldous-huxley-george-orwe...


For me personally, the internet has made my brain better at remembering associations between ideas, but worse at free-recall of factual information. It's like my mind got better at indexing at the expense of storage.
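That indexing-vs-storage trade-off can be made literal in a few lines of code. This is only an illustrative toy (all names and data below are invented), not a model of how memory actually works:

```python
# Instead of storing full texts ("storage"), keep only keywords
# pointing at where to find them ("indexing").

articles = {
    "postgres-tuning": "how to tune postgres indexes for large tables",
    "aqueducts": "a history of the roman aqueducts",
}

# Build a small inverted index: word -> set of article ids.
index: dict[str, set[str]] = {}
for article_id, text in articles.items():
    for word in text.split():
        index.setdefault(word, set()).add(article_id)

# Free recall of the full text is gone, but association still works:
print(sorted(index["postgres"]))  # -> ['postgres-tuning']
```

The index is tiny compared to the articles, which is exactly the trade: fast associative lookup at the cost of no longer holding the content itself.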


Perhaps the information "explosion", "deluge" or "overload" is part of it. There is so much - or too much - information, we can still ingest a lot, maybe we rely on making the connections and finding the generalities/mental models because there are too many specifics.


> My mind got better at indexing at the expense of storage

Well said. I am going to use your quote :) (so that it moves to my long-term memory.) Thank you.


I'm going to steal your quote too, if I can remember it ;)


That's actually the perfect explanation. Even studying cognitive science I'm always amazed at how analogous the brain is to a computer.


Ah, that's interesting, I read it the opposite way: computers are so different from the brain that the end result is that they complement each other really well.


Doug Engelbart, a pioneer of the personal computer, would love your point about computers being extensions of our brains. His mission (in the 1950s!) was to find a way to 'augment human intellect' through technology, much like the biologist augments their eyesight through the microscope. The mind blowing part is that they didn't have anything to go off (or XEROX to copy) - imagine conceptualizing the personal computer decades before it hit the mainstream. Sorry for the rant!

https://www.dougengelbart.org/content/view/155/87/


You might enjoy reading E.M. Forster's "The Machine Stops"[1][2][3], which in 1909 predicted something like the internet, internet addiction and withdrawal, chat rooms, video conferencing, online learning, and widespread international air travel.

[1] - The story in written form - http://www.visbox.com/prajlich/forster.html

[2] - An audio recording - https://librivox.org/the-machine-stops-by-e-m-forster/

[3] - Wikipedia article - https://en.wikipedia.org/wiki/The_Machine_Stops


"[...] it's the equivalent of a bicycle for our minds." - Steve Jobs

https://youtu.be/ob_GX50Za6c


If you haven't yet, you should read Kevin Kelly's "What Technology Wants"; it devotes quite a few pages to the idea of evolution/augmentation via technology.

https://www.amazon.com/What-Technology-Wants-Kevin-Kelly/dp/...


That's a dangerous path to tread. I would never compromise my cognitive abilities with technology; technology is an aid when I need it, and I know when not to need it and to use my God-given brain instead. Suggest you start using your brain or risk losing it, especially when you can't even remember what to search for on Google.


When this same argument is applied to other places, you end up with stuff like "if you don't grow your own food, you're risking supermarkets disappearing and you starving".

It's true, but that doesn't mean it's not a worthwhile risk. Nor does it imply that the other approach is risk-free / has no downsides.


> cars provided an extension of our legs

Marshall McLuhan wrote that in 1964 in "Understanding Media: The Extensions of Man".


But the analogy to cars does not hold. They're not extensions of our legs in any genuine sense of the term. They don't improve or interact with our physical capacities. On the contrary, they atrophy our legs, make us fatter, and produce urban pollution.

They're not a technology that enhances or melds with the human who uses it, and they don't vanish into the background. The car doesn't even evolve much; rather, the environment changes to fit the car, if anything hindering evolution. It displaces the natural ecosystem; it doesn't become part of it.

The same can be said about technologies weakening memory. They look like an enhancement maybe, but they may actually just cause impairment of function, becoming a crutch.


Your claim that cars don't become part of the ecosystem got me thinking of this clip:

http://www.bbc.com/earth/storyoflife/player?clipID=20160713-...


Yeah, I agree with you (nicely put btw) - what I find disconcerting is the fact that it's happening so quickly, i.e. I notice myself evolving and changing over weeks/months and becoming increasingly reliant upon external data stores. Pretty crazy that we can adapt that quickly!


I have heard someone describe the US as being an 'attention economy' rather than 'information economy'. There is too much data being thrown at us to absorb it all. This has led people to increasingly rely on third parties to prioritize, filter, distill, and curate these data. The disintegration of revenue models around information production and the consolidation in media publishing and search has compounded the problem. The result is an increase in hucksterism, fraud, and misinformation. Nothing can ever be completely proved or disproved and bad ideas never disappear. It's a new mode for humanity, which will force us to find a new equilibrium.


Related: I have definitely lost my once-lifelong habit of reading at least a book a week. It's down to about a book a year. Reading was such a big part of who I am, and yet I now struggle to read a book. I keep buying them, however, I suppose in the hope that I can convince (shame?) myself to read by the sheer volume of what I have bought and haven't read.


Set time aside: for example, when eating breakfast or before bed. Use a timer, set to at least 20 minutes. Each minute you read per day equals roughly one book per year, so if you read 20 minutes per day, you will read about 20 books per year.
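That rule of thumb checks out under a reasonable assumption about book length (the six-hour figure below is my assumption, not the parent's):

```python
# Back-of-the-envelope check of the "1 minute/day = 1 book/year" rule.
# Assumption: an average book takes about six hours to read,
# e.g. ~90,000 words at 250 words per minute.
WORDS_PER_BOOK = 90_000
WORDS_PER_MINUTE = 250
MINUTES_PER_BOOK = WORDS_PER_BOOK / WORDS_PER_MINUTE  # 360 min = 6 h

def books_per_year(minutes_per_day: float) -> float:
    """Books finished per year at a fixed daily reading time."""
    return minutes_per_day * 365 / MINUTES_PER_BOOK

print(round(books_per_year(1), 1))   # -> 1.0
print(round(books_per_year(20), 1))  # -> 20.3
```

So the rule holds almost exactly for a six-hour book; longer or denser books shift the ratio accordingly.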


Same here. I rely on bookmarks and a lot of organized and tagged notes to recover the right things, but it works great!


"Are we losing our ability to walk?"

Yeah. Kinda. Not really. I bet people had to walk more before cars though.


Ability, perhaps not. But cars and car oriented planning are leading directly to obesity and heart disease. There’s a reason New Yorkers aren’t as fat as Ohioans.


mobility scooters


Steve Jobs had a video on this, about computers being bicycles for our minds: https://www.youtube.com/watch?v=rTRzYjoZhIY


> take Google for instance (search and indexing), knowledge management systems (Wiki and other techniques) - these are all extensions of our brains.

I agree. But these two examples are different in a way that is important to me: my personal wiki is under my control; Google is not. Therefore, when I find important things in Google, I sometimes still take care to rewrite them into my wiki, using my own words.

Using the wiki feels like extending my brain. Using Google feels more like outsourcing it.


You just described the field of "distributed cognition".


I feel that in general we are consuming a lot more information than we really need. I am not sure the human brain evolved to handle a large amount of information that does not have any physical reference.

I find myself not only forgetting myriad facts and figures, but also mixing up information or having false memories. Many of my memories have nothing to anchor on.

I think the trend today is to prime the brain with content depending on the context. E.g. before giving a talk, an engineering meeting or leading a training session you can use flash cards to warm up the cache and strategically dump what is not important to remember.

Not exactly sure how I feel about that.


I've noticed a few ways in which this overload of information manifests.

You've got the people who consume a lot of "information" yet can't make much sense of it, or at least in a way that lines up to shared reality. I'd say this represents the average person. The main coping mechanism for these people is to consume information as entertainment and otherwise not think about what they're taking in. Otherwise, they might subscribe to prefab reality "lenses" that effectively give them the orders they need to make executive decisions in life.

Then you have those who actually can remember lots of trivial knowledge but can hardly think beyond the level of factoids. In other words, they think that the world can be explained by what's directly in front of them, ignoring the need to distill, synthesize, and extrapolate in order to make predictive models of the world. I know a few intellectuals who insist on this thought process, and their predictions are usually wrong, yet they don't adjust their belief that memorizing a bunch of facts makes them more accurate thinkers. Like the average person, the trivial-knowledge archetype sometimes subscribes to existing world views so they can cherry-pick knowledge that fits those views, mistakenly believing that their views are original and not assigned to them.

There's also the opposite of the last archetype: the overly abstract thinker who can't remember many specific facts at all, so they cope by passively consuming large amounts of "data" and distilling it down into models of the world that make sense to them. This is the camp that I fall into. It's not that I don't remember anything specific, but individual factoids must be of significant interest for me to commit them to concrete memory. Even if my models of reality don't line up on a factual basis, the more important thing to me is whether I get results. The problem with people like me is that we can think in terms of the big picture but sometimes fail when thinking at a micro scale actually counts for something. This becomes even worse when there is too much information to consume, because any bit of compelling data causes the distilled reality models to expand in ways that might not be justified.


>Even if my models of reality don't line up on a factual basis, the more important thing to me is whether I get results.

Without specific memories, how can you tell whether or not you have been getting results? :)


It's up to you. :) If you're getting what you want out of life, or if your predictions are usually accurate, in spite of having a mental model that isn't technically correct, then you have results that didn't come from excessive rumination or from memorizing lots of units of specific information. Similarly, religion can lead a person to perform actions that are beneficial even if the beliefs instilled in them are factual nonsense. What I'm saying is that a person can do just that, but with high-level conceptual structures as opposed to either religion or pedantic data hoarding.

When you keep trying to do right, but your world is perpetually on fire, you probably aren't getting good results.


Interestingly, this can be seen as an extension of the same principle. In a given person's life there are impossibly many events going on all the time, each providing some benefit or negative result, and you could, if you chose, look for specific memories of achievements or failures in order to benchmark your life's progress, falling back onto them and recalling those specific moments.

The obvious problem with such events is that they may not be representative, and so, like a gambler remembering only the last few wins, you could keep trying an approach that isn't actually working.

Conversely, you could try to remember conclusions, and a few simple procedures, while also passively using a diary or data entry system, such that you have a general feel of "how things have been going lately", without any specific examples, and then occasionally rerun your procedure, taking stock of recent events from recorded data, and update your abstract value.

Then there's the hybrid approach; working on a dataset, find a few specific data points that most properly represent the diversity of your current experience, then remember those, the general feeling associated with them, and your procedure for updating them.

That way you use your emotional episodic memory, but tie it to things that are verified by more careful reflective analysis.


Hello from another of the last archetype.

It's a blessing and a curse. And one I find very few careers favour.


I also think we consume more information than we need. I think we also tend to overvalue getting that information and retaining it. The HN crowd is biased towards analysis and information; just look at the abundance of note-taking systems posted.

I used to store as much info as I could somewhere (a personal wiki), but over the years I realized there's just too much there and most of it I never need; if I do need something, I can look it up again anyway.

I think it's possible to become something of an information pack rat. It's true that, for me, learning something new feels productive somehow, but if I'm honest, most of the knowledge I seek out online doesn't really provide me with direct value. The act of seeking it out is also time that could've been used to do something else.


The fetishism with note-taking is rampant, both here and elsewhere online. People make good money writing articles about the latest note-taking app and serving ads. Personally, I've never needed to crack open an old notebook after the relevant project has passed. I have a stack of them from college at my parents' house, and they are simply collecting dust. Even if I had everything in digital form, I would never feel the need to go back and sift through them, since they are no longer relevant.

The act of taking notes with a pen and paper, to synthesize thoughts into symbols and sentence structures, is far more meaningful than looking at the note again imo.


> The act of taking notes with a pen and paper, to synthesize thoughts into symbols and sentence structures, is far more meaningful than looking at the note again imo.

Absolutely this. I'm working on teaching myself abstract math with help from a friend on Discord who has his PhD and is willing to check my proofs and such, and just the act of writing things down helps so much. Especially in linear algebra, where I kept losing track of all the summations and what stood for what while I was just reading. Writing the material down, and adding my own annotations explicitly elucidating why each step was possible, did more for me than reading it back ever did. I admit re-reading sometimes helped when the concept was fresh, but often it was just the act of writing that did it for me.


That's how I treat note-taking today. I keep a physical notebook that I jot down ideas and draw diagrams in. It's sort of my canvas for sketching things out and seeing if I understand the problem and solution I'm working on.

Like you, I have a bunch of old notebooks as well that I only ever flip through for nostalgia. I generally only review my notebooks for design notes if the notes are at most 3-6 months old.


>The act of taking notes with a pen and paper, to synthesize thoughts into symbols and sentence structures, is far more meaningful

Well Put!


Ditto.

This reminds me of a write-up by Morgan Housel on long-term knowledge vs. expiring knowledge[0]. To optimize that trade-off, I generally try to review, on a weekly basis, what I consume from the internet-verse. This has helped me in some ways; not sure whether it will work for other people. Moving the scale towards consuming long-forms is also helping me. (I guess it depends on what type of bubble you wrap yourself in. For instance, visiting only specific subreddits and LW topics, intentionally, has been advantageous to me.)

[0] - https://www.collaborativefund.com/blog/expiring-vs-lt-knowle...


Agree 100% — nicely written up!


Some of my most vivid childhood memories are dreams I had. When recalling the events in my life, it's not always easy to recall which were dreams and which really happened. It's terrifying.

It's all too much information. I worked on CD-ROM technology 20 years ago. I don't want to know that the raw sector size is 2352 bytes, but I do.


it's weird how dreams and "real" memories can blend that way. last week I was half-awake half-asleep during a rain storm and I felt water dripping on my forehead from the ceiling. it felt quite real, but out of laziness I decided to just deal with it in the morning and go back to sleep. when I woke up, my bed was dry and there was no evidence of a leak anywhere. I'm still not quite sure whether it was real or imagined. I'm assuming it was just a dream, but usually I don't have such vivid physical sensations in dreams.


You just made me realize how confusing it must be for our brain to receive a high density of pointless information projected into it when we binge-watch a TV show through a weekend.


After a week of work I cleanse my brain by watching golf. I don't really like it, but I sleep better than after watching basketball or football.


Reddit, and by extension HN, is by and large trivia porn.


Humans spent thousands of years telling stories orally; we didn't even have a written language for much of our existence, yet our memories have been working pretty well these past few millennia.


There is some research to suggest memories are stickier when tied to locations or other physical things.


Whenever I do an SEO workshop, I first ask two questions of my audience:

  * How often did you Google this week?
  * What was the last thing before the last thing that you Googled?
For the first question, I get laughs. For the second one blank stares.

Later, I shoot another question:

  * What is the last thing you read, that you googled before, top to bottom?
For the last question everybody looks at their feet.

The use case I optimize websites for is:

  * Users do not know when they Google.
  * Users do not know what they Google.
  * Google is an extension of their thinking.
  * They do not read what they find. 
  * It just needs to move them forward in some way.
We are all users.


40 years ago it would have been:

    * How often did you read articles in newspapers this week?
    * What was the last article before the last article that you read?
    * What is the last newspaper article that you read, top to bottom?
Or, for tech people:

    * How often did you read manpages this week?
    * What was the last manpage before the last manpage that you read?
    * What is the last manpage that you read, top to bottom?
We remember things differently depending on what the purpose of the information is, and that hasn't changed. There's absolutely no reason to shame people into "looking at their feet" over it. Throughout the ages, countless people have read the tragedy of Romeo and Juliet, but very few people beyond actors who have actually performed it would know who said this, or even in what act it was said:

    Both by myself and many other friends:
    But he, his own affections' counsellor,
    Is to himself--I will not say how true--
    But to himself so secret and so close,
    So far from sounding and discovery,
    As is the bud bit with an envious worm,
    Ere he can spread his sweet leaves to the air,
    Or dedicate his beauty to the sun.
    Could we but learn from whence his sorrows grow.
    We would as willingly give cure as know.


If you ask me to remember the last but one thing I said to someone I live with in my house, or the last thing they said to me, I wouldn't remember that either.

This doesn't mean I don't remember them, but my memories of these events are not structured as a temporally ordered list.

I can say with high likelihood some of the things I googled recently, and a good selection of the reasons I was googling, just as I can talk about things I discussed recently, even if I can't index them, but expecting that to be addressed in a sorted list is to confuse the simplicity of formulation of the question with simplicity of implementation.


There are two serious differences between 40 years ago and now.

1) I can look up a thing I can't remember (or don't know) both specifically and immediately. 40 years ago this might have taken a week and an inter-library loan. I'd certainly be sure to remember what I'd learned afterwards. And,

2) Things that I once looked up can be silently changed or removed the next time I try to look them up.

I don't think it's useful to handwave the material differences away with a general "People always say stuff about everything."


Why is it important to remember everything that you've read, especially in this day and age? 90% of the stuff you Google is just to help you get a small task done or to learn something that seemed important in the moment. Another 9% might be something you write down in a notebook to look back at later. Very few things are worth actively keeping track of in your mind, and those are usually fundamental building blocks of a new mental model.

The brain is for thinking, not for remembering.


> The brain is for thinking, not for remembering.

An observation I used to see more frequently is that part of thinking depends on noticing patterns and connections between things. If you don't have any working or long-term storage, this is impossible. Developing new insights depends on internalizing at least some amount of information.

The googling to complete some small task (e.g. what was the second parameter's type?) -- I'm not sure I'd really consider that thinking, per se. It is definitely something that contributes to our general cognitive load, and the ubiquity of search surely is a welcome aid, as you suggest; there really are too many unimportant things to keep in our heads.


>The brain is for thinking, not for remembering.

I mean, millions of years of evolution would disagree. Seems pretty shortsighted to declare memory useless.


I suppose you could say that it's not useless, but its use is subsidiary to the primary purpose of thinking, because it is thinking that actually solves problems.

This isn't necessarily true, however. On a group-selection level, remembering allows you to pass information to future generations more effectively, allowing your community to continue to adapt socially to its environment.

Sheep do this: young sheep are explicitly taught how to handle a landscape by their parents, which is normally understood as showing them how to walk and so on, but it extends all the way to navigation and safe routes, at least.

In our context that connection is far more obvious, even if that purpose has been in many ways assumed by tools; even if your brain remembers things solely for the purpose of thinking, it may not be for your thinking, and so we can say that memory, on an individual level, has a separate function.


That reminds me of Joel Spolsky's book "User Interface Design for Programmers", in particular chapters 9, 10, and 11, which are titled "People Can't Read", "People Can't Control the Mouse", and "People Can't Remember".


It can be argued that Googling is simply unimportant. It is not an extension of a thought process (just as pouring a cup of tea is not an extension of our digestive process); it's simply something trivial that we do, and we immediately discard the memories of it as having no importance to us. Humans are, I believe, goal-oriented, so whenever we want to discover something, that is our goal, not the process that led us there (unless something memorable happens along the way).

IMHO, what we remember is the task and its outcome, not the trivialities that brought us from one to another. Which makes me wonder why the last question puzzles people.

I don't readily remember my search history, because I just discard the memories of reaching for something (unless there was something exceptional, which is rare). It's like driving to the restaurant, when there are multiple ways to get to the place. I do remember the outcome (dinner) but unless something happened on the road I probably would need to think/reconstruct if I'd be suddenly asked about the particular roads I took.

Same with the articles - I do remember what I was interested in today (as opposed to "what I've searched for", a subtle difference), and what the results were - things I've read top to bottom. Maybe not in true historical order, but typically the last one is still correct.

To summarize:

- "How often" - unimportant, as the count rarely has a value. Need to enumerate and count things, and that's quite a tedious mental process.

- "What was the last thing before the last" - typically unimportant, as historical order of unrelated events rarely has a value. Needs some mental processing to sort things.

- "What is the last thing you read" - beats me, I do remember what I've read. May need some mental processing (going through my "reading history" and confirming that I've reached this by searching), but besides a possible surprise factor not really a puzzling question.


This wouldn't necessarily sound bad if Google were open source or hosted on-premises.


> So I wouldn’t say we are losing our ability to remember, as I posed at the start of this post. I think people (me included) just don’t do enough work to move stuff from our working memory into our long-term memory.

I'd say we could be better at remembering that some piece of information exists and where to find it instead of having to memorize it. This seems more powerful.

And for things you do often, you will probably memorize it anyway.

Lack of focus leading to not remembering the content of a meeting is problematic though. I don't have this issue fortunately.


Yep, I've found I'm less effective at remembering fully detailed information but more effective at remembering 'bread crumbs' which let me quickly find the information in documentation or online. It's sort of like I store the index internally and the data table externally.


This is also something I've found when hiring. If the position is time-sensitive (the fires being put out tend to have a sub-one-day time limit), I want an older (45+) person in that position. Not for their experience, but for their ability to recall and store memories.

In my experience, older employees are able to recall information to solve a problem, but the recall may be incomplete, leading to a quicker, but less effective solution. Whereas, younger employees often need more time because they don't specifically remember solutions, but they are able to find, categorize, and process information faster, often leading to a slower, but more complete and robust solution.

Not sure if it's a product of education and upbringing in different worlds, or a product of experience, but it's fascinating to me.


I don't think that is unusual.

I also don't try hard to memorise things I know I can trivially look up.

For other things, I have a directory called `useful_things` containing markdown files broken down by category, which I can quickly grep for that thing I remember I needed but not how to do.
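A toy version of that scheme (in real life it's just `grep -ril <pattern> useful_things/`; the file names and contents below are invented for illustration):

```python
from pathlib import Path

# Category-based markdown notes in a single directory.
notes = Path("useful_things")
notes.mkdir(exist_ok=True)
(notes / "git.md").write_text(
    "# Git\nUndo the last commit but keep the changes: git reset --soft HEAD~1\n"
)
(notes / "shell.md").write_text(
    "# Shell\nFind the largest files: du -ah . | sort -rh | head\n"
)

def grep(pattern: str) -> list[str]:
    """Case-insensitive search across all note files,
    like `grep -ril pattern useful_things/`."""
    return sorted(
        str(p) for p in notes.glob("*.md")
        if pattern.lower() in p.read_text().lower()
    )

# I remember I needed "undo the last commit", but not how to do it:
print(grep("undo the last commit"))  # -> ['useful_things/git.md']
```

The point is the division of labour: the brain remembers only the search phrase, the directory remembers the answer.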


Thank you for saying this - that is what I experience (sub thirty) and I never thought of it that way...

It even comes down to inbox organization. All the older team members here have folders etc. to organize everything. The younger ones: we have one large inbox with everything and just search, by remembering how to look for it ("oh yeah, that email had the word 'altruistic' in it, and Jeff was involved").

If you think about it, that’s just how Google and constant internet connection programmed us... knowing how to find information became more valuable than knowing information.


> more effective at remembering 'bread crumbs'

Fun fact: this is Ken Jennings method for practicing for Jeopardy, mental models of items and triggers with surrounding facts.


> I'd say we could be better at remembering that some piece of information exists and where to find it instead of having to memorize it. This seems more powerful.

[paraphrase] "I wrote it down so I don't HAVE to remember." --quote from Indiana Jones & The Last Crusade

Seems similar to Einstein's "Never memorize what you can look up."


> "Never memorize what you can look up."

I can look up all the definitions and grammar rules of English, but without memorizing them, I wouldn't be able to communicate with anyone around me.


I don't know any of the definitions or grammar rules of English, yet I'm still able to effectively communicate with and understand others.


If you are a native speaker of English, you learned those rules over your whole childhood, by example instead of being explicitly taught the rules.


Agreed - I learned examples, not rules.


> I'd say we could be better at remembering that some piece of information exists and where to find it instead of having to memorize it

This approach is called Transactive Memory, and you do it with Google, with your note-taking software, with your friends and colleagues. You do it with your pet.

One of our biggest employable strengths as hackers is that we know where to find information. We make a habit of learning where to find different kinds of knowledge, then do a deep dive into a particular subject. We are masters of transactive memory.

We as a species are rapidly shifting to a more transactive memory in general as it further compresses our knowledge into a small space by storing metadata instead of the knowledge itself, allowing for rapid acclimation to a given task based on the wealth of knowledge around you.

https://en.wikipedia.org/wiki/Transactive_memory


Sometimes not remembering the content of a meeting has to do with no important content in a meeting.


you just described pointers and RAM :-)
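
A toy Python sketch of the analogy (all the names here are made up): working memory keeps only the cheap pointer, while the bulky data stays in external storage:

```python
# External storage: the full content (docs, Google, your notes app).
external_storage = {
    "regex-lookbehind": "Long article text about lookbehind assertions...",
    "tar-flags": "Full tar man page excerpt...",
}

# The "brain": just breadcrumbs pointing at where the real content lives.
internal_index = {
    "that regex thing": "regex-lookbehind",
    "how to extract archives": "tar-flags",
}

def recall(vague_memory):
    """Dereference a breadcrumb to fetch the full content from external storage."""
    pointer = internal_index[vague_memory]
    return external_storage[pointer]
```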


Wasn’t there a Greek philosopher who lamented how the youngs are losing their ability to remember because this newfangled invention called “writing” is making them lazy?


Plato's Phaedrus (c. 360 B.C.E.)

> ... If men learn [writing], it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written ...

http://www.umich.edu/~lsarth/filecabinet/PlatoOnWriting.html


Yes. For those who haven't come across this factoid, it actually wasn't Plato, but Socrates who famously lamented the harmful effects of writing on memory and teaching. Socrates was a big fan of knowledge transmission through dialogue and discourse.

Socrates never wrote anything down, but Plato did, so ironically now we know about Socrates' disdain of writing through Plato's writings (in this case, the Phaedrus). Quoting a paraphrase from Wikipedia [1]

"... writing can do little but remind those who already know. Unlike dialectic and rhetoric, writing cannot be tailored to specific situations or students; the writer does not have the luxury of examining his reader's soul in order to determine the proper way to persuade. When attacked it cannot defend itself, and is unable to answer questions or refute criticism."

[1] https://en.wikipedia.org/wiki/Phaedrus_(dialogue)#Rhetoric,_...


It's really not obvious how much of ‘Socrates’ in Plato is actually Socrates and how much is Plato trading on Socrates' name.


True, what Socrates is purported to have said in the Phaedrus could well be a concoction of Plato's -- we'll never know for sure.

However, we do know, at least by virtue of the existence of the Phaedrus, that Plato himself did not subscribe to the position attributed to Socrates in it.


Socrates being probably the most quoted person of all time who never wrote anything (giving Homer the benefit of the doubt, since he did leave a couple of epic poems).


The irony of reading this, over two thousand years later, translated into a different language, presumably by someone who is, even after all this time, still able to read the original--it's almost like writing helps us remember, as a species, in ways that would have otherwise been almost completely impossible.


Yes - and writing is a way to extend your memory as well. You can work more complex ideas with yourself, using the paper as storage for what you have already thought through.

Can you imagine doing some even simple Calc-type proofs without paper? Not possible. Many things are like that.

You give up maybe better memory in your head in exchange for being able to make progress that you couldn't have made without the tool of writing.

Humans are ultimately tool-using animals more or less... makes sense that we use tools that are valuable.


If there's one thing technology is good at disrupting, it's tradition


I'd say the opposite. The fact that a species memory has been created by the written word and the printing press means that nothing ever dies, and initial mistakes and the achievement of local maxima can be preserved forever. The tyranny of Aristotle over the Dark Ages was no joke.

Natural senility as an individual is probably no more than the accumulation of calcified habits, rather than completely biological - the ability to abandon old, wrong knowledge is a sign of youth and indispensable to the learning process. The written word has enabled us to achieve senility as a species.


If only they could solve a simple challenge - like summing all the letter codes (and one free letter) in a particular way to find a number of least significance - and the next document would include it. Then tampering with history would be infeasible and easy to check.
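
That's essentially a hash chain: each document commits to a digest of everything before it, so rewriting any earlier document breaks every later link. A minimal Python sketch, with a real hash function standing in for the letter-code sum:

```python
import hashlib

def chain(documents):
    """Return (text, digest) pairs where each digest covers the text
    plus the previous document's digest."""
    prev = ""
    out = []
    for text in documents:
        digest = hashlib.sha256((prev + text).encode()).hexdigest()
        out.append((text, digest))
        prev = digest
    return out

def verify(chained):
    """Recompute the chain and check every stored digest still matches."""
    prev = ""
    for text, digest in chained:
        if hashlib.sha256((prev + text).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True
```

Tampering with any document (or its digest) makes `verify` fail for everything downstream.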


In a way Socrates was exactly right. Consider Buddhism, a philosophy that lasted orally for hundreds of years before the first texts were written. Even today, buddhist monks use chanting and repetition to memorize these teachings, internalize them, and live them.

Sure, you could read these teachings and move on with your life, but can you recite what you just read, or even what you just wrote? Not a chance. Your memory of text is fleeting compared to if you gave focused mental effort committing that text to memory through oral repetition.


Yes, but it's us who live in the era of weaponized attention grabbing.

People are being paid millions to figure out how to hijack your attention, how to use yourself (body, mind functions) against you.

Writing is a tool, it doesn't try to work against you, or have an agenda that might not be aligned with yours.


Greek oration was all about weaponized attention grabbing and getting people to join your school not those other terrible terrible oh so bad schools.

We’ve had weaponized attention grabbing as long as we’ve had humans, I bet.


It was an art of oration.

Not a science-driven exploitation of basic human instincts.

They didn't have focus groups, A/B testing, and neurological studies back then.

That's a bit like saying that the impact on the climate 1000 years ago and now is the same, since in both cases we are releasing carbon into the atmosphere.


It was the closest they had to a science, and it was aimed at manipulating the listener in a predetermined way.

The fact that it was mostly applied at civil politics tells more about their society than about the tool...


Writing is a technology. It's disputable if any technology is inherently neutral or not:

“One of the most dangerous things you can believe in this world is that technology is neutral.”

from https://thecompassmagazine.com/blog/is-technology-morally-ne...


True but slippery slope and all. We need some basic ability lest we devolve control to things not us. We should still be able to get by without technology in emergency situations (natural events causing power outages, etc). We should not be rendered helpless.


Remember that fire was also a technology at one point. Our species is sustained by technology and there's no going back. There's always going to be some baseline we can't live without.


How do you know he wasn’t right and writing indeed worsened our memory? Maybe you’d remember who it was if not for knowing it is written down somewhere.


And that's the question.

Is there more value in being able to remember who the specific philosopher was, or more in knowing how to find that information quickly and easily?

To take it a step further - is there more value in individuals remembering the specific philosopher, or in society, anyone in society being able to find out that same information?


Memory is fundamentally limited anyway. Unless you're going to set up an Anki deck for your Zoom meeting notes, it's unrealistic to try to remember action points from every Zoom call we're on. The brain is an efficient machine, and it will naturally remember things that come up often enough. It works quite well on its own. I don't think I've ever sat down to try to remember how to write a for loop in whatever my preferred coding language is at the time - you just remember it after a few semi-regular lookups.

If you feel your memory is limited in an area you do value instant recall that doesn't inherently produce regular repetition, there are ways to steer your long term memory consolidation. For example, you can train yourself to remember everyone's names when you meet them, if you value that. If you don't value it or put any particular effort into training it, there's no reason to think you've gone senile if you forget the name of most people you meet the first few times.

For most things, I think the second brain solution is ideal. You value something enough to want to be able to recall it at a moment's notice, but you don't have any real need to instantly recall it without reference. We're not taking closed book exams outside of school [1]. This is where all the Zoom notes and book quotes are placed, where you can further digest, interpret and later recall them if and when they become relevant.

[1] https://fortelabs.co/blog/knowledge-building-blocks-the-new-...
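
The "steer your long term memory consolidation" point is what spaced-repetition tools like Anki automate, and the core scheduling idea is tiny: each successful recall pushes the next review further out. A toy Python sketch (the multiplier is illustrative, not Anki's actual algorithm):

```python
def next_interval(current_days, recalled, multiplier=2.5):
    """Toy spaced repetition: grow the review interval on each successful
    recall, reset to one day on a lapse."""
    if not recalled:
        return 1  # forgot it: review again tomorrow
    return round(current_days * multiplier)
```

A handful of well-spaced successes is enough to push an item out to reviews months apart, which is why the technique is so cheap for the things you do decide are worth instant recall.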


Awesome write-up!


I've been wondering the same thing recently. If anyone's interested in learning more about how previous cultures remembered vast amounts of information I would recommend the book I'm currently reading:

Frances Yates - The Art of Memory,

and the one I'm planning to read next:

Mary Carruthers - The Book of Memory.

One of the most interesting aspects for me is how the concept of memory in the middle ages was much more closely associated (sometimes conflated) with imagination than it is now. Most people I know now would consider memory and imagination as two quite distinct mental faculties.


Another good one is called “Moonwalking with Einstein.” Very interesting and entertaining, I highly recommend it.


For anybody interested in diving deeper into this subject, I would recommend reading The Shallows: What the Internet Is Doing to Our Brains. It was written in 2011, but is still very applicable today. Brain plasticity is a real concept and our constant connection to the Internet affects us.

https://www.amazon.com/Shallows-What-Internet-Doing-Brains/d...


Very scary and I do get a sense that my recall ability and general focus is severely diminished after a decade of being active on reddit and other places like here. The problem is I'm on this dopamine treadmill. I see a thread with an interesting topic and a lot of comments, and I want to dive in and spend an hour sifting through it. I check my subreddits and HN every day.

And for what? I get nothing out of it, maybe a little pissed off every now and then. Most articles I read are forgettable and have no impact on my life. There is literally nothing that I've learned on the internet over close to two decades of use that I couldn't learn more thoroughly with a book or by looking at a newspaper, and in recent years it's been getting impossible to find actual factual subject information on the internet that isn't some half assed SEOified article that doesn't even have my answer.

I think I'm just gonna pull the plug and block these websites with some browser extension, because I am just too compulsive at this point, too engrained into this automatic unconscious action of cmd+t->news.y->tab->enter to do it under my own volition. Sometimes I sit down and I don't even remember opening HN, but there I am scrolling through the front page.

I still need the internet for emails, stack overflow, or finding articles in my field, but that sort of use I will allow because I take that information and synthesize it into something novel and useful. I'm going to pull the plug on being a mindless internet consumer, who leaves no time for actual thinking in between the rampant consumption.

Good bye, HN, hopefully for good but we will see how long I'm able to remain disciplined. I'm kinda excited about what sort of mental clarity this might bring me and how much that will improve my life.


Good luck to you! I feel all of the sentiments that you mentioned in your comment. Here are some tips for you that have really helped me:

- Turn off auto-complete on your desktop browser. The "cmd+t > news.y" or "cmd+t > r" is a real catalyst for mindless browsing sites like Hacker News and Reddit.

- Uninstall apps like Reddit, Facebook, or Instagram on your phone. If you really want to access them, then use the web experience through the mobile browser. It's enough to get the job done, but not good enough to be addicting (no auto-play videos, notifications, and the UX is slightly degraded).

- Turn off all non-pertinent notifications on your phone, especially things like news, email, and social media. The only daily apps I have notifications enabled for are Messages, a sports app, and daily habit reminders.

- If I feel like I'm using a social media site too much on a laptop, then I change the hosts file to re-direct to localhost. Bam, access revoked for a while. I've found that this works better than browser extensions. With extensions, I used to just right-click > disable, then go to my time-wasting website. With the hosts file method, I need to figure out the path to the hosts file (I never remember it), open it in a text editor, type in my changes, then save the file with sudo permissions. I thought about scripting it, but I think the manual process is more effective at preventing me from constantly enabling/disabling access. There's more intention behind the action.

- Pay for a newspaper subscription. It's so refreshing to consume quality journalism vs trendy click-bait articles. My recommendation would be read one national outlet (NY Times, Wall Street Journal) and your local newspaper. Instead of browsing Reddit/HN in the morning, open up your newspaper app.


Second this recommendation, I enjoyed that book very much. However, I felt defeated by the end, because I couldn't act on the presented information in any meaningful way.

Halfway through the book the author tells about how much he had struggled to write and had to isolate himself from technology for a while, but only temporarily. He then came back and felt the "shallows" once again.


They did a good follow-up with the author on the Ezra Klein Show podcast a few months back: https://www.vox.com/podcasts/2020/7/1/21308153/the-ezra-klei...


Oh, this is great, thank you.


During the pandemic, spending more time at home and less time having real social interactions has led me to believe I am forgetting some of my vocabulary, or at the least not practicing it enough to keep it in what you describe as "working memory".

Zoom and the like is far from natural and I find myself searching for words during sentences that I know I would not have in the past.


Can you expand on that? And can you describe your living circumstances. I'm relatively isolated due to rural life, and I haven't seen any of that in my life. I tend toward isolation naturally, so maybe that has something to do with it.

But I'm fascinated by your statements. Can you go more in-depth, please?


Before the pandemic, I was living in London shared flat with my partner and a friend of ours. Had quite a busy social life, working in an office in Soho.

Since the pandemic I have moved out of the city and into a more rural isolated area (Did not see any benefit to paying high rent prices in a city when all the facilities were shut down). All of my work has moved to remote working using video conferencing etc.

Before, we would use Slack a lot, but in-person meetings were a common occurrence and I would spend quite a lot of time outside of work with friends or colleagues discussing various topics.

Since being in this isolated environment, I feel that because those interactions are far more rare, when they do happen I struggle to recall words or phrases that were once commonplace in my vernacular.

This is all very anecdotal evidence of course, but it's something I've observed in myself on a number of occasions now.


I pre-date Google and whenever I had technical questions the only way I could get them answered was to explain to the person I was asking exactly what I was doing and why, and they would only answer me if they agreed 100% with what and how I was doing something. If they didn't like either then I didn't get my question answered. This forced me to have to go learn and memorize everything about all aspects of technology just so I didn't get roadblocked by someone having a bad day. Today Google is more of a convenient way to jog my memory - I can function just fine when the internet goes out, read a map if my GPS loses its signal, etc. I don't ever want to be one of those blank people who just stand there and stares when the screen blinks out.


I honestly fear what will happen when we attach more directly to the brain/eye the type of infrastructure and capabilities of a modern smartphone / search engine / assistant device that we now carry in our hands. Like Google Glass or whatever, but more invisible and immersive. I already feel so distracted; memory and focus and analysis sideloaded onto devices, and I'm not even a big phone user.

Kids in school are already struggling with these things. What happens when every child is walking around with facts available instantly and constantly, but no context to manage it?


Considering how many children and adults have undiagnosed ADHD, I wouldn't be surprised in the slightest that advertisers are exploiting these pathologies to great effect. Sidetrack and distract to draw eyes to your product, and with smart glasses you always have that opportunity to introduce an advertisement or track gaze on advertisements ("please watch the ad to continue playing the video!" and pausing the ad when you look away is incoming, I assure you).

Personally that sort of stuff is where I draw the line, since it's a redundant consumer product with things I already own. Technology in recent decades has shown that the only utility it provides is analogous to something that already exists and works fine, because companies would rather ship something fast and 'new' than something clever that took time to think about. Like the smartwatch: $300 to poorly replicate half the things your phone in your pocket can already do, oh and you have to charge it every day unlike your automatic watch that was powered by your moving wrist alone. Or smart glasses, which would only serve to distract me in the middle of whatever I was doing at the time (probably with increasingly intrusive advertising like we see in every piece of technology in recent decades), and once again, is entirely redundant with 1/10th of the functionality that my phone in my pocket can do (including AR).


Exactly, I agree. Think about 100 years from now — our bodies won't have evolved quickly enough (much like our fight-or-flight mechanism dealing with the stresses of today) to be able to cope.


I second, second brain. heh. I invested like 20 hours setting up my Notion just right, and it's so worth it. It's so much more organized than my mucky/intangible brain. I also believe in the GTD philosophy that your brain is for having ideas, not storing them.


I used to think this way at the beginning of my career, and I've almost completely changed course since then. I found that organizing and storing information is somewhat overrated unless you are very, very good at knowing ahead of time whether information you have come across will be useful in the future. If you don't, then it's better to just let your brain do the means-testing there naturally -- does something make the cut and you keep coming back to it month after month, year after year? Good news, you'll probably commit it to memory as your brain needs to do so.

The problem is that if you overindex on what you choose to organize, you just end up with a bunch of junk and then you have to go back and tend it and delete it and figure out what makes the cut or not. It's such a time intensive chore that takes away useful cycles that I'd rather be spending on deep work. My deep work is rarely information retrieval and organization, but instead handling strategic concerns in the moment and planning for the future -- both of which I need to balance and do with proper judgment.

And unfortunately, I do not find setting up a knowledge management system very useful for that. An old-fashioned journal that lets me get out my thoughts in the moment and log it at a point in time is really the best tool for that.


That's a very valid point. I used to call writing in Evernote, writing into the void.

Though the beauty of any system is you design it. You get some kind of say in what's important. I trust that more than randomly depending on your brain. The brain can be a weird place. I have embarrassing memories from my childhood, I don't know if it serves me anymore, but they're there.

Also, I get distracted too easily, having a list of tasks I'm working on keeps me on track.


Let me tell you an anecdote about my personal experience. Let me know if any of this sounds familiar. As the years went by, I began to realize that a lot of the managers/mentors I had worked for and looked up to when I was younger simply did not know what they were talking about. You're too easily distracted -- focus, pick yourself up by your bootstraps, they would say! But they would never talk about how the left hand of the organization and the right hand of the organization didn't agree. And I began to wonder if they were making things up in order to deflect blame away from the leadership structure which included themselves.

Of course, that sinking suspicion turned out to be generally true. When I worked at hypergrowth companies with great leaders (rare to find and whom I must say I somewhat took for granted), guess how often I found it hard to focus? Very rarely. Now, what about all the other places with organizational dysfunction? Oh, it was VERY hard for me to focus. Borderline impossible, really.

It turns out that my to-do lists, my "productivity hacks" my treasured knowledge bases -- all of them were a very specific kind of procrastination I was doing to maintain some semblance of control trying to get work done in an organization that made questionable decisions over my head. I was trying to avoid having the hard conversations with the people who made those decisions because they were stressful and risky and if they went the wrong way I would have to leave. Which I did on more than one occasion.

But since that point, my brain went from being "a weird place" as you phrase it, to being a place that works on my behalf. Let me tell you, it took a lot more work, but it's a lot more convenient this way. You spend a lot of time with your brain -- 8-16 hours of conscious time every day of every year. If it's not playing nicely with you, it might be worth investing in turning it around to work on your behalf. It can take a while (probably 6 years on my part), but I assure you that it's worth the effort. Some of that will pertain just to you in particular -- exercise, seeing a therapist, yoga, mindfulness training, what have you. But some of that will be directly changing your environment.

Changing your environment means working in an environment where you aren't forced to be distracted. Many jobs page you with BS at odd intervals and make it hard to engage in deep focus -- that, or you're turned into a ticket jockey with very little time or agency to think deeply about how to solve problems. Alternatively, your living environment can contribute to this -- are you stuck with housemates or a family which is constantly making it impossible for you to focus? If so, it will be hard to focus no matter what /you/ do until you take out the environmental aggressor.

And beyond that, if you're going to use a tool to help you focus, I'd recommend a schedule builder [1] rather than a to-do list. To-do lists often feel a lot more helpful than they actually are -- it's not a good thing that it feels so "fun" to check off an item. Compartmentalizing your day into intervals which you use to focus on various kinds of work is the ideal because it makes it easier to build the habits you want to build. Habit formation is the key to long term success at just about any pursuit in life.

https://www.nirandfar.com/todo-vs-schedule-builder/


totally agree, this is my notion homepage and it's reduced A TON of stress:

https://imgur.com/a/qTsL3l3


I strongly suspect the article's title should replace "we" with "I" and ought to be an introspective on the process of aging.


Actually a good point!


"Something had to be wrong! I started to notice (increasingly!) my inability to recall trivial things; for example, the action points from a Zoom call, or a quote from a book that I had read a couple of months ago. Surely this can’t be normal?"

Except it's fairly normal under stress for people's memory to function more poorly, and since we're experiencing a global pandemic that has completely upended our normal mode of living, I would expect baseline stress for almost everyone to be up considerably.

So that's my hypothesis, which requires fewer assumptions than "people are losing the ability to remember because of computers," but doesn't result in a blog post where I can talk about Anki or equivalents.


Haven't we seen this topic discussed and dismissed before? I could have sworn...


Yes... probably ... what was the question again?

// This loss of ability to remember is real and personal. To combat it, once a year, I undertake a two to three week digital detox. No devices, no media, only long form traditional books. Takes a week to overcome agitation from not being able to “consume” digital micro-info-bursts on demand. Following that, my brain begins to restore its ability to build and maintain concepts, built up like Jenga towers or houses of cards while reading. That ability remains until I get lazy, quit taking notes by hand and go back to digital.


Once a year is not going to help much, I don’t think.

It’s something you have to exercise with some consistency to keep from atrophying.


Don't know about memory, but without doubt the web defines our access to information, during the pandemic more than ever - with far-reaching consequences for politics, cognitive abilities, history, power balances, market access, etc. Which is why it's so enormously important not to let control over the language(s) of the web, such as HTML and CSS, slide out of our hands, but to retain or regain participation and representation in web standards.


There's a further separation here - let's call them "conscious" and "recognization" - and they are processed by two different parts of the brain. I'm doing a lot of generalizing in this explanation and I'm not attempting to be 100% accurate about where things happen - I'm a programmer/architect, not a brain guy.

And it's not just memory, but the processing or "thinking" of that memory. Let's take just one sense, sight, for our example.

When you see something, the signal eventually gets to your "conscious" mind (there is a slight delay as your brain processes things). But at the same time that information is processed by the "recognization" part of your brain, which will eventually store that information, too, in long term memory. There are a few interesting facts about this that I've seen:

1. There are people who are blind not because their eyes are damaged, but because the connection to their "conscious" is severed. Their "recognization" connection is still intact. Because of this, even though they can't see, they can recognize faces and even stop before running into a wall - the recognization happens even though you're not consciously aware of it. Something like face blindness is (probably) the opposite.

2. When you're in a parking lot looking for your car and you see your friend, you won't recognize them - the "recognization" processing is only single threaded!

3. You can recognize someone you know from much further away than you can consciously make out their face.

4. Recognization looks for very complex patterns - especially faces. Just ask yourself "how" you tell one face from another. That feeling of being watched may be just that there's something face-like in the environment.

5. The recognization will store its information and your conscious mind will look for it where it's stored, but they don't talk directly! So, again, when you see something, that information is sent to both places. But sometimes, especially if you have a chemical issue (e.g. serotonin) (even just a minor issue in that moment, not necessarily a big issue), the recognization process will be fast but the conscious process will be slow. So the recognization will store it, then the conscious brain will check to see if it's stored - and it's there! So your conscious mind treats it as something already seen.


Great explanation, thanks for taking the time to write it!


Has anyone built something to try to automate setting up spaced repetition?

For example, I've looked up numerous PHP functions on www.php.net, numerous Python functions on docs.python.org, numerous Perl modules on metacpan.org or with the perldoc command, and numerous JavaScript functions on MDN.

It would be neat if there was something that could automatically note what I've looked up, and turn that into flash cards for use with something like Anki.
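
A low-tech sketch of that glue (not an existing tool - the lookup log format here is hypothetical): if you can dump your lookups as (query, url) pairs from browser history or a shell wrapper, Anki can import tab-separated text files as front/back flash cards.

```python
import csv

def lookups_to_anki_tsv(lookups, out_path):
    """Write (query, url) lookup records to a tab-separated file that Anki
    can import as two-field notes (front = query, back = where it lives)."""
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter="\t")
        for query, url in lookups:
            writer.writerow([query, url])

# e.g. lookups_to_anki_tsv(
#     [("python str.split maxsplit", "https://docs.python.org/3/library/stdtypes.html")],
#     "cards.tsv")
```

The interesting part (left out here) is capturing the lookups automatically; once you have them, the conversion is trivial.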


My brain is basically a cache for pointers at this point. I always know where to find something I'm trying to remember and I always have my phone on me.


Book Suggestion that goes into this problem and offers solutions, I'm listening to the audio version right now: "Think for Yourself" - https://www.amazon.com/Think-Yourself-Restoring-Artificial-I...


Ordered!


I am experiencing the same problem. As a software developer, I spend all day on my computer and mobile phone. Recently, I have had the feeling that it takes me a while to recall the name of a friend, or even to search for a solution. It could be a side effect of all of the autocomplete and search results which are accessible within my palm.


This sounds more like aging.


I always liked the idea of a second brain, an exobrain done in any form of technology.

I humbly say my exobrain is Evernote, as I throw a lot of things in there after adding some tags. I'm fast approaching the 3000-item mark, which is interesting because I'm gradually losing the capacity to find things, as I can't remember what's in there to be searched/found. Day-to-day things like receipt photos are in some way at the top of my mind, but unique events from past years may be forever buried in there.

Another piece of evidence: I saved an article from 2017, just to find out days later, when I searched for its title, that I had actually saved the same article back in 2017.

Well, I can't even imagine what this second brain will be like in 2027.


I read somewhere that before the advent of paper and books, ancient civilisations would keep epic stories in their heads. After we started writing things down in larger volumes, committing entire "books" to memory was not needed, so it became less prevalent and eventually stopped altogether.

I guess smartphones may further reduce what we need to commit to memory, which will probably have some implications in terms of neuro-plasticity - i.e. it may well shrink (or at least change) parts of our brains.


I like the idea he raises of Google or search in general being an extension of our memory.

I think as technology evolves, we may see a better version of this idea. The ultimate version of this would be what Connor Macleod is gifted at the end of Highlander 2.

He has this ability to see people's thoughts and help them work together to solve huge problems in the world.

We have lots of chat apps that give us instant communication, but it is not very organized or easy to search like the idea of direct access to someone's thoughts.


I've absolutely noticed this. I generally don't remember what I read on HN, or in documentation. What I do remember is the fact that I've read it, a vague sense of where, and hopefully a few keywords to type into Google. I actively try to keep my memory sharp by doing recall exercises, but I think it's a losing battle: when we know we can find something later, we consciously or subconsciously make less effort to remember it.


I have a quirk in my brain that I can't remember strings of numbers if I try to, but if I read it once I'll recall it just fine.

It's like my brain realises I am intending to go back and read the string 2 numbers at a time, over and over again, so it doesn't bother remembering. But if I just read it out once and don't think about it until I need to recall it, my brain has the information available for me.


I used to think I was losing my ability to remember things due to the amount of information I was taking in. I no longer think this is true and was simply my own worry.

I do review certain things I want to remember in my downtime, but I don't think that is somehow inherently changing some larger picture like my information consumption.

Semi-related, do you guys tend to get headaches after long information binges?


Not headaches, but I say that "my brain is tired".


Personal staff and underlings are the solution. And friends and family are a great backup. This system works out well if you do a good job of remembering and meeting their needs. Keep notes on the needs of each person you care about. They will then put up with all kinds of forgetfulness and go out of their way to keep you on track.


Imagine AR memory palaces, and memory boards (like crafty photo boards, or tactile Lukasa[1])? As my memory is heavily spatial, I'm looking forward to exploring.

[1] https://www.google.com/search?q=lukasa&tbm=isch


Knowledge enormous makes a god of me.

Names, grey deeds, dire events, rebellions,

Majesties, sovereign voices, agonies,

Creations and destroyings all at once

Pour into the wide hollows of my brain

and deify me,

as if some blithe wine or bright elixir

Peerless I had drunk,

and so became immortal.

- Keats, from memory.

edit: Ok, I had to double check and I made a mistake!


Pretty sure the three main factors that decrease your ability to retain information are:

* Lack of sleep

* Stress

* Age

Usually all three hit at once, as our career peaks in intensity around our late 30s, while also having kids.


Late 30s is most certainly not old.


As someone over 40 I tend to agree, but it's also old enough to notice the effects on memory retention.


I'm no more concerned about memory than I am that calculators made us unable to add.

I am concerned that people seem increasingly unable to draw coherent conclusions.


Ironically enough, facebook messenger appears to be blocking this link from being shared, thus harming our collective memory of it.


My memory is better now than ever before. I find it even knows how to forget things I don't care about! Great feature.


Author goes from "I can't remember things" to assuming that humans can't remember things. Not especially compelling. Vaguely suspicious the point of the article is to be a covert ad for Obsidian.


Haha, not at all an advert. Simply saying that I use networked thinking apps (Obsidian, Roam Research, etc.) to help take the load off my working memory.


What were we talking about?


What?


We are really good at adapting. Today, my phone and I can remember the stuff I need better than I could without a phone 10 years ago.

Is this really a problem if you think about it from the standpoint of human evolution?


Well, it could be a problem if we are aiming for human happiness. If there is natural selection going on here, then it would be pushing people to become less able to think for themselves in terms of creative solutions. Sort of like how bird species on islands sometimes lose the ability to fly.

If that is the case, it may be really hard for us to actually reverse our destructive trend against the natural world and then we are really in trouble.


Pepperidge Farm remembers
