10 Matching Annotations
  1. Jun 2024
    1. Testing culture also discourages deep reading, critics say, because it emphasizes close reading of excerpts, for example, to study a particular literary technique, rather than reading entire works.

      Indeed. But testing in general, as it is currently done in modern formal education, discourages deep learning in favour of shallow learning.

      Why? Because graded tests push students to start studying at most three days before the test, so the knowledge ends up in short-term memory rather than long-term memory. This renders the process of learning virtually useless, even though they "pass" the curriculum.

      I know this because I was such a student myself, and I saw the same pattern in virtually every other student I met, and I was in HAVO, a level not considered "low".

      It does not help that teachers, or the system, expect students to know how to learn (efficiently) without it ever being taught to them.

      My message to the system: start teaching students how to learn the moment they enter high school.

  2. May 2024
    1. Matthew van der Hoorn Yes, totally agree, but it could be used for creating a draft to work with; that's always the angle I try to take, but I hear what you are saying, Matthew!

      Reply to Nidhi Sachdeva: Nidhi Sachdeva, PhD Just went through the micro-lesson itself. In the context of teachers using it to generate instruction examples, I do not argue against that. The teacher does not have to learn the content, or so I hope.

      However, I would argue that the learners themselves should try to come up with examples or analogies, etc. But this depends on the learner's learning skills, which should be taught in schools in the first place.

    2. ***Deep Processing*** -> It's important in learning. It's when our brain constructs meaning and says, "Ah, I get it, this makes sense." -> It's when new knowledge establishes connections to your pre-existing knowledge. -> When done well, it's what makes the knowledge easily retrievable when you need it. How do we achieve deep processing in learning? 👉🏽 STORIES, EXPLANATIONS, EXAMPLES, ANALOGIES and more - they all promote deep meaningful processing. 🤔 BUT, it's not always easy to come up with stories and examples. It's also time-consuming. You can ask your AI buddies to help with that. We have it now, let's leverage it. Here's a microlesson developed on 7taps Microlearning about this topic.

      Reply to Nidhi Sachdeva: I mostly agree, but I would advise against using AI for this. If your brain is not doing the work (the AI is coming up with the story/analogy), it is much less effective. Dr. Sönke Ahrens already said: "He who does the effort, does the learning."

      I would bet that Cognitive Load Theory would also show that there is much less optimized intrinsic cognitive load (load stemming from the building or automation of cognitive schemas) when another person, or the AI, is thinking of the analogies.


      https://www.linkedin.com/feed/update/urn:li:activity:7199396764536221698/

  3. Jan 2024
    1. Deep processing is the foundation of all learning. It refers to your ability to think about information critically, find relationships, make sense of new information, and organise it into meaningful knowledge in your memory.
  4. Sep 2023
    1. (1:20.00-1:40.00) What he describes is the following: Most of his notes originate from digital reading using hypothes.is, where he reads material online and can annotate, highlight, and tag to help his future self find the material by tag or bulk digital search. He calls his hypothes.is a commonplace book that is somewhat pre-organized.

      Aldrich continues by explaining that in his commonplace hypothes.is his notes are not interlinked in a Luhmannian Zettelkasten sense; instead, he "sucks the data" right into Obsidian, where he plays around with the content, does some of that interlinking, and massages it.

      Then the best of the best material, or that which he is most interested in working with or writing about, is converted into a more Luhmannesque type of Zettelkasten, where it is much more densely interlinked. He emphasizes that his Luhmann Zettelkasten consists mostly of his own thoughts and is very well-developed, to the point where he can "take a string of 20 cards and ostensibly it's its own essay and then publish it as a blog post or article."
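      The hypothes.is-to-Obsidian step of this pipeline can be sketched in code. This is a minimal, hypothetical rendering step: the record shape follows the public Hypothesis search API (https://api.hypothes.is/api/search), but the sample annotation and the markdown layout are illustrative, not Aldrich's actual setup.

      ```python
      # Sketch: turn one Hypothesis annotation (shape based on the public
      # api.hypothes.is/api/search response) into a small markdown note
      # suitable for a vault like Obsidian. The sample record is synthetic.

      def annotation_to_markdown(ann: dict) -> str:
          """Render one annotation dict as a small markdown note."""
          title = (ann.get("document", {}).get("title") or ["Untitled"])[0]
          # The highlighted passage lives in a TextQuoteSelector, when present.
          quote = ""
          for target in ann.get("target", []):
              for sel in target.get("selector", []):
                  if sel.get("type") == "TextQuoteSelector":
                      quote = sel.get("exact", "")
          lines = [f"# {title}"]
          if quote:
              lines.append(f"> {quote}")      # the highlighted passage
          if ann.get("text"):
              lines.append(ann["text"])       # the annotator's own note
          if ann.get("tags"):
              lines.append(" ".join("#" + t for t in ann["tags"]))
          lines.append(ann.get("uri", ""))    # link back to the source
          return "\n\n".join(lines)

      sample = {
          "uri": "https://example.com/article",
          "text": "My own comment on the passage.",
          "tags": ["zettelkasten", "learning"],
          "document": {"title": ["An Example Article"]},
          "target": [{"selector": [{"type": "TextQuoteSelector",
                                    "exact": "the highlighted sentence"}]}],
      }
      print(annotation_to_markdown(sample))
      ```

      A real sync would fetch pages of such records from the search endpoint (authenticated with an API token) and write one file per annotation into the vault.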

  5. Aug 2023
    1. The essence of this video is correct: active learning, progressive summarization, deep processing, relational and analytical thinking, even evaluative thinking.

      Yet the implementation is severely lacking: marginalia, text writing, etc.

      Better would be the use of mindmaps or GRINDEmaps. I personally would combine it with the Antinet, of course.

      I do like this guy's teaching style though 😂

  6. Jul 2023
    1. We prioritize what we see versus what we hear; why is that? Now, what comes to mind when I say that is when somebody is saying no but shaking their head yes. And so we have this disconnect, but we tend to prioritize the action and not what we're hearing, so something that we visually see instead of what we hear.

      Speaker 1: There isn't a definitive answer on that, but one source of insight on why we do that is that it could be related to the neurological real estate that's taken up by our visual experience. There's far more of our cortex, the outer layer of our brain, that responds to visual information than to any other form of information.

      (13:36) Perhaps this is also why visual information is so useful for learning and cognition (see GRINDE)... Maybe the visual medium should be used more in instruction instead of primarily auditory lectures (do take into account redundancy and other medium effects from CLT though)

  7. Jun 2023
    1. When it comes to thinking, the Zettelkasten solves an important issue: the problem of scope, which current mindmapping software such as Concepts cannot address.

      Mainly, the Zettelkasten allows you to gain a bird's-eye, holistic view of a topic, branch, or line of thought, while at the same time allowing you to gain a microscopic view of an "atomic" idea within that thought-stream, thereby creating virtually infinite zoom-in and zoom-out capability. This is very, very beneficial to the process of deep thinking and intellectual work.

    2. Recent work in computer vision has shown that common image datasets contain a non-trivial amount of near-duplicate images. For instance CIFAR-10 has 3.3% overlap between train and test images (Barz & Denzler, 2019). This results in an over-reporting of the generalization performance of machine learning systems.

      CIFAR-10 performance results are overestimates since some of the training data is essentially in the test set.
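      The kind of train/test overlap described above can be illustrated with a toy near-duplicate check. This is a minimal sketch using a simple average hash on synthetic grayscale arrays; it is not necessarily the method used by Barz & Denzler, and the function names, threshold, and data here are all illustrative.

      ```python
      # Toy near-duplicate detection between a train and a test split.
      # A real audit would use a stronger perceptual hash or learned
      # embeddings; all data here is synthetic.
      import numpy as np

      def average_hash(img: np.ndarray, size: int = 8) -> int:
          """Downscale to size x size by block-averaging, threshold at the mean."""
          h, w = img.shape
          img = img[:h - h % size, :w - w % size]          # crop to a multiple
          blocks = img.reshape(size, img.shape[0] // size,
                               size, img.shape[1] // size).mean(axis=(1, 3))
          bits = (blocks > blocks.mean()).flatten()
          return int("".join("1" if b else "0" for b in bits), 2)

      def find_overlap(train, test, max_hamming=8):
          """Indices of test images whose hash nearly matches some train hash."""
          train_hashes = [average_hash(x) for x in train]
          dupes = []
          for i, x in enumerate(test):
              hx = average_hash(x)
              if any(bin(hx ^ ht).count("1") <= max_hamming for ht in train_hashes):
                  dupes.append(i)
          return dupes

      rng = np.random.default_rng(0)
      train = [rng.random((32, 32)) for _ in range(5)]
      # test[0] is a lightly perturbed copy of train[2]; the rest are fresh.
      test = [train[2] + rng.normal(0, 0.005, (32, 32))] + \
             [rng.random((32, 32)) for _ in range(4)]
      print(find_overlap(train, test))  # the perturbed copy should be flagged
      ```

      Unrelated random images differ in roughly half of their 64 hash bits, so a small Hamming-distance threshold flags only near-copies; this is the sense in which "some of the training data is essentially in the test set" can be measured.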

  8. Mar 2021