379 Matching Annotations
  1. Last 7 days
    1. Experts who study the phenomenon say it's due, at least in part, to the widening role of technology.

      Skill-biased technological change is a shift in the production technology that favours skilled over unskilled labour by increasing its relative productivity and, therefore, its relative demand. Traditionally, technical change is viewed as factor-neutral. However, recent technological change has been skill-biased.


      • Violante, G.L. (2016). Skill-Biased Technical Change. In The New Palgrave Dictionary of Economics, 1–6. London: Palgrave Macmillan UK.
      • Siegel, D. S. (1999). Introduction to Skill-Biased Technological Change. In Skill-Biased Technological Change: Evidence from a Firm-Level Survey.
    2. A version of this shift is present in just about any other industry you can name. “As more automation came in, there was more demand on these workers to display social skills. What you now needed was someone who could talk to a customer, who could articulate the problem and problem-solve,” says Raman. But rather than look for candidates with those specific qualifications, “many companies took the easy route of using the four-year college degree as a proxy: ‘I know if they have a degree, they’ll be able to use an iPad. They’ll be able to use Excel’.”

      See Wiblin, R. (n.d.). Economist Bryan Caplan thinks education is mostly pointless showing off. We test the strength of his case. Retrieved March 5, 2021, from https://80000hours.org/podcast/episodes/bryan-caplan-case-for-and-against-education/

    1. If the exam is designed in a way that accounts for collegial discourse and use of resources that problem is solved

      I agree. We change what it means to "cheat".

      I always thought it was odd that you can walk past 2 or 3 healthcare professionals talking about a difficult clinical case and you think, "What a great example of professional development and collegiality". But when we see students doing the same thing, we call it cheating.

    1. if a student has nowhere in their university assessment a chance to recognise their work as socially useful, and to see others recognise it as such, then this is a very diminished educational experience

      Again, see the work of Freire and Giroux for more insight.

    2. I extend the notion of student involvement to think in terms of the whole student: to think philosophically about the relationship between assessment and students’ self-realisation. The issue thus becomes, the extent to which assessment is involved in our students: how it contributes, or not, to their wellbeing, personal and intellectual growth and their development as constructive members of society.

      I'm increasingly drawn to the deep insights of Paulo Freire (Pedagogy of the Oppressed) and Henry Giroux (On Critical Pedagogy), among many others. There is a strong body of literature that presents learning and teaching as a practice that revolves around relationship.

    3. students must understand an assessment system if they are expected to flourish within it

      Think about professional development where it would be obscene to tell someone that they're not allowed to know anything about the process of development.

    4. considering the possibility that a different assessment method may have enabled a better engagement with knowledge.

      Maybe students do better in these assessments because we're getting out of their way and giving them space to learn.

    5. many universities have pursued technocratic solutions that reproduce the trusted orthodoxy of the time-limited, unseen exam as closely as possible.

      Nice presentation by Jesse Stommel on the rise of surveillance: Stommel, J. (2020). Against Surveillance. https://www.beautiful.ai/player/-MNUceT2mRb7ZhRJ_hL7/Against-Surveillance

    6. but fundamental principles not rethought.

      An example of what I thought was a rethinking of assessment: Killam, L. (2020, April 6). Exam Design: Promoting Integrity Through Trust and Flexibility. Insights from Nurse Killam. http://insights.nursekillam.com/reflect/exam-design/

    7. The assessment challenges induced by Covid-19 opened many possibilities to fundamentally rethink why and how we assess, but I see little evidence of this actually happening.

      I think most people are hoping for things to end so that they can just go back to "normal".

    1. we reward people who solve problems while ignoring those who prevent them in the first place

      It's hard to know when a problem has been prevented.

  2. Feb 2021
    1. first considered how we could write a piece for the public later

      This is an interesting suggestion; begin by planning how you will communicate your idea to the public.

    2. the first step in improving academic writing is to learn to reduce the jargon academics use and express concepts clearly

      See Pinker, S. (2015). The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century (Reprint Edition). Penguin Books.

      Also, Thomas, F.-N., & Turner, M. (2011). Clear and Simple as the Truth: Writing Classic Prose (2nd Edition). Princeton University Press.

    1. Greater reach leads to far greater exposure. This can take the form of comments from academics around the world, invitations to collaborate, and TV and radio interviews.

      Possible, but unlikely.

    1. Academics need to start playing a more prominent role in society instead of largely remaining observers who write about the world from within ivory towers and publish their findings in journals hidden behind expensive digital paywalls.

      This is another thing we need to be better at: publishing in open-access journals.

    2. This can help develop creative non-fiction writing skills.

      We don't really think of ourselves as non-fiction writers.

    3. Academics have no choice but to go along with this system. Their careers and promotions depend almost entirely on their journal publication record, so why even consider engaging with the general public?

      Well, there are also good reasons to believe that blogging and other forms of interaction on social media can enhance an academic's formal research output.

    4. Universities also don’t do a great deal to encourage academics to step beyond lecture halls and laboratories. There are globally very few institutions that offer incentives to their academics to write in the popular media, appear on TV or radio, or share their research findings and opinions with the public via these platforms.

      This demonstrates a lack of understanding on the part of institutions. By being more public and engaged (through, for example, blogging and social media interaction), academics add value to their institutions through their affiliation.

    5. Some academics insist that it’s not their job to write for the general public. They suggest that doing so would mean they’re “abandoning their mission as intellectuals”. They don’t want to feel like they’re “dumbing down” complex thinking and arguments.

      And this is why society is increasingly hostile to academics.

    6. many potentially world altering ideas are not getting into the public domain

      Alternatively, we should ask whether the work being done, and never being read, is of any use at all.

      When we consider the number of duplicated studies, or studies that don't contribute anything to the broader literature, we should probably acknowledge the possibility that most published research isn't very useful.

    7. an average journal article is “read completely by no more than ten people”. They write: Up to 1.5 million peer-reviewed articles are published annually. However, many are ignored even within scientific communities – 82% of articles published in humanities [journals] are not even cited once.

      When you think about the enormous amount of time and intellectual energy that goes into the process of getting the project proposal accepted, gathering data, analysing it, writing it up, and getting it through the peer review process, this seems like an awfully big waste of time.

      How is this good for anyone?

    8. Research and creative thinking can change the world. This means that academics have enormous power. But, as academics Asit Biswas and Julian Kirchherr have warned, the overwhelming majority are not shaping today’s public debates.

      I have a real concern that my "value" to my institution lies in how many other academics cite my work. It's like we all live in a bubble where we're just talking to each other.

      Surely it matters more if my work is useful to the much larger public?

    1. forced

      How about "...given the opportunity..."?

    2. By forcing students to write ‘publicly’, their writing rapidly improves

      I don't love the phrasing here: if this is "forcing" then everything we ask our students to do is "forced", in the sense that it is a curriculum requirement. It doesn't fit with building a teaching-learning relationship, or of learning being student-centred.

    3. there is no waste – what starts as a blog, ends as an academic output

      You could also take the position that the blog post is itself an academic output, albeit one that the academy doesn't (yet) formally recognise.

    4. By building blogging, Twitter, flickr, and shared libraries in Zotero, in to our research programmes – into the way we work anyway – we both get more research done, and build a community of engaged readers for the work itself.

      This is linked to the concept of an open scholar: Burton, G. (2009, August 11). The Open Scholar. Academic Evolution. https://www.academicevolution.com/2009/08/the-open-scholar.html

      The open scholar is someone who makes their intellectual projects and processes digitally visible and who invites and encourages ongoing criticism of their work and secondary uses of any or all parts of it–at any stage of its development.

    5. when journal articles proliferate beyond number because they serve the needs of big publishing, rather than academic dialogue – we need to think harder about how we do the job of the humanities

      This also serves universities, though, which are remunerated for the publications of their employees. While citation "counts" for the individual academic, it's the number of publications that "counts" for the institution.

    1. Don't bend yourself to fit the world.

      The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man. - George Bernard Shaw

    1. most students did not report study strategies that correlated with their VARK assessment, and that student performance in anatomy was not correlated with their score in any VARK categories. Rather, some specific study strategies (irrespective of VARK results), such as use of the virtual microscope, were found to be positively correlated with final class grade. However, the alignment of these study strategies with VARK results had no correlation with anatomy course outcomes. Thus, this research provides further evidence that the conventional wisdom about learning styles should be rejected by educators and students alike.

      It's unusual for researchers to make such definitive claims about the outcome of a study.

    1. If we write blogs, we are told, we can communicate our research more effectively. Blogs enhance impact, they are a medium for public engagement. The advocacy goes on… Blogs (and other social media) can point readers to our (real) academic publications, particularly if they are held on open repositories. Blogging it seems is a kind of essential add-on to the usual academic writing and academic publication that we do.

      i.e. we can use blogs to point readers to our "real" academic work.

    2. Blogging helps you to get to the point. The blog post is a small text, not an extended essay. It’s simply not possible to introduce lots and lots of different ideas and make multiple points in a post of a thousand words or less.

      Branchaud, J. (2020, February 27). Write More, Write Small. DEV Community.

    3. Of course, some people do argue – and I’m in this camp – that blogging is in and of itself academic writing and academic publication. It’s not an add-on. It’s now part and parcel of the academic writing landscape.  As such, it is of no less value than any other form of writing. Even though audit regimes do not count blogs – yet – this does not lessen its value. And therefore those of us who engage in bloggery need to stop justifying it as a necessary accompaniment to the Real Work  of Serious Academic Writing. Blogs are their own worthwhile thing.

      i.e. blogs are the academic work.

    1. How do you find time to write? I’ve become fascinated by this question in recent months. Implicit within it is an understanding of ‘writing’ which I’m coming to see as deeply problematic. It treats the creative activity of writing as a matter of temporal budgeting. But how much time does writing take? It obviously depends on what we mean by ‘writing’

      See Golash-Boza, T. (2010, September 4). Ten ways you can write every day. Get a Life, PhD, for ten examples of what "writing" might include.

    2. perhaps it’s getting into the routine of responding to ideas in this way as and when you encounter them.

      Write when the idea arrives instead of simply making a note of it.

      This happens to me all the time. An idea arrives and I'm excited by it. Maybe I'm busy cooking supper or I'm in the shower, so I can't immediately write it down. But I spend 15 minutes exploring it. It's still exciting.

      I finish what I'm doing and write a paragraph to keep track of the idea.

      But when I return to it in a few days it seems a pale replica of the original idea. Less relevant and interesting.

      And I invariably end up deleting the paragraph.

    1. By writing regularly, and for shorter periods (2-3 hours a day),

      This is my ideal; I try to write from 8-10 every morning, Mon-Fri.

    2. One study suggests that academics who write daily and set goals with someone weekly write nearly ten times as many pages as those without regular writing habits.

      See Silvia, P. J. (2018). How to Write a Lot: A Practical Guide to Productive Academic Writing (Second Edition). APA LifeTools.

    3. An important part of being an academic researcher is remembering that you are an author.

      I don't think that many academics think of themselves as authors.

    1. This left branching sentence forces the brain to ‘hold’ a lot of information about what the academic managers are doing before applying it to the action. It’s the kind of sentence that forces the reader to go back to the start after they have finished in order to really understand what is going on.

      See Pinker, S. (2015). The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century (Reprint Edition). Penguin Books for a more detailed discussion of these points.

    1. academic colleagues find out I’m blogging?

      Then they'll be jealous.

    2. I try to keep a note of what I read, which I probably would not do if I was not writing a blog

      I've recently shifted into a frame of mind where I think that, if I'm reading something (something that isn't obviously news or entertainment), I should be making notes. If I'm not making notes, then I'm probably wasting my time reading that particular piece of content.

    3. admitting and correcting mistakes does you no harm

      If anything, it's vital. Journals aren't going to actively seek this out.

    4. But I should be doing research, or reading papers, rather than writing blogs. The main activity that blogging has displaced for me is watching TV.

      It's true that blogging shouldn't take a lot of time. It also shouldn't purely be something that you do in your own time. It's academic work, and the benefits of blogging accrue to the institution as well: directly, through mention of the institution, and indirectly, through the additional skills and networks developed by the blogging academic.

      Having said that, it does take time, which needs to fit in somewhere.

    5. jargon comes so naturally

      See Pinker, S. (2014, September 26). Why Academics Stink at Writing. Chronicle of Higher Education, 16.

    6. writing about contentious issues like austerity it is perhaps too easy to be rude

      I think this is good advice; you can get to a place where you forget that others are reading what you write. And tone, especially for people who don't know you in person, can be hard to convey in your writing.

    7. I do not fancy getting into online debating contests

      No-one does. In more than 10 years of blogging I've never had this happen. Of course, you can move into a space where it's more likely to happen, but I think that would need to be a choice you're making.

    8. I thought my posts would mainly be a useful resource for my students. Things I did not have time to say or elaborate on in lectures

      Blog posts as an addendum to your lectures.

    1. Rather than reducing scholarship to blogging

      This is a straw man; I don't believe that any academics sharing their ideas on blogs had/have any concerns that their scholarship is being "reduced" to anything.

    2. Far from subjugating research to journalism

      Is/was anyone actually saying this? Perhaps an example would give this some credibility.

    3. I share many of the fundamental concerns which I hear expressed about impact and public engagement – particularly the entirely justified fear that this agenda, as well as the broader changes within higher education within which it is unavoidably implicated, threaten the autonomy of academic work. I think there’s a risk that the production of academic knowledge (in the broadest sense of the term) becomes subjugated to the contingencies of the political cycle, particularly as its mediated by funding bodies and other intermediaries.

      Is the author really saying that academics sharing their work as part of a contribution to the public discourse is problematic? And that this threatens academic autonomy? Or am I misunderstanding what's being said? I can't imagine how an academic who shares their work in a more accessible format is putting their autonomy at risk.

    4. space between academic research and journalism

      Maybe it would be useful to explain what the author means by "journalism"? Because if it means writing for the mainstream media (e.g. the New York Times), then academics have been journalists for a long time. Maybe I'm missing something, but I'd have liked the author to go a bit deeper into these two spaces and the continuum between them.

    5. lead many, when confronted with the advocation of academic blogging, to see ‘blogging’ as corrupting ‘academic’

      This hasn't aged well.

    6. does academic blogging dangerously blur the boundary between research and journalism?

      I have no idea why this is "dangerous", nor why this blurring is problematic.

    1. Find the Most Creative People in Your Field and Steal From Them

      Kleon, A. (2012). Steal Like an Artist: 10 Things Nobody Told You About Being Creative (Illustrated edition). Workman Publishing. See Brainpickings article on the book.

    2. There’s almost a direct correlation between how much someone created and how original their work ended up being.

      If you want to have good ideas, start by having lots of ideas. I think that Linus Pauling said this (or something like it).

    3. Art shares a lot of similarities with undervalued stocks. At first, when people hear of a novel idea, a lot of them will laugh it off as ridiculous, outlandish, unnecessary, or just plain dumb. It’s here that the artist “buys” the idea at its low value, then finds a way to refurbish and “flip it” into something of higher value that the world understands and appreciates.

      Look for ideas that are under-valued, invest resources (time, money, energy) into them, so that when their true value is appreciated, you're in a good position to get a return on the investment.

    4. Steve Jobs didn’t invent the personal computer. He didn’t invent the mouse or graphical interfaces. He didn’t invent MP3 players or smartphones. He didn’t invent tablets or laptops or wearables. He literally invented nothing. He just did old things better

      Dixon, C. (2020, October 18). Doing Old Things Better Vs. Doing Brand New Things. Andreessen Horowitz. https://a16z.com/2020/10/18/doing-old-things-better-vs-doing-brand-new-things/

    5. Focus on Doing the Work, Not Flashes of Inspiration

      Johnson, S. (2011). Where Good Ideas Come from: The Seven Patterns of Innovation. Penguin Books.

      Syed, M. (2015). Black box thinking: Why most people never learn from their mistakes--but some do.

      Pressfield, S., & Godin, S. (2015). Do the Work: Overcome Resistance and Get Out of Your Own Way. Black Irish Entertainment LLC.

      Some great books on creativity and doing the hard work that comes after inspiration.

    6. the second part of creative work: adding value

      Try to be useful.

    1. When I’m writing on non-housing topics is that “academic blogging”? Or just an academic blogging?

      Identity again. Is your identity influenced by what you're writing about? Or does what you're writing about influence your identity?

    2. Some academic bloggers leaven the mix by interspersing their ‘academic’ posts with more personal posts about family, biography or travel. I’m not at all averse to that approach, but it isn’t really my style.

      I also find it challenging to share personal information on my blog.

    3. And if the world is going to grasp what’s happening then our writing needs to be digestible.

      You need to use different language when writing on your blog, compared to writing papers. You don't need references. You should write in first person. Spell checking is optional.

    4. An academic blogger may feel constrained to topics only related to his or her academic research, whereas a blogger who is also an academic is free to explore wider fields of discussion.

      This idea of "identity" is important. Many academics don't even think of themselves as authors let alone bloggers.

    1. To achieve a position in the top tier of wealth, power and privilege, in short, it helps enormously to start there. “American meritocracy,” the Yale law professor Daniel Markovits argues, has “become precisely what it was invented to combat: a mechanism for the dynastic transmission of wealth and privilege across generations.”

      Really good interview with Markovits and Sam Harris on the topic of meritocracy.

    1. the best solution for creative blocks isn’t to try to think in front of an empty page and simply wait for thoughts to arrive, but actually to continue to speak and write (anything), trusting this generative process

      I believe that this is something that I experience fairly regularly (but I have no control group, so it's hard to be certain); I write and read and write and read and creative ideas come. I don't go looking for them and I don't wait for them to arrive.

    2. It’s not thought that produces speech but, rather, speech is a creative process that in turn generates thought

      This is a bit like Feynman's suggestion that teaching a concept to someone else is an excellent way to learn it yourself.

    3. Speaking out loud is not only a medium of communication, but a technology of thinking: it encourages the formation and processing of thoughts.

      Talking out loud is a way to develop your thinking. You don't speak fully developed thoughts...you develop your thoughts while speaking.

      See also Matuschak, A., & Nielsen, M. (2019). How can we develop transformative tools for thought? and Victor, B. (2014, December 22). The Humane Representation of Thought.

    1. 3.) Who can cite blogs? Okay, now here comes the real hypocrisy. Although I cite blogs within academic writing, I explicitly forbid my undergraduate students from doing so. Their papers must include only peer-reviewed work unless I specifically approve of a non-peer-reviewed source. Oh, hi Privilege, nice to see you again. The key difference between my students and me (besides, of course, our taste in music and repertoire of Seinfeld quotes), is that I have a Ph.D. and they are working on Bachelor’s degrees. That is, we are differentiated by levels of education, and having a higher level of education gives me the privilege and power to determine the value of a piece of writing, and denies this power and privilege to those with less formal education. To say it out loud feels like the academic equivalent of “Because I Said So.” At the same time, I have been trained in a particular field for several years. I have read the jargon-ridden journal articles, trudged through the 5-chapters-too-long books, and even contributed a few pieces of my own. Moreover, I have been a peer-reviewer, charged with making formal decisions about what is, and is not, a publishable piece of research. And so I take this training and I use it, again imperfectly, as a privilege, allowing myself to discern quality while urging others to wait until they have enough knowledge and practice to make such discernments. What “enough” is, however, remains quite nebulous.

      I agree with this general argument. Again, not all opinions are equal.

    2. one can counter the former point by noting the poor quality of some published articles, problematizing the false-security that comes along with a legitimizing label of “peer-reviewed.”

      Peer review is, in itself, not a guarantee of quality.

    3. What would such an anything-goes literature review look like?

      What indeed? Why not go a bit further and explore what, in fact, this kind of review might look like?

    4. More ambiguous, of course, is the question of using content from these blogs, in their own right, as building blocks or even a foundation for, theoretical arguments.

      Blog content can be used as "data" as part of an analysis, but we should be more careful when using the same content to develop theoretical arguments. I guess because we have to be more cautious when making knowledge claims about the world.

    5. Some of their work is published only in blog form

      Blog posts can be great for pushing the boundaries of thinking and for presenting arguments that fall outside the scope of traditional academic practice.

    6. Should writers cite blog posts in formal academic writing (i.e. journal articles and books)?

      I suppose it depends on who you cite? Not all opinions are equal.

    1. AI agents can acquire novel behaviors as they interact with the world around them and with other agents. The behaviors learned from such interactions are virtually impossible to predict, and even when solutions can be described mathematically, they can be “so lengthy and complex as to be indecipherable,” according to the paper.

      The sheer number of interacting variables that you'd need to track makes it impossible to make any accurate predictions.

    2. it might be killing three times the number of cyclists over a million rides than another model

      Fair enough. But then we should also make the counter-argument...how many motorists did the self-driving car save in the same period?

      I know that this is a tricky ethical scenario and I'm not trivialising it, but these arguments are overly simplistic and one-sided.

    3. Say, for instance, a hypothetical self-driving car is sold as being the safest on the market. One of the factors that makes it safer is that it “knows” when a big truck pulls up along its left side and automatically moves itself three inches to the right while still remaining in its own lane. But what if a cyclist or motorcycle happens to be pulling up on the right at the same time and is thus killed because of this safety feature?

      I think that an algorithm that's "smart" enough to move away from a truck is also "smart" enough to know that it cannot physically occupy the same space as the motorcycle.

    4. A co-author of the paper, Alan Mislove of Northeastern University, is among a group of academic and media plaintiffs in a lawsuit challenging the constitutionality of a provision of the Computer Fraud and Abuse Act that makes it criminal to conduct research with the goal of determining whether algorithms produce illegal discrimination in areas like housing and employment.

      So you're a criminal if you're doing research to determine if an algorithm is doing something criminal? That seems...wrong.

    1. “We are all one giant human-machine system,” says Obradovich. “We need to acknowledge that and start studying it that way.”

      A socio-technical system.

    2. move away from viewing AI systems as passive tools that can be assessed purely through their technical architecture, performance, and capabilities. They should instead be considered as active actors that change and influence their environments and the people and machines around them.

      Agents don't have free will but they are influenced by their surroundings, making it hard to predict how they will respond, especially in real-world contexts where interactions are complex and can't be controlled.

    3. propose to create a new academic discipline called “machine behavior.” It approaches studying AI systems in the same way we’ve always studied animals and humans: through empirical observation and experimentation

      We do this all the time: observe people's behaviour and then make inferences about their intentions.

    4. As algorithms have come to mediate everything from our social and cultural to economic and political interactions, computer scientists have attempted to respond to rising demands for their explainability by developing technical methods to understand their behaviors.

      It's completely bizarre that we have such a high standard for trusting the predictions of algorithms when, up until recently, we trusted human beings to "mediate everything from our social and cultural to economic and political interactions" and had absolutely zero expectation that we needed to understand the reasoning processes in those human beings.

    5. We've developed scientific methods to study black boxes for hundreds of years now, but these methods have primarily been applied to [living beings] up to this point

      It's called psychology.

    1. Koo's discovery makes it possible to peek inside the black box and identify some key features that lead to the computer's decision-making process.

      Moving towards "explainable AI".

    2. Neural nets learn and make decisions independently of their human programmers. Researchers refer to this hidden process as a "black box." It is hard to trust the machine's outputs if we don't know what is happening in the box.

      Counter-argument: Why do we trust a human being's decisions if we don't know what is happening inside their brain? Yes, we can question the human being but we then have to trust that what they tell us about their rationale is true.

    1. You see it in education. We have top-end universities, yes, but with the capacity to teach only a microscopic percentage of the 4 million new 18 year olds in the U.S. each year, or the 120 million new 18 year olds in the world each year. Why not educate every 18 year old? Isn’t that the most important thing we can possibly do? Why not build a far larger number of universities, or scale the ones we have way up?

      Higher education is still an elite institution, for the elite. And despite all the rhetoric about opening up access to it, the fundamental structure of universities prevents it. We can't scale learning.

    2. A government that collects money from all its citizens and businesses each year has never built a system to distribute money to us when it’s needed most.

      Implementing a universal basic income would force governments to build this system where money can flow to citizens.

    1. If part of your job is to deal with a large amount of incoming, you actually need to respond in a timely manner and not let people down

      A huge part of this workflow is managing the incoming information so you're not overwhelmed by it. So you need systems for that.

    2. In your old post you talked about Arnold Schwarzenegger’s open calendar and the upside of having unstructured time in your day and the flexibility you get with that. When Arnold did that interview I think he was in “entrepreneur mode”. At the time he was engaged in lots of entrepreneurial projects and starting lots of new businesses.

      See Maker and manager scheduling (http://paulgraham.com/makersschedule.html).

    3. It’s more by week than by day. The day of the week determines a lot. Monday and Friday have very specific schedules because we run in the rhythm of a venture capital firm

      Time batching and day theming.

  3. Jan 2021
    1. “It’s worth keeping in mind,” he says, “that revving the creative engine to fire at higher speeds . . . means more ideas and more experiments, which also means, inevitably, more failed experiments.”

      Before you can have a good idea, you need to have lots of ideas.

    2. Most accidents never end up being profitable or valuable in a measurable way. But they’re necessary because they’re part of the process of developing something new. Accidents fuel creativity.

      Most random mutations in DNA have no effect, or detrimental effects, but every now and again a random mutation increases survival fitness. Creative accidents work the same way.

    1. patients clearly feel that the process of telemedicine (logistical things like ease of scheduling and making audio/video connections) falls short: while 89% of patients would recommend their provider after having had a telemedicine visit, only 76% of patients would recommend a video visit following a telemedicine visit.

      While F2F interactions may be coloured primarily by the quality of your care, or the cleanliness of your practice, or the helpfulness of your staff, online interactions may be influenced by things completely out of your control e.g. the patient's internet connection, or the quality of the image from their webcam.

    2. attitudes moving from, This provider must not think my problem is important since they are seeing me via telehealth, to This provider cares about me and therefore is seeing me via telehealth.”

      The patient's beliefs around telehealth are going to inform their perceptions of the clinician's level of care.

    3. Among the reasons the telehealth connection seems to resonate with patients is that providers can actually seem more attentive on-screen. One patient commented that while her doctor always seemed distracted by a computer screen during in-person visits, during video visits the doctor looked directly at her.

      Or, more accurately, the doctor was looking at his screen. His email client may have been open and positioned over the patient's face, and it would appear to the patient that he was very attentive.

    1. As new models of care emerge, decision-makers will need to determine where to draw the line between the need for in-person visits versus telecare visits. Hybrid models that combine in-person visits with telecare could improve the efficiency of care and increase patient engagement and convenience while maintaining aspects of the physician-patient relationship for which in-person visits are essential. Providers will also need to pilot different models of timing and length of appointments with telemedicine. Compared to infrequent in-person visits of longer duration, telecare can allow for shorter regular visits to, for example, monitor the impact of a new medication change or to assess prognosis following a hospitalization.

      How will clinicians change their core practices as a result of the changes introduced by telehealth? Will we have the critical awareness to challenge our taken-for-granted assumptions and beliefs, which are fundamental to what we do but which are little more than tradition and legacy?

    2. a thorough onboarding process is more likely to lead to successful adoption of new technologies by older adults

      Will GPs have to develop onboarding processes for new patients? Will these be standardised?

    3. clinics have developed a protocol that requires an initial inventory of what technology is available to each patient, what device they find easiest to use (older adults frequently prefer tablets over computers or smartphones), and what teleconferencing platforms, if any, they are familiar with.

      Are we preparing clinicians to help them prepare patients? What assumptions are we making about patients' levels of digital literacy? Device ownership? Connectivity?

    4. The model should be tailored to each individual based on what technologies they have access to and proficiency with.

      Alternative approaches see software developers looking to build all-in-one services that require patients and clinicians to learn how to navigate them. By adapting to meet the patient where they're most comfortable, clinicians are likely to have better client satisfaction but may themselves need to adapt.

    1. he told me that what he really liked was solving problems. To me the exercises at the end of each chapter in a math textbook represent work, or at best a way to reinforce what you learned in that chapter. To him the problems were the reward. The text of each chapter was just some advice about solving them.

      Changing your mindset can sometimes change everything.

    1. Revisit your favorite readings, videos, or discussions from your field.

      You'll often find deeper wisdom and further opportunities for learning when you come back to the classics. Important work is important because it has depth.

    2. Approach your work without expectations of the result.

      Love the process, not the product.

    3. Branch out into adjacent fields for inspiration and new perspectives.

      This is the goal of keeping a digital garden.

    4. Maintain a ‘Beginner’s Mind.’ The concept of “Shoshin” or “Beginner’s Mind” originates from Zen Buddhism. It asks people to practice going through life with a sense of openness and to avoid being jaded with expectations and preconceived notions. Or as Steve Jobs put it: “stay hungry, stay foolish”.

      Hungry in your pursuit of changing the world. Foolish enough to believe it's possible.

    5. If you’re doing it correctly, you’ll often feel like a beginner over and over again

      Waitzkin: "beginner's mind".

    6. Rather than moving in a straight line, you’ll experience a rapid spurt of skill development followed by lulls of stagnation or even regression. These plateaus can last weeks, months or even years. As we watch peers brush past us and feel our own passion fade away, it can be tempting to give up on our deliberate practice and forgo the path of mastery all together.

      Recognise that your goal of achieving mastery in a practice isn't some kind of race against others; it's not even a race against yourself. It's not a race at all.

    7. Ignore Distraction. The distractions of everyday work and life have a way of whittling hours from the little time you do have. Waitzkin bemoans a culture that emphasizes distraction over focus: “Our obstacle is that we live in an attention-deficit culture. We are bombarded with more and more information on television, radio, cell phones, video games, the Internet. The constant supply of stimulus has the potential to turn us into addicts, always hungering for something new and prefabricated to keep us entertained.”

      Avoid social media.

    8. Ascending fields often means working closely with collaborators. You’ll curtail your success if you can’t effectively work well with others.

      You can't achieve mastery on your own. You need to share and collaborate with others.

    9. Our greatest power over resistance is knowing of its existence and the many forms it can take: complacency, stagnation, distraction, emotional turmoil, rejection, and burnout

      All the reasons that you avoid "doing the work" (Pressfield, Do the Work).

    10. Failure is uncomfortable and makes us feel weak and powerless. It brings us back to feeling like a novice, erasing all the hard work we’ve done.

      I think that Waitzkin refers to this as "beginner's mind".

    11. Greene describes this willingness to put ourselves out there as the “Experimentation” or “The Active Mode”. He suggests the following:“Expose your work to the public for active feedback. If you wait until you’re ready, you’ll never be ready.”

      A bit like what we're trying to do with In Beta.

    12. A senior peer you admire may bring up a key error you’ve made, leaving you embarrassed. On the path to mastery, get comfortable with this feeling of failure.

      Don't only get comfortable with it when it happens; seek it out.

    13. setting goals and milestones can spur your progress forward

      Set arbitrary goals to push you to make progress in your practice. It's too easy to become complacent and convince yourself that you'll do it tomorrow.

    14. “A competitor needs to be process-oriented, always looking for stronger opponents to spur growth, but it is also important to keep on winning enough to maintain confidence.”

      You need to find learning opportunities that allow you to practice just outside of your comfort zone.

    15. However, most performance experts say real-life mentorship is the gold standard. Leonard echoes this sentiment:“Instruction comes in many forms. For mastering most skills, there’s nothing better than being in the hands of a master teacher, either one-to-one or in a small group. But there are also books, films, tapes, computer learning programs, computerized simulators (flight simulators, for example), group instruction, the classroom, knowledgeable friends, counselors, business associates, even “the street.” Still, the individual teacher or coach can serve as a standard for all forms of instruction, the first and brightest beacon on the journey of mastery.”

      How do you create a learning community?

    16. Often the people whose skills you want to acquire are a click away, sharing their thoughts in an interview or tweeting about the early failures they faced in their career.

      I often wonder if it's worth spending time blogging or sharing my thoughts anywhere; of what value is it? But maybe by sharing I'm answering questions that other people have.

    17. Perform self-assessments. Something that separates amateurs from masters is their insistence on regular self-assessment.

      You need to keep writing metrics, if writing is the thing you want to get better at.

    18. There’s comfort in practicing what we’re good at, like using the same narrative arc while writing a short story or studying the scientific theories we’re already familiar with. It’s only when we expose ourselves to the uncomfortable sensation of feeling stupid or incompetent that we grow.

      Coming back to this idea that you have to know your weaknesses in order to improve. It's only by working on areas that you find challenging that you grow.

    19. Greene encourages us to embrace the slog:“Second, the initial stages of learning a skill invariably involve tedium. Yet rather than avoiding this inevitable tedium, you must accept and embrace it. The pain and boredom we experience in the initial stage of learning a skill toughens our minds, much like physical exercise. Too many people believe that everything must be pleasurable in life, which makes them constantly search for distractions and short-circuits the learning process.”

      You have to love the process. Everyone loves the product but if you don't love the process you'll never produce anything that anyone cares about.

    20. By making constant and non-negotiable space in your schedule for deliberate practice, you can continue along the path of mastery even when other priorities arise.

      Like setting aside the first hour (or two) of each day to focus on your practice.

    21. Deliberate practice is focused and forces you to direct your attention, again and again, towards improving your weaknesses.

      Not to be confused with the earlier point about ignoring your weaknesses. That point was about avoiding spending time on things that you don't enjoy or that aren't part of your mastery journey. This point is about improving specific aspects of the practice you're looking to master.

    22. Greene’s advice on making your deliberate practice valuable:“The key, then, to attaining this higher level of intelligence is to make our years of study qualitatively rich. We don’t simply absorb information — we internalize it and make it our own by finding some way to put this knowledge to practical use.”

      You need to find (or create) opportunities to put into practice what you've learned. Knowledge that moves you towards mastery must be useful.

    23. you’re the average of the five people you spend the most time with. Seek out people you can learn from. Welcome the opportunity of being the dumbest person in your circle of acquaintances

      If you're the smartest person in the room then you may be in the wrong room.

    24. Greene describes the importance of this mindset on the path to mastery:“…value learning above everything else. This will lead you to all of the right choices. You will opt for the situation that will give you the most opportunities to learn, particularly with hands-on work.”

      It might also be worth spending a lot of time learning how to learn.

    25. Your syllabus should be bottomless. Always add more to it. One insightful book may reference another that you need to devour. An artist who inspires you may cite their influences, leading you down another rabbit hole.

      Avoid looking for the quick and easily digestible content on blogs and social media. Find the difficult works that will form a natural barrier to others pursuing the same goal. If you can master hard things, it makes it that much harder for others to follow in your path.

    26. If you want to go into advertising, study David Ogilvy. If you want to study design, ingest the work of Dieter Rams. To understand nature writing, read Rachel Carson. Build a foundation of knowledge that includes the classics of your field.

      This also helps you to learn what has already been done. There's wisdom in ideas that are still around; there's a reason that people still refer to Seneca.

    27. Greene provides this poignant description of how our dreams may diminish as we move from adolescence to adulthood:“You possess a kind of inner force that seeks to guide you toward your Life’s Task—what you are meant to accomplish in the time that you have to live. In childhood this force was clear to you. It directed you toward activities and subjects that fit your natural inclinations, that sparked a curiosity that was deep and primal. In the intervening years, the force tends to fade in and out as you listen more to parents and peers, to the daily anxieties that wear away at you. This can be the source of your unhappiness—your lack of connection to who you are and what makes you unique. The first move toward mastery is always inward—learning who you really are and reconnecting with that innate force. Knowing it with clarity, you will find your way to the proper career path and everything else will fall into place.”Reflecting on what we loved to do as children can be a powerful exercise that brings us closer to the master’s path.

      It's something you feel compelled to do.

    28. Life’s Task

      What is my life's task?

    29. Ten thousand hours is a non-trivial amount of your life to set towards a singular focus

      No matter what the specific timeframe is, you need to recognise that this is going to take both time and effort. You'll probably have to give something up to make space in your week.

    30. given your weekly schedule

      If something is important to you, and getting better is at least partly a function of the time you spend getting better, it follows that you need to carve out more time for it. So rather than looking at your schedule to see where you fit your practice in, start by asking how much practice you want or need, and then reworking your schedule around that.

    31. Greene suggests that achieving mastery might take around 10,000 hours. Malcolm Gladwell echoes this in his book, Outliers, citing studies and examples of world class performers that indicate 10,000 hours is the “magic number of greatness”. The precise number of hours required to reach mastery is a cause of controversy, and other performance experts have noted that it’s the quality of practice that’s important, not the amount. It’s also important to note that not all hours are equal. Hours 1-100 will be much less effective than hours 8000-8100.

      It might be more useful to focus on concrete improvements in specific areas of practice, rather than on the number of hours.

      It should suffice to know that achieving mastery will probably take decades and not months or years.

    32. Mastery often means toiling for years in obscurity, learning the basics of your field, and slowly making the transition from novice to master.

      And if you spend this time constantly looking for external reward, it's likely going to get tedious quickly.

    33. If you stay the path, you’ll likely work with brilliant people, have enriching conversations, and produce acclaimed work. View these rewards as a byproduct of mastery, not its final form.

      Over time your pursuit of mastery becomes its own reward.

    34. While mastery often leads to professional success and money, pursuing mastery for those reasons is self-defeating

      Internal goals and motivation are more sustainable in the long-term.

    35. Greene summarizes the sensation of mastery as “the feeling that we have a greater command of reality, other people, and ourselves.” He adds that a master at work often experiences a “feeling of power”, “exceptional creativity”, and a “sense of control”. Experiencing mastery is turning away from distraction and getting lost in the feeling of sharpening our skills, honing our intuition, and applying our knowledge to our work.

      There's a qualitative component to the feeling of having mastered a skill or process.

    36. flow

      Csikszentmihalyi, M. (2008). Flow: The Psychology of Optimal Experience (1st edition). Harper Perennial Modern Classics.

    37. “Mastery is not about perfection. It’s about a process, a journey. The master is the one who stays on the path day after day, year after year. The master is the one who is willing to try, and fail, and try again, for as long as he or she lives.”

      You're never finished. You can always improve.

    38. “Mastery is not a function of genius or talent. It is a function of time and intense focus applied to a particular field of knowledge.”

      Process over product.

    39. That’s not to say that genetics isn’t real or important — the proclivities and traits you’re born with are a natural starting point for mastery

      Someone who is short simply isn't going to be a professional basketball player. Obviously anyone can enjoy - and be good at - basketball; but only tall people master it.

    40. It’s not a synonym for genius or giftedness. Those kinds of words suggest that greatness lies beyond our control.

      Mastery is a set of skills that you can get better at.

    41. mastery is the years and decades of learning, practice, failure, and hard-fought improvement that lead individuals toward unmatched greatness

      We tend to label something as "masterful" when we see it, but we only see the product and not the process. But it's the process that matters.

    1. Introduce students to the “explode to explain” strategy. When students “explode to explain,” they closely read a key sentence or two in a source, annotate, and practice explaining what they are thinking and learning.

      This is a specific strategy to include in an active reading session.

    1. But if something opens the drawer and takes out a block and says, “I just opened a drawer and took out a block,” it’s hard to say it doesn’t understand what it’s doing.

      This depends entirely on how you define "understand". Is there a difference between something behaving as if it understands, and actually understanding?

    1. unlike a traditional computer, a blockchain computer can offer strong trust guarantees, rooted in the mathematical and game-theoretic properties of the system. A user or developer can trust that a piece of code running on a blockchain computer will continue to behave as designed, even if individual participants in the network change their motivations or try to subvert the system. This means that the control of a blockchain computer can be placed in the hands of a community
    1. They have backed the Australian government into a corner, and now the government has no option but to reject their bullying threats and move forward with the code, or surrender our democratic processes to big tech

      I'm not sure why there's a need for all the bluster. Google can leave the country and Australia will be just fine.

    2. they ultimately put their commercial interests ahead of the democratic processes of the nations they operate in

      This seems weird. Google has a fiduciary responsibility to make money for its shareholders. Yes, they should put their commercial interests first.

      Because it's a bullshit argument to say that it's "profit" OR "democracy". This simply isn't true. Google is not synonymous with democracy and to say that it is is not helping the argument.

    3. Is the company willing to blockade the entire Google ecosystem to spite Australia?

      Let them. Then Australians can show the world that life goes on without Google. They can demonstrate just how effective the alternatives are. Then people will voluntarily stop using Google. Then Google changes its business model or goes away.

    4. What would become of other services like Google Maps, Google Docs and Gmail

      There are plenty of mapping, collaborative writing, and email options available.

    5. There is no doubt that the implications of pulling Google Search in Australia would be huge

      How so? Millions of well-functioning people in first-world, high-functioning democracies get by just fine without Google search.

    6. with Google having a monthly audience of 19 million and Facebook 17 million in Australia alone

      What happens if Google did pull out of Australia? And then imagine the UK adopted a similar law, and Google pulled out of the UK. Then other EU countries. At what point does Google start losing enough money because no-one is looking at ads on their services? Google exists to display ads, which they can only do when people use their products. I'd love to see someone call this bluff.

    7. For most Australians, Google and Facebook are the internet, or at the very least, the key gateway to it

      If Google and Facebook pulled out of Australia, I imagine it would be like a soothing balm. People would learn how many other useful and interesting tools are available for surfacing news.

    8. content is accessible to everyone, distribution is fair and democratic, and no one platform or provider has unfair advantage

      It seems weird that these two are saying that Google and Facebook are somehow bulwarks against the tide of single platforms dominating everything...wait, what?

      Again, Google and Facebook surface content that is created elsewhere. We can get the content without Google and Facebook. DuckDuckGo (among others) is awesome. And the filter bubble of Facebook is partly responsible for the insanity that's been unfolding in America.

    9. Tim Berners-Lee (the inventor of the web) and Vint Cerf (the “father of the internet” turned Google executive), both made submissions to the Senate inquiry

      This makes me sad. I thought these two were better than this.

    10. disinformation online

      Absolutely. And this was accelerated and supported by Google and Facebook. Misinformation flows faster because of these companies.

    11. This is happening in an environment where credible news and public interest journalism is more important than ever - providing accurate and timely information during crises like last summer’s Australian bushfires and throughout the public health challenges of the global pandemic

      This is absolutely true. It is also true that none of these things requires Google or Facebook.

    12. Facebook too reinforced earlier threats to prevent Australian users sharing news

      Because there's no other way that news could be shared, right?

    13. bombshell

      Again, melodrama. Is The Guardian also guilty of sensationalising news?

    14. social media

      Is Google a social media platform?

    15. catastrophic impact on the news media

      It's a bit more nuanced. We're the ones who want free stuff, which drives the advertising model, which incentivises clicks. Google is the dealer but we're the addicts.

    16. chilling to anyone who cares about democracy

      This is a bit melodramatic. This is a search engine we're talking about. Actually, it's an advertising and marketing company, who also does search.

    1. Keep your identity small

      Blog post by Paul Graham (http://www.paulgraham.com/identity.html)

    2. Learn keyboard shortcuts. They’re easy to learn and you’ll get tasks done faster and easier

      I've recently started doing this and it's true.

    3. How you spend every day is how you spend your life
    4. Remember that you are dying
    1. To solve this problem, the mind creates maps of reality in order to understand it, because the only way we can process the complexity of reality is through abstraction

      We have to reduce complexity to concepts we can work with.

    1. Despite some implementation challenges, patient portals have allowed millions of patients to access their medical records, read physicians’ notes, message providers, and contribute valuable information and corrections.

      I wonder if patients can edit - or at least flag - information in their record?

    1. Leadership is not necessarily coming up with all the answers

      Not only does this mean that you have to work with your team but it also takes the pressure off of you to always come up with the solutions to everyone else's problems. By all means, give input if you're asked but don't feel pressured to provide answers to all questions.

      When you answer a question, people will feel like the decision has been made and further discussion will be closed off.

    1. Help is coming in the form of specialized AI processors that can execute computations more efficiently and optimization techniques, such as model compression and cross-compilation, that reduce the number of computations needed. But it’s not clear what the shape of the efficiency curve will look like. In many problem domains, exponentially more processing and data are needed to get incrementally more accuracy. This means – as we’ve noted before – that model complexity is growing at an incredible rate, and it’s unlikely processors will be able to keep up. Moore’s Law is not enough. (For example, the compute resources required to train state-of-the-art AI models has grown over 300,000x since 2012, while the transistor count of NVIDIA GPUs has grown only ~4x!) Distributed computing is a compelling solution to this problem, but it primarily addresses speed – not cost.
    2. artificial intelligence seems to be the future of software

      Is this because AI will write the software? At some point the programmes (and data they need) will be too complex for human beings to understand.

    1. Ideas cause ideas and help evolve new ideas. They interact with each other and with other mental forces in the same brain, in neighboring brains, and thanks to global communication, in far distant, foreign brains.

      Steven Pinker said that writing is a way for one mind to cause ideas to happen in other minds.

    1. the commonplace book has been particularly beloved by poets, whose business is the revelation of wholeness through the fragmentary

      Gestalt: the whole is greater than the sum of the parts. See also, emergence in chaos theory and complexity.

    2. Long before there was the Internet, there was the commonplace book — a creative and intellectual ledger of fragmentary inspirations, which a writer would collect from other books and copy into a notebook, often alongside his or her reflections and riffs. These borrowed ideas are in dialogue with the writer’s own imagination and foment it into original thinking. Over long enough a period of time — years, decades, often a lifetime — the commonplace book, while composed primarily of copied passages, comes to radiate the singular sensibility of its keeper: beliefs are refined, ideas incubated, intellectual fixations fleshed out, and the outlines of a personhood revealed. (Brain Pickings is, in an unshakable sense, a commonplace book.)
    1. Joy is not a function of a life free of friction and frustration, but a function of focus — an inner elevation by the fulcrum of choice. So often, it is a matter of attending to what Hermann Hesse called, as the world was about to come unworlded by its first global war, “the little joys”; so often, those are the slender threads of which we weave the lifeline that saves us.

      See Maurice Sendak's suggestion that we "sit quietly by a little stream and listen".

  4. Dec 2020
    1. there is no internet connection (Wi-Fi) in the hospital where clinical learning takes place, which was a major factor that participants reported as discouraging them from implementing learning technologies

      Why do you assume that the online component needs to be implemented while on the clinical site? What prevents students and facilitators going online after leaving the clinical site?

    2. However, the opposite was actually experienced in this study. Students reflected a lack of interest in and ability to participate in blended learning activities which seemed quite frustrating to most participants

      Because there is nothing "good" about blended learning in itself. All teaching and learning activities, if implemented poorly, have poor outcomes. Students disengage when the activity has no value (or perceived value) regardless of the medium in which the activity takes place. It has little to do with technology.

    3. purpose of online learning and digital technologies is to complement face-to-face teaching

      This is not accurate. It may be true depending on the context, but it's not inherently true.

    4. The literature has reported that blended learning approaches lead to some measure of improvement in clinical skills such as history taking and clinical reasoning

      But only if online tools are blended or integrated with what is happening in the clinical or practical context. It's not about trying to replace F2F activities with online activities.

    5. This is here, and these are the ways that you will benefit from it.

      Make sure that the discussion integrates some of the theory that was discussed in the literature review.

      The researcher does include some of these concepts in the discussion chapter.

    6. flipped classroom approach

      Is the researcher suggesting that a flipped classroom approach is a subset, or type, of blended learning? Bear in mind that nothing mandates that a flipped classroom approach includes a digital component, i.e. you can have a flipped classroom without being online. And blended learning does suggest that there is an online component, so it's hard to see how flipped learning is related to blended learning. Maybe this needs to be explored further.

      The researcher does address this juxtaposition later in the thesis.

    7. also that they can download the app

      It has to be about more than apps.

    8. The importance of faculty development in the process of implementing learning technologies was highlighted in this section

      What problem were they trying to solve? It sounds like these participants are being asked (or perceive that they are being asked) to introduce technology for the wrong reasons, which is clearly influencing their attitude towards it.

    9. participate influence participants’ implementatio

      Missing, or extra, words in this sentence?

    10. sometimes a little bit embarrassing to admit that I can’t do it

      There's a level of vulnerability being exposed in some of these quotes, that I hope the researcher picks up on in the discussion.

    11. That’s all that I tried, and I got such a fright that I stopped... So again, for fear of wasting time and embarrassment, rather leave it.

      It doesn't seem like anyone has suggested to these participants that they should only use technology to solve problems they are actually experiencing, and only when it offers a simpler solution than the alternatives.

    12. Bandura (1997)

      This is discussion and interpretation of findings, which should be in the next chapter.

    13. perceived usefulness (PU) and perceived ease of use (PEOU) (Davis

      We tend not to include citations in the Results chapter. This should be moved to the Discussion chapter.

    14. and teaching is a smaller part of it

      So are they really clinician-teachers, or are they clinicians who teach when they can?

      The researcher is not really engaging with the participant responses, seeming to accept them as is.

    15. Because it’s always like the tenth thing that you have to squeeze in for the day

      It's always an add-on, which obviously means that it will always be prioritised accordingly.

    16. I'm a doctor, not a teacher

      Weird that you can't be both. Imagine saying that you're a parent but that you can't teach your children.

    17. and then they can come back

      It's about this regular, iterative engagement, trying to achieve one objective through an integrated approach.

    18. opinion that blended learning could be useful by including the online component of blended learning to prepare students but keep the best features of face-to-face to teach students clinical skills

      The one voice of reason.

    19. that not all technologies are able to facilitate

      Nor are they meant to. Why are these participants so confused about what blended learning is? What's the value in reporting findings that are supposed to illuminate, when these participants are so confused? The only way to resolve this is to use the responses to show how confused they are.

    20. is not suitable for a practical subject where skills and knowledge need to be demonstrated and applied

      Obviously. And no-one is suggesting that it should be done. These people are finding problems where they don't exist, simply because they misunderstand what we're talking about.

    21. There was a feeling that some learning can only be facilitated in a face-to-face setting with the student in the clinical area

      I wonder if this is because there's a misunderstanding about what blended learning is and when it might be used. This isn't about trying to force technology into a place where it doesn't work well. I'm a bit at a loss as to why this is so prevalent in the thesis. Even the researcher seems not to have grasped that these quotes from participants demonstrate a very limited understanding of what blended learning is, which makes the findings very difficult to support.

    22. because they can engage with me after the interaction and ask questions, because often after a delivery of an interaction workshop, tutorial, lecture, I have that group that comes and stands next to me and I know, okay. That’s the time that I appreciate most, because now immediately I know whether I have pitched what I said at the right level, based on the questions that I'm confronted with

      There's nothing stopping you from still doing this. Integrating technology isn't a replacement for the things that you can do well in F2F.

    23. You see, now I have to type that feedback, whereas I would prefer to have spoken that feedback to them

      It's not about what you prefer; it's about what works best for student learning.

    24. which technology cannot necessarily provide

      Exactly, which is why you don't try to have the technology replicate what's possible in F2F sessions. You use the technology to improve on what F2F can't do well, e.g. collaborating asynchronously, or getting input from international colleagues.

    25. I only have one week, so I have to get it right [master the tool] in that one week

      No, you could prepare beforehand. And next year you wouldn't have to do the same amount of preparation because you'd already have done it.

    26. Get the student to watch a video beforehand, read an article online before, and then you go to class and then you can just facilitate, they contribute a lot more

      So basically, you're saying that blended learning is a flipped classroom approach. If that's the case, then why do we have a different term for it?

    27. That’s also a form of blended learning

      Who says?