13 Matching Annotations
  1. Last 7 days
    1. he need to align AI use with instructional goals, so automation supports rather than substitutes higher-order learning. In practice, this may involve structured training in prompt engineering, systematic verification of outputs, and reflective activities encouraging students to evaluate GenAI contributions critically.

AI use should have some limits, not be eliminated entirely.

    2. Students were divided on the value of GenAI for synthesizing conclusions, acknowledging its capacity to generate new perspectives while critiquing its tendency to oversimplify and produce biased outputs [3,11].

Some students valued GenAI for synthesizing conclusions, while others did not.

    3. Conversely, students with higher ratings, while acknowledging GenAI’s limitations and the need for critical evaluation, appreciated its capacity to suggest novel perspectives that could serve as starting points for more detailed analysis and development (“I only used it as a base to write my own conclusion,” (Chemistry), “Useful to check that what I concluded was related to my objectives and hypotheses,” (Business Administration), and “To explain results I wasn’t expecting, and, in general, to explain what I obtained,” (Psychology)).

Students used AI only to support their work, not to do it entirely.

    4. Post-session discussions revealed that no student expressed complete trust in GenAI for bibliographic research. Some highlighted useful summaries and idea generation (“If I’ve already found some articles,

No student fully trusted AI for bibliographic research.

    5. Chan and Hu [8] observed that student reliance on ChatGPT intensifies during the preliminary stages of academic work—such as brainstorming or outlining—when immediate feedback can jump-start the creative process.

Students rely on AI more in the early stages of their work.

    6. In contrast, other studies suggest that ChatGPT can function as a motivational scaffold, enhancing creative confidence and learner autonomy [26]. This dichotomy highlights the nuanced role that generative AI plays in learning.

AI can act as a motivational scaffold in learning, building creative confidence.

    7. Tools such as ChatGPT support full composition—ideation, structure, drafting, and revision—by providing immediate feedback, personalized guidance, and opportunities for iterative refinement

ChatGPT is a useful tool across the whole writing process.

    8. These capabilities build on earlier educational AI systems—such as intelligent tutoring—that have improved student outcomes [2]. However, widespread use raises concerns about academic integrity, epistemic reliability, and ethical governance [3,4].

AI helps students, but there are concerns about its reliability.

    1. The relationship between faculty and students is like the relationship between a river and its water: In the short term, the river tells the water where to go, but in the long term, the water tells the river where to go.

Faculty can direct students in the short term, but over time students' behavior shapes faculty practice.

    2. Students simply don't regard AI use as a serious academic crime, and certainly not one worth turning in each other for.

      Students won't turn in each other for using AI.

    3. This strategy, however, has also failed. In 2023, we concluded after some testing that AI detectors were not effective enough for NYU to license them or vouch for their results.

In 2023, NYU found AI detectors too unreliable to license or vouch for.

    4. "If a professor tells me how to use AI, I'll use it that way, but if they tell me not to use it, I'll just use it and not tell."

Banning AI outright is difficult; students will use it anyway and just not tell.

    5. We can tell students to treat generative AI as if it were a human or organizational author all we want, but it isn't either of those things.

It can be difficult to label AI use as plagiarism, since AI is not a human or organizational author.