6 Matching Annotations
  1. Last 7 days
    1. Two original annotations

      The stakes of the storytelling are what get people to the point of the main story itself.

      1. The change I see in the storytelling is how people explain themselves; when the story gets to a bad part, they stay on that topic.

      2. The main theme is how they feel in the story: what type of emotion triggers that feeling, whether it's sadness or joy.

      3. I find that in the story they change emotion; when they're feeling sad or scared, they go from happy to worried.

      4. In the beginning of the story, they give the main purpose of what's going to happen next; after the introduction, the listener can be either scared or worried for the person telling the story.

      5. At the end, the storyteller can feel nervous about the whole story, or relieved that they got it out in public and don't have to relive the story or tell it again.

      I would pick "What is Implicit Bias?" because it talks about Mexicans coming from their country to America, and border control telling them they're illegal in that country. That's not okay to say, since they're immigrants who left their hometown just to find peace, not to be mocked by the person trying to send them back to where they came from. That means a lot to me and my parents; as an African myself, I've been pushed out for not fitting in with Americans. I know what that feels like.

    1. Why do people think AI is very helpful when it's clearly hurting them? It's just the wrong thing to do. That's why teachers check if students used AI for their assignments; AI can't be trusted anymore these days.

    2. This data, while rich in information, contains both accurate and inaccurate content, as well as societal and cultural biases. Since these models mimic patterns in their training data without discerning truth, they can reproduce any falsehoods or biases present in that data

      They make false reports about AI in general so people can make more money off it; this AI situation is crazy. They should have their own ideas and not take other fake people's ideas. People got those ideas from their own knowledge, and just letting that be dismissed as uneducated is not right.

      Heran Yohannes

    3. Generative AI tools present similar problems. For example, a 2023 analysis of more than 5,000 images created with the generative AI tool Stable Diffusion found that it simultaneously amplifies both gender and racial stereotypes

      This stood out to me mostly because AI is bringing a bad habit to people in and out of the work field: it takes their ideas and makes a nonexistent idea that gets paid for.

    1. For instance, you might set yourself the conscious goal to pay attention only to relevant information such as the content of the CV. You put aside all other distractions like your cellphone so that you can devote your full attention to the interview and you take enough time out of your schedule for making the decision.

      This one stands out to me too because, no matter whether someone is qualified or unqualified, you should always respect people for who they are, not judge them for whether or not they're able to work there; either way, you should thank them for their time and effort.

      By Heran Yohannes

    2. Such an impact of race or gender would be an example of implicit bias. You are influenced in a systematic manner (i.e., you are biased) by elements in your environment (e.g., the skin color of the applicant) even though you did not intend to be influenced and were focusing on other things (i.e., it happened implicitly).

      This stands out to me mostly because I get judged for being Black myself. It's not okay to judge whether someone can work there based on their skin color, whether they're Black or white, when it's not illegal for them to work there. Everyone has that right, as the quote says: "NO MATTER IF YOU'RE BLACK OR WHITE."

      Heran Yohannes