508 Matching Annotations
  1. Oct 2020
    1. Has the user used e-books or not? If the user has used e-books, what did he/she like or not like about the platform or content? How easy was it to find and access e-books in the library's collection? Was the e-book vendor's interface easy to use? How did the user use the e-book (to browse, for coursework, for example)? What is the likelihood that the user will use an e-book in the future?

      Also, given Primo, do they recognize that they are looking at a book and not a website?

    2. They can also provide information on why users do or do not use e-books or why they may prefer the interface or content of one e-book vendor over another.

      I need to get at this in order to cancel Ebsco or Proquest. Why do the librarians prefer the Proquest interface?

    3. libraries appear to favor the latter approach, examining usage statistics for electronic resources overall or on a title-by-title (or product) basis.

      because students frequently cannot say why they use a or b

    1. COUNTER reports are not specific; they may pertain to all titles that the vendor has, all titles that the library owns, or only those titles with uses.

      I don't understand what this means?

    2. Also reported multiple uses of a single title may in fact be a single user navigating to a different part of the book which happens to be stored in different files

      oh that's bad.

    3. COUNTER defines and tracks sections of a book and how they are used

      there's no standard definition of how a book should be split into sections

    4. Limited scope of the COUNTER reports. Reports lack cost-per-use information and use by publisher, subject, and acquisition type (e.g. single purchases, DDA, subscription)

      If reports lack cost-per-use info and use by publisher, subject, acquisition type (one time, DDA, subscription), then what was that report I saw in Alma?

    5. They are: (1) to aid in collection management, (2) ascertain the effectiveness and appropriateness of particular purchases, (3) to ascertain the relevance of the collection criteria, and (4) to provide assessment to the administration and funding agencies.

      Benefits for gathering stats: 1) aid in collection management 2) are purchases good? 3) relevance of collection criteria 4) proof for funders

    6. Nonetheless both studies noted liabilities with the COUNTER system. In addition, Yuan notes the difficulties of comparing usage over time due to the numerous updates of the COUNTER system (Yuan et al., 2018, p. 34)

      This one I got! gold star for me!

    7. the importance of the age of the collection. What percentage of titles were published in the last 5 years? What percentage of downloaded titles are less than 10 years old? What is the cost per title of those titles downloaded within the last 5 years?

      I didn't think of that either, until Corey said it yesterday. I assumed it was up to date
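
      Those age questions reduce to simple percentages once a title list with publication year, cost, and a usage count is in hand; a minimal sketch with made-up file and column names:

        # Age-of-collection questions from the passage, against an assumed title export
        # with PubYear, Cost, and Unique_Title_Requests columns (illustrative names only).
        import pandas as pd

        titles = pd.read_csv("title_list.csv")
        THIS_YEAR = 2020

        recent = titles["PubYear"] >= THIS_YEAR - 5
        downloaded = titles["Unique_Title_Requests"] > 0

        pct_recent = 100 * recent.mean()
        pct_downloads_recent = 100 * (titles.loc[downloaded, "PubYear"] >= THIS_YEAR - 10).mean()
        # One reading of the last question: cost per downloaded title among recently published titles
        recent_downloaded = recent & downloaded
        cost_per_title = titles.loc[recent_downloaded, "Cost"].sum() / recent_downloaded.sum()

        print(f"{pct_recent:.1f}% of titles were published in the last 5 years")
        print(f"{pct_downloads_recent:.1f}% of downloaded titles are less than 10 years old")
        print(f"${cost_per_title:.2f} per downloaded title published in the last 5 years")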

    8. lack of a common identifier across vendor offerings such as the ISBN number; the inability to distinguish between purchase, subscription and DDA unpurchased titles; and the difficulty in tracking titles as they are added or removed from a vendor package (Conyers et al., 2017, p. 27)

      lack of common identifier really? they don't all have ISBNs?

      inability to distinguish between purchase, subscription and DDA unpurchased titles. I didn't think of that.


    1. In addition, the majority of community colleges do not require research during the tenure process. Since research is not required to gain tenure or promotion, there are significantly fewer institutional services and supports for faculty research at community colleges.

      This strikes home. If they'd kept the associate dean of teaching, learning, and assessment, we could have had that support

    2. This last question is often contingent on whether your college has a department of Institutional Effectiveness or an IRB; BC had no IRB when I began designing the assessment, which made the process of determining how to gain consent to conduct human research for this assessment

      snort. talk with Terry and Diana who ARE our IRB

    3. The library is contained on one floor, so we cannot use the easy solution of one quiet and one group study floor; instead, we moved all large tables to the front of the library and called the front half of the library our “hushed-talking zone” where students may talk quietly while studying together. The back half of the library is the “no-talking zone” for completely silent study. To help shift the culture of the library to reflect these two zones, we placed prominent signage at the entrances to each zone, and the librarians regularly walk around the library to check on noise levels. Noise complaints have decreased this semester now that students have a more consistent idea of how to use the library, and students seem much happier now that they have a clear way to tell how they should be using the library.

      Could we do this?

    4. The first was a lack of consistently quiet study space for students in the library and the second was a lack of awareness among faculty of the services they could use at the library.

      we have the same issues

    5. campus’ Assessment Committee as a method for non-academic units on campus to assess their services.

      now this seems like a good idea. Make Curriculum committee into an assessment committee

    6. I added a question to the general student survey related to each of BC’s Institutional Learning Outcomes (ILOs) to help determine whether students perceived that the library had helped them achieve these learning outcomes.

      Arguably more important than actual grades improvement, but something still feels wrong about this

    7. However, when I approached the college about this, the BC Admissions & Records Department cited FERPA concerns and denied my request

      I've been told that doesn't apply within institution. I wonder if this in and of itself is an assessment of the library's situation within the institution

    8. 85% indicated that they had noticed an improvement in their students’ research skills after the orientation.

      nice. But I worry about satisfaction surveys. The library in our head has a lion and a little old lady with round spectacles. We feel good about it, and us in there. Are they assessing the library they're actually in?

    9. The general survey to all BC students shows that 35% of respondents use the library multiple times per week, and over 82% of the respondents indicated that one of their primary reasons for using the library was as a quiet space to do homework,

      will this change after the pandemic? Will the pandemic ever end?

    10. extremely helpful as a tool to prove the sheer volume of BC student presence in the library

      Was that surprising? Was well-attended valued by administration? By the library staff?

    11. However, it is colloquially considered part of campus’ Student Services along with non-academic units such as the Writing Lab and the Tutoring Center.

      Why are these non-academic units?

    12. As a result, our library services do not simply orient them to the differences between an academic library and a high school or public library, but introduce them to the library as a concept.

      interesting. counter to my intuition

    13. Historically, library assessments have focused on library collections rather than library services.3 However, an argument can be made that a more effective form of library assessment focuses instead on student usage of library services and the degree to which student library usage relates to academic success

      This is set up as something radical and fighting the establishment, but I don't think that's been the case for a long time. She didn't have to use this rhetoric to make a case.

    14. Although always integral for service- oriented professions, assessment often serves a unique purpose for academic libraries in that it can also show how the library and its services impact student success and the college as a whole

      is this saying we're a service-oriented profession, or we're not? What do they mean by unique?

    15. The results of this study were quite positive, although gaps in service were identified and addressed post-assessment. The library plans to continue assessing its services every three years with this model of assessment.

      Hmmmm. Already grouchy

    16. Library assessment is a well-established necessity for proving a library’s worth to administration.

      I no longer believe this. Administrations make decisions based on feelings about worth and value. They use data to justify their feelings.


    1. Log Analysis: Print book collections may be assessed by analyzing transaction logs stored in the library's online catalog, using a variety of searches, including subject keyword, title and author. A log analysis can also be extremely helpful to identify research trends at the institution or by discipline.

      Just print? It seems like this could create a good picture of how people are using the collection as a whole? Can you filter for searches coming from library computers, classroom computers, non-school computers/phones?
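
      If the catalog or discovery layer can export its search logs, the kind of analysis described above (and the origin filtering the note asks about) might look roughly like this; the file name, column names, and on-campus IP prefix are all assumptions:

        # Minimal transaction-log sketch: top subject-keyword searches, plus a split by
        # search origin if the export happens to include client IP addresses.
        import pandas as pd

        logs = pd.read_csv("catalog_search_log.csv", parse_dates=["timestamp"])
        # assumed columns: timestamp, search_type, query, client_ip

        subject_searches = logs[logs["search_type"] == "subject_keyword"]
        print(subject_searches["query"].str.lower().value_counts().head(20))

        # Hypothetical on-campus address range vs. everything else
        on_campus = logs["client_ip"].astype(str).str.startswith("10.1.")
        print(logs.groupby(on_campus)["search_type"].value_counts())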

    2. These include subject bibliographies developed by another library, publisher's comprehensive offerings, book dealer catalogs, lists compiled by professional associations, lists of recently acquired books, and recommended reading lists, to name just a few. Verification studies use list-checking as a means for a group of libraries to compare their collections with a prepared list of titles, which in most cases, is a compilation of the most important works in a specific discipline

      are there standard lists for community colleges?

    1. base URL, which contains the address of the user's institutional link-server, followed by a query string, consisting of key-value pairs serializing a ContextObject. The ContextObject is most often bibliographic data, but as of version 1.0 OpenURL can also include information about the requester, the resource containing the hyperlink, the type of service required, and so forth. For example:

      there's a ton of info wrapped into this thing. how is it permanent?
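
      To make the structure concrete, here is a hypothetical OpenURL assembled the way the passage describes, a resolver base URL followed by key-value pairs serializing the ContextObject (the resolver host and the citation values are invented for illustration):

        # Build an illustrative OpenURL 1.0 query string from ContextObject key-value pairs.
        from urllib.parse import urlencode

        base_url = "https://resolver.example.edu/openurl"   # hypothetical institutional link-server
        context_object = {
            "ctx_ver": "Z39.88-2004",                        # OpenURL 1.0 context version
            "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",   # metadata format: journal article
            "rft.genre": "article",
            "rft.atitle": "Collections Assessment: Developing Sustainable Programs and Projects",
            "rft.jtitle": "The Serials Librarian",
            "rft.volume": "74",
            "rft.spage": "19",
        }
        print(base_url + "?" + urlencode(context_object))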

    2. is often a bibliographic citation or bibliographic record in a database. Examples of these databases include Ovid Technologies,

      so it can apply to gated resources

    3. An OpenURL is similar to a web address, but instead of referring to a physical website, it refers to an article, book, patent, or other resource within a website.

      How does it know whether something is an article, etc. on a website? Is the open part the url itself or does it only point to open resources?

    1. Marxist analysis of OER in higher education cautions against administrators’ attempts to exploit OER for surplus value in the form of increased enrollment, lower teaching costs, and cultural prestige

      mother fucking exactly

    2. qualitative approaches used in OER perception studies could be incorporated more often to center students’ voices. Action research is another approach. According to Sagor, action research, “is a disciplined process of inquiry conducted by and for those taking the action. The primary reason for engaging in action research is to assist the ‘actor’ in improving and/or refining his or her actions” (Sagor, 2000). Action research on OER initiatives would be a welcome addition to the literature, as the method aligns nicely with critical pedagogy.

      What we were taught to do in the 90s

    3. Unlike most OER studies, Bowen, et al. also tested whether or not the OER/hybrid method can lower instructor costs. In their model, the hybrid course would be supervised by tenure-track faculty, with in-person sections led by “teaching assistants” and administrative work handled by a “part-time instructor” (Bowen, et al., 2012, p. 25).

      faculty work parceled out so that consultants do the most creative part and then everyone else gets raisins, both from the creative work and the money

    4. Though the authors do not explain why professional development funds were so popular, the implication is that faculty relished the opportunity to share their work and learn from others in a community of practice.

      Faculty just want to live in the world of scholarship and brotherhood and don't care about rent or food. I don't buy it.

    5. One shortcoming of their report is it does not include any information about how Lumen Learning was involved in KOCI, especially with regard to MyOpenMath (MOM), a free, online course management system. It would be helpful to know if KOCI used the free version of MOM or the Lumen-supported version, Lumen OHM. Each option presents distinct cost and maintenance issues, namely vendor fees versus local maintenance expenses.

      how is this connected with faculty labor?

    6. The authors claim this was an even greater motivator than the stipend and they make explicit recommendations for other OER initiatives to allocate funds for conference attendance

      Really? maybe those faculty were well paid already

    7. This is understandable insofar as the focus of most studies is cost savings and student outcomes.

      Unless faculty are seen as workers rather than priests or buskers, their labor will always be seen as volunteer or in service of, and not needing pay

    8. academic freedom and of “corporate interference” since KOCI used Lumen Learning and received funding from the Bill and Melinda Gates Foundation and the Hewlett Foundation (

      This is where I wonder. There's so much about academic freedom that I revere, but I can't see the implications. Sort of I do, in the contract stuff recently: the school is making decisions about what to pay faculty based on what they use to teach

    9. found that students’ problem-solving abilities were slightly negatively impacted by the new homework system. The previous commercial system provided hints and tutorials as students completed their homework, whereas the new system simply provided correct/incorrect feedback.

      the commercial system better supported student learning? they can add more because their whole job is to create these materials. faculty are also teaching.

    10. The results are promising as the percentage of students receiving grades A, A-, and B+ in OER test courses increased dramatically for all three populations (Colvard, et al., 2018, pp. 269-271).

      But why? because in the other classes students weren't buying the books?

    11. They argue that shifting the cost burden away from taxpayers and onto students exacerbates ethnic and racial disparities in educational attainment.

      I'm beginning to feel that classroom copies of textbooks, purchased by the institution are the only option

    12. FWK and VSU students and faculty may have divergent ideas of what’s a fair price for a textbook. At the time of this writing, the FWK website lists most e-textbooks between $25-$30 and most print copies (ebook included) list for $55. This price is much lower than many commercial alternatives, but it is a lot more than free.

      there is an emotional not-evidence-based valuing going on. Students choose to spend money on other things that I might think is useless, like beer or clothes or drugs, but not textbooks. I would LOVE to see one of those behavioral economists take a look at this

    13. Such opacity is not helpful. For OER to flourish, it is important to name the resources being replaced, and their cost. Readers, especially those considering adopting OER, deserve to know these details to help them make informed decisions at their own institutions.

      Why? were they afraid of retaliation by publishers?

    14. I am not primarily concerned with how critical pedagogy is used in specific OER textbooks or learning materials. The below studies do not provide such granular detail. Instead, I am analyzing these studies for evidence, or lack thereof, of critical approaches to OER adoption and survey design as it relates to cost and access, pedagogy, and academic labor.

      Not looking at how they're using critical pedagogy, but whether people are adopting OER with attention to:

      cost and access, pedagogy, academic labor

    15. Often, student outcomes are similar across the test and control groups, though some studies present a case for correlation between cost and access and improved student outcomes.

      Outcomes are grades, and they are similar with OER or commercial, but are they good?

    16. social justice issues facing higher ed: cost and access, pedagogical practice, and academic labor

      defined as: cost and access, pedagogical practice, academic labor

    17. Eliminating expensive textbooks is a first step toward confronting the contradictions students and faculty face in higher ed

      It needs an equal and opposite: reading is personal power and freedom and choosing what to read is personal power and freedom

    18. If students cannot afford a textbook, they are already oppressed.

      the only power they can experience is rejecting the textbook and reading as a way to get learning

    19. how openness, when disconnected from its political underpinnings, could become as exploitative as the traditional system it had replaced”

      totally agree. It just seems like getting faculty to work for free

    20. For hooks, critical pedagogy means “creating [a] democratic setting where everyone feels a responsibility to contribute” (1994, p. 39). This practice requires a desire to transgress, to empower the oppressed through critical pedagogy: students of color, queer students, poor students.

      a responsibility to contribute asks that the learning has meaning for everyone in the room. Learning outcomes can't generate that meaning and responsibility

    21. Teachers minimize their authoritative role through a reconciliation of the teacher-student contradiction, “so that both are simultaneously teachers and students”

      this resonates with me so much

    22. Paulo Freire’s Pedagogy of the Oppressed (1968) and bell hooks’s Teaching to Transgress: Education as the Practice of Freedom (1994)

      I really need to read bell hooks

    23. acknowledges that high priced textbooks are a barrier to learning because many students do not purchase expensive textbooks

      I suspect students also make value judgments--this book is too expensive because that number on the price tag is too high, rather than my financial aid doesn't cover it.

    24. He added retain as the fifth R in 2014. As a practice, creators of any work should retain certain rights.

      I didn't know this. This is fascinating. Does SBCTC know this? Are they teaching this in the course?

    25. Lumen Learning is a company that provides a suite of educational technology products that colleges and universities pay to use; Lumen’s Candela, Waymaker and OHM provide the infrastructure for many instructors teaching with OER. While their products are often less expensive than commercial textbooks and platforms, some argue their business model betrays the ethos of open access initiatives

      Low cost, not free

    26. non-commercial learning materials. In 2012, UNESCO refined their definition to include “any type of educational materials that are in the public domain or introduced with an open license” (UNESCO, 2012)

      definitions shifted to emphasis on open license

    27. Measurables like student outcomes, while important, are too often foregrounded to appeal to administrators and funding organizations.

      are they important though? They feel so anti-learning

    28. Typically, ample attention is paid to a study’s design and methodology but the underlying institutional infrastructure and decision-making process is unexamined.

      good catch!

    1. Data can be loosely grouped into four categories: collections data, usage data, user information, and citation data.

      We aren't actually doing this are we?

      collections, usage, user info, citation data -- this one?

    2. Does the assessment goal help better align the collection with the institutional mission? Similarly, stakeholders—from users to colleagues to administrators—must be considered. Who is the assessment for? Who will have to implement any decisions that are made? Who will need to be convinced to take action?

      I really want to see a story about our collection.

    3. Instead, collections assessment should start with current institutional goals, the problem that needs to be solved, and an answer to the question why?

      should institutional goals be the library or the college? or the general direction of the faculty?

      I would really like to do interviews with faculty about how they use the library and why they don't.

      My sense from Neal and Jeannette is that using our collection is too cumbersome.

    4. Collections can be broken down to look at categories such as format, subject, or user. The holistic approach, on the other hand, incorporates elements of all formats and users, and generally uses mixed methods, including quantitative and qualitative metrics. Holistic assessment can provide more well-rounded results than narrower ad hoc approaches, since it attempts to build a high-level, overall view of the collection.

      Holistic approach seems more my personality, but also seems like more avenues to variation and variability.

    5. Ad hoc assessment, on the other hand, involves conducting one-off assessment projects based on specific, finite needs.

      One-off assessment projects for specific needs. This is what I've been thinking I would do, but we really need the other thing. For the class, maybe I should design the systematic project.

    6. Systematic assessment is helpful for developing a culture of assessment, making strategic decisions, and understanding large or rapidly growing collections (particularly in an environment with high staff turnover).

      This is helpful for developing a culture of assessment, and is helpful in an environment with high staff turnover. Sounds good for us.

    7. collections assessment, but generally they fall into one of two categories: systematic or ad hoc.

      how does this map to formative/summative, collection-based/user-based?

    8. There is also the unknowable nature of how collections are used

      unknowable nature. It would be cool to ask how reading stuff from the collection changed a student experienced writing their research paper

    9. Outcomes, by contrast, are harder to quantify—and include student learning, faculty enrichment, research support, and so on. Historically, assessment has been very outputs-based, in part because outputs are easy to measure, but also because (particularly in the all-print world of the recent past) the size, scope, and use of collections were how academic institutions valued their libraries. More recently the focus of institutions and their funding bodies has shifted toward student and user outcomes, refocusing libraries' assessment efforts on an outcomes-based analysis. So an institution spent $60,000 on e-books—what did this actually do for the success of its users? Did it help them get a job, graduate more quickly, or complete a research study?

      meaning is now demanded

    10. Outputs are tangible, countable, concrete measures (such as number of circulations or ...)

      Outputs v. outcomes. The outcome of our collection is that faculty use it to create rich learning opportunities for students. Don't think that's happening.


    1. FRBR

      What's FRBR? Functional Requirements for Bibliographic Records (FRBR) is a 1998 recommendation of the International Federation of Library Associations and Institutions (IFLA) to restructure catalog databases to reflect the conceptual structure of information resources.

    1. Some of these sources (e.g., Alma’s Community Zone) may not contain ebook subject or subject heading information. My library uses OCLC’s free tool, Classify, to bring in LC class numbers and derive subject information.

      I don't really understand what she's saying. If I use the ebook platform, don't they have subjects? Is she saying that I would have to use Alma to compare the different ebooks and Alma community zone doesn't have subject headings?
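
      However the LC class number is obtained, the "derive subject information" step can be as simple as mapping the leading class letter to a broad subject; a tiny sketch, with only a handful of LC classes filled in:

        # Map an LC call number to a broad subject via its first class letter
        # (only a small, illustrative subset of the LC outline is included).
        import re

        LC_BROAD_SUBJECTS = {
            "B": "Philosophy, Psychology, Religion",
            "H": "Social Sciences",
            "P": "Language and Literature",
            "Q": "Science",
            "R": "Medicine",
            "T": "Technology",
        }

        def broad_subject(call_number: str) -> str:
            match = re.match(r"\s*([A-Z])", call_number.upper())
            return LC_BROAD_SUBJECTS.get(match.group(1), "Unmapped") if match else "Unmapped"

        print(broad_subject("QA76.73 .P98"))   # Science
        print(broad_subject("PS3545 .I345"))   # Language and Literature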

    2. I am not sure how to compare print circulations to ebook usage statistics in Alma Analytics.

      I could ask WACTLC or Wade. I should search in WACTLC emails first. I should read WACTCLC every day.

    3. One thing often discussed when considering usage of print books versus e-books is whether ebook "circulations" are equivalent or comparable to print book checkouts. Each metric characterizes uses in different ways. For example, an ebook download or full-text access might be registered for each chapter or for an entire book. Depending on your source of usage statistics, it might be difficult to determine whether those uses were by the same person or a different person. How might these multiple uses compare to a single print book checkout for a period of several days?

      ebook chapter requests and investigations might not be comparable to a book check out. You can't tell if multiple chapters are the same person or different people.
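
      One cautious way to handle that mismatch is to report the ebook metrics side by side with print checkouts per title instead of forcing a single equivalence. In COUNTER 5 terms, Total_Item_Requests counts section- or chapter-level uses while Unique_Title_Requests is closer to a checkout-like count; a sketch with assumed file and column names:

        # Put ebook usage metrics next to print checkouts per title rather than converting one into the other.
        import pandas as pd

        ebook = pd.read_csv("counter_book_report.csv")   # assumed: ISBN, Total_Item_Requests, Unique_Title_Requests
        print_circ = pd.read_csv("print_checkouts.csv")  # assumed: ISBN, Checkouts

        combined = ebook.merge(print_circ, on="ISBN", how="outer").fillna(0)
        print(combined.sort_values("Unique_Title_Requests", ascending=False).head(10))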

    1. Madeline M. Kelly and Stephanie S. Smith. "Assessing Collections Holistically: A Behind-the-Scenes Approach" in Assessment Strategies in Technical Services, eds. Kimberley A. Edwards and Michelle Leonard. Chicago: ALA Editions, 2019: 25-63.
       2. Peggy Johnson. Fundamentals of Collection Development and Management. Chicago: ALA Editions, 2018: 288.
       3. Joseph R. Matthews. The Evaluation and Measurement of Library Services. Santa Barbara, CA: Libraries Unlimited, 2018.
       4. Amanda Waugh and Mega Subramaniam. "Interview and Focus Group Research" in Research Methods for Librarians and Educators, eds. Ruth V. Small and Marcia A. Mardis. Santa Barbara, CA: Libraries Unlimited, 2018: 37-49.
       5. Selena Killick and Frankie Wilson. Putting Library Assessment Data to Work. London: Facet Publishing, 2019: 31-81.
       6. Lynn S. Connaway and Marie L. Radford. Research Methods in Library and Information Science. Santa Barbara, CA: Libraries Unlimited, 2017: 239-249.
       7. Johnson, Fundamentals, 304.
       7. Rebecca Teasdale. "Survey Research" in Research Methods for Librarians and Educators, eds. Ruth V. Small and Marcia A. Mardis. Santa Barbara, CA: Libraries Unlimited, 2018: 224.
       8. Johnson, Fundamentals, 294.
       9. Matthews, Evaluation and Measurement, 211.
       10. Karen C. Kohn. Collection Evaluation in Academic Libraries. Lanham, MD: Rowman & Littlefield, 2015.
       11. Johnson, Fundamentals.
       12. Madeline Kelly and Genya O'Gara. "Collections Assessment: Developing Sustainable Programs and Projects." The Serials Librarian 74, no. 1-4 (2018): 19-29.
       Further Reading:
       Mike Allen. The SAGE Encyclopedia of Communication Research Methods. 4 vols. Thousand Oaks, CA: SAGE Publications, 2017.
       Li Chen and Jason Penwell. "Using Alma Analytics to Facilitate the Library Weeding Project." (2019).
       Mitchell Dunkley. "Friendly Guides to COUNTER." (2019).
       Anna Dysart. "Aims and Approaches in Special Collections Assessment." RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 16, no. 2 (2015): 101-112.
       Tabitha Farney. Using Digital Analytics for Smart Assessment. Chicago: ALA Editions, 2018.
       Tony Greiner and Bob Cooper. Analyzing Library Collection Use with Excel. Chicago: ALA Editions, 2007.
       Peter Hernon, Ellen Altman, and Robert E. Dugan. "Listening To Customers Through Surveys" in Assessing Service Quality: Satisfying the Expectations of Library Customers. Chicago: ALA Editions, 2015: 101-116.
       Holly Hibner and Mary Kelly. Making a Collection Count. Oxford: Chandos Publishing, 2013.
       Marcia A. Mardis. "Evaluation of the Collection" in The Collection Program in Schools: Concepts and Practices. Santa Barbara, CA: Libraries Unlimited, 2016: 169-186.
       Amalia Monroe-Gulick, Lea Currie and Corinne Forstot-Burke. "WorldShare Collection Evaluation: A Case Study." Technical Services Quarterly 36, no. 1 (2019): 1-17.
       Oliver Pesch. "Implementing SUSHI and COUNTER: A Primer for Librarians." Serials Librarian 69, no. 2 (2015): 107-125.

      a lot to read

    2. The Research Library Group (RLG) developed a methodology to describe library collections using codes that describe the level at which materials are collected, or should be collected.

      Would this be helpful in a community college setting?

    3. In the case of electronic resources, libraries may even be able to see how long a user has accessed content or where they were on the library website before going to a database.

      really? that would be really interesting!

    4. Some key issues to remember with user-centered data are that the period during which data are gathered, as well as the timing of performing the assessment are critical (Johnson 2018). Also, when employing user-centered assessment techniques, in many cases, it is not possible to use a large enough sample to generalize the results (Teasdale 2018, 224). If you are in an academic library, it may also be required to obtain permission from the Institutional Review Board (IRB) to conduct surveys or focus groups and make sure participants sign consent agreements.
      1. period during which data are gathered and time frame are critical WHY?
      2. User-centered assessment techniques aren't generalizable.
      3. Do we have IRB requirements?
    5. the person or group conducting the survey should carefully research survey design methods and craft effective questions to obtain meaningful results.

      This has been our problem so far. We just threw ourselves into it--put on a show! style.

    6. USE- OR USER-BASED vs. COLLECTION-BASED
       Quantitative (use- or user-based): ILL statistics; Circulation statistics; In-house use statistics; Document delivery statistics; ILL statistics; E-resources use statistics
       Quantitative (collection-based): Collection size and growth; Materials budget size and growth; Collection size standards and formulas; Expenditures by subject; Comparison ratios; Content overlap analysis
       Qualitative (use- or user-based): User opinion surveys (e.g. LibQual+, web-based, email); User observation; Focus groups; Interviews
       Qualitative (collection-based): List checking (e.g., catalogs, bibliographies); Verification studies; Citation analysis; Direct collection checking; Collection mapping (assigning conspectus levels); Commercial products (e.g., OCLC, Bowker, Ulrich's)

      Use or user based: e-resources use stats, user opinion surveys, user observation, focus groups, interviews.

      Collection based: collection size and growth, materials budget size and growth, collection size standards and formulas, content overlap analysis (what are comparison ratios?), list checking (what are verification studies?), direct collection checking, collection mapping, commercial products?

    7. It is worth stating that there may be differences between librarians as to whether a particular technique is user-based or collection-based. For example, citation studies can be either depending on whether the library is seeking to understand what the user is doing or making a judgment about the quality of materials in the collection.

      User-based or collection-based can be the same technique, but depending on purpose of study

    1. Intensity Gen. Ed. outcome - % all sub-outcomes assessed at least once

      The 5 IL sub-outcomes are: Define and articulate a need for information; Locate, access and use information from a variety of sources; Identify the basic principles of how information is produced, stored, organized, transmitted and accessed; Critically evaluate information and its sources; Use information, considering the economic, legal, ethical and social issues surrounding its access and use.

      I think I'm beginning to understand something. I always thought of those 5 "suboutcomes" as headings and I thought people would choose from what may be sub sub outcomes. THIS IS THE PROBLEM
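
      A small sketch of how that intensity figure could be computed, given a hypothetical record of which of the five sub-outcomes each sampled course section actually assessed:

        # Percent of IL sub-outcomes assessed at least once across the sampled sections.
        IL_SUB_OUTCOMES = {
            "define need", "locate and access", "how information is produced",
            "critically evaluate", "ethical and legal use",
        }

        # Hypothetical assessment records (section -> sub-outcomes it assessed)
        assessed_by_section = {
            "ENGL 101-01": {"define need", "critically evaluate"},
            "BIOL 160-02": {"locate and access"},
            "CMST 210-01": {"critically evaluate", "ethical and legal use"},
        }

        assessed_at_least_once = set().union(*assessed_by_section.values())
        pct = 100 * len(assessed_at_least_once & IL_SUB_OUTCOMES) / len(IL_SUB_OUTCOMES)
        print(f"{pct:.0f}% of IL sub-outcomes assessed at least once")  # 80%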

    2. assessed the intensity of

      OMG, we are reaping the wages of sin! People pack gen ed outcomes into their courses to look good in curriculum committee and then don't teach them! Wait, am I being sensationalist or is this true?

    3. It may be that prior IL2 classes has low explanatory power because the sampled students have taken so few prior IL2 classes. By comparison, for the Global Awareness (GA) General Education Outcome assessed in 2016, sampled students had comparatively more prior GA classes, and prior classes were found to have a significant impact on score. Even in that case, it took 20 or more prior classes to move the average score from 5.2 to 6.

      20 or more! WHAT?

    4. Of the students sampled, we would expect those taking three or more IL intense classes to score average ratings above those taking fewer than three IL classes only about half the time. In other words, there is no effect at the three-course intensity threshold.

      I would love to have a discussion with teachers in IL intense classes on this finding, esp. part time teachers. Do they even know we have gen ed outcomes?

      Also, ugh. What is going on?

    5. We were unable to find a significant relationship between previous course work and Information Literacy scores in this assessment.

      Possibilities: previous course work doesn't teach the info lit gen ed outcome. previous course work doesn't teach the info lit gen ed outcome WELL.

      I wonder which instructors would raise their hands if we asked who teaches an info lit heavy course?

    6. first worked together to norm a sample of 8 o

      I remember I struggled with every part of the norming. My standards were different from others and I felt pressure to give in and conform. I hated it.

    7. the form of essays, research proposals, bibliographies, poster projects, and written notes for informative speeches. A total of 87 samples were collected and de-identified before being rated by five assessors on meeting two information literacy sub-outcomes.

      A variety of artefacts. 87 samples. What does de-identified mean?

    8. Information literacy is low in the students sampled, and we did not find that prior coursework is related to scores on the assessment.

      This gen ed outcome is not being accomplished. Why? Does it matter?


    1. A few of these disadvantages can be mitigated with thoughtful survey design, but they will likely remain inherent weaknesses of this technique.

      exposes and explores a weakness in the study. I kind of wish they would have said why they are going to do it anyway

    2. Follow-up phone interviews with regional survey respondents might also be a useful tool to consider.

      explores possibilities rather than saying for sure they'll do x

    3. Value of the library's physical journal collection to the larger health sciences community, however, is the more compelling question. ILL lending statistics can provide insight into the immediate value of the collection, but the larger question being asked pertains to its long-term value: what role might TML's physical journal collection have in serving as a research library of record?

      assessing the VALUE of the collection to a particular community

    4. analysis of document delivery statistics is the most direct methodology for gaining insight on the physical journal collection’s value to the institution

      Do I really know what the difference between document delivery and ILL is?

    5. TML's journals comprise over 60% of the physical collection's linear footage within the building. With the anticipated library move in the next several years, it will be important to identify the importance of TML's print journals to VCU's health sciences community as well as the national and regional health sciences research community, and then plan a weeding project accordingly.

      thank you for the purpose clearly stated in the first sentence

    6. In five to ten years, TML is slated to move into a new building that will require the majority of its physical collection to be stored elsewhere. VCU Libraries is in the process of securing a permanent storage space to accommodate TML’s impending move, as well as the critical space needs of Cabell Library, which is on the general academic campus. Items from both libraries are currently stored in Cabell Library’s basement and in a temporary storage facility

      the problem

    1. While we depend mostly on the circulation statistics, they don't provide the whole story. What happens if the statistics are low, but that isn't from lack of demand? It could be from lack of available materials. We need to know what the users actually want – and from their own voices. In the past (during the time we were creating the next strategic plan), we had used both user surveys and focus groups successfully. While that was more services-based needs, I believe we can have the same success (or more!) using them for collection-based needs

      describes the thinking; didactic

    2. User-based and collection-based approaches and a mix of both quantitative and qualitative tools. User-based approaches will include circulation and in-house statistics (quantitative) in addition to user opinion surveys, focus groups, and user observation (qualitative).

      All the stuff: user and collection-based; quantitative and qualitative. circ and in-house, opinion, focus groups, user observation.

    3. The library management team will then use the findings to help justify any budget proceedings needed regarding the children’s collection in the future.

      to get money from funders

    1. The estimate should include how much free access is allowed by each vendor, if any, and what restrictions there are during such a trial period. "Opening day" lists from the library's wholesaler would be helpful and would likely be free.

      What is this?

    2. Both use/user-based and collection-based approaches will be employed, with measures being mostly quantitative (usage statistics) with two qualitative ones (survey of local residents and checking of collection against core lists).

      summative and formative. qualitative and quantitative. qualitative uses two methods: survey and core lists

    3. measuring the use of the electronic and print collections, analyzing user satisfaction with the collection, placing the results in the context of the 2017 strategic plan, and making recommendations about budget allocations and changes to

      Dang! It's an assessment of everything, with strategic plan and budget, and collection dev. practice and policy! Ambitious.

    1. Methodology

      Uses both collection centered and user centered, with emphasis on user-centered.

      Describes why she won't use certain types of analysis. I like that!

    2. The teamCollections Management Co-ordinator (me)oI oversee the projects transferring the materials into storage and have a detailed knowledge of the workflows and materials involved.oI have access to the Voyager database and experience analyzing circulation & collection data.Collections Management & Planning LibrarianoSupervises my work, has extensive knowledge of the Voyager database and access tothe HK database (HK is the software that runs the ASRS) and In-House Use data.Subject librarian for one of the involved disciplinesoIdeally our Reference Librarian for biology subject areas, who was involved with the ASRS transfers from her collection.Other parties:oThe Assessment Librarian may be involved consultatively to provide existing assessment data and help connect this specific project with his work.oWe may need support from LSIT (Library Systems and Information Technology) in working with the databases.

      describes each person participating and the skills/knowledge they have that will be contributed to project

    3. can be carried out within current staff time and by using existing resources. Approval will be needed from the supervisors, department heads, and the Associate University Librarians for Collections Management and Client Services and Programs to allocate the staff time required. I expect to spend 4-6 hours per week on this project for approximately 8 weeks, and would not expect my teammates to

      Budget describes time per week. I really like that!

    4. This project will focus on materials in multiple subject areas, under the divisions of Science & Engineering and Biomedical Sciences. Over 200,000 items in these collections have recently been transferred from open stacks in two large library branches into the ASRS storage system. The criteria for transfer included pre-2000 publication dates and low circulation statistics, generally thresholds of either three or five circulations since 2004 (which is when we began using the Voyager ILS). The majority of materials transferred had zero circulations.

      describes the items that will be examined

    5. Results will be reported to the individuals mentioned as well as Library Administration and the Library staff

      She's going to tell the higher ups, but she doesn't say what she thinks should be done as a result of the project.

    6. large academic library with 19 branches and divisions, and over 6.4 million items. UBC Library serves undergraduate and graduate students, staff, faculty, and community members. The Library has nearly four million annual visits and nearly seven million visits to the Library website

      type of library, size, number of items, who it serves, visits to the website, strategic directions, context of the project


    1. This assessment would provide library administration and professional librarians with evidence to support decision making. In addition, faculty members in the Faculty of Arts, as well as members of the Humanities and Social Sciences Library Advisory Board, and the Senate Committee on Libraries are the intended audience of the report resulting from this project

      Looks like the librarian already has an agenda

    2. The print collection of the Humanities and Social Sciences Library, the largest of McGill's branch libraries, is presently at working capacity. At the same time, the growing student population is in need of additional quiet study and collaborative work spaces.

      quiet study and collaborative spaces

    3. Environment

      Headings include: environment, purpose, audience, scope of the project, timeline, who will be involved, methodology used and why, presentation of results,


    1. Borin, Jacqueline and Hua Yi (2011). Assessing an academic library collection through capacity and usage indicators: testing a multi-dimensional model. Collection Building, 30 (3): 120-125.
       Bucknell, Terry (2012). Garbage In, Gospel Out: twelve reasons why librarians should not accept cost-per-download figures at face value. The Serials Librarian, 63 (2): 192-212.
       Coombs, Karen A. (2005). Lessons learned from analyzing library database usage data. Library Hi Tech, 23 (4), 598-609.
       Tucker, Cory (2009). Benchmarking usage statistics in collection management decisions for serials. Journal of Electronic Resources Librarianship, 21: 48-6

      should I read these? Especially garbage in, gospel out and benchmarking usage stats!

    2. Once this project is complete, the business librarian will utilize the data to conduct a qualitative assessment by interviewing faculty and conducting focus groups with students in the business college

      sets up a qualitative study to follow quantitative study

    3. user-centered quantitative approach with an analysis of 2011 and 2012 use statistics for e-resources

      say the style of assessment with the exact thing you're going to do
