44 Matching Annotations
  1. Jan 2020
    1. David Chalmers: Yeah, but I also think there are these sociological effects where most people think… we got this on the PhilPapers survey that most people think that most people think a certain thing, even though most people think the opposite.

      Inconsistency between first- and second-order beliefs among philosophers: the survey includes many cases where people think everyone thinks A when actually most people think B.

  2. Nov 2019
  3. Sep 2019
    1. Estimating the Effect of Asking About Citizenship on the U.S. Census March 21, 2019, 1:21 pm

      This is a really interesting article in so many ways; it speaks to a larger political issue of our time, it uses an innovative method (an experiment!), and it follows a very generic and general structure of a social science research paper. Think of this as an ideal or prototype of social science research.

  4. Aug 2019
    1. How do you feel about writing in books?

      We welcome you to share your own answers to this question! (Opens in new window.)

      Below are participant responses to the survey questions.

      Click the "more" button at the bottom right corner of this annotation to see additional survey responses. You can also [view the results in a larger page] (https://wisc.pb.unizin.org/wiwgrangerized/back-matter/poll-results/). (Opens in new window)

      If you see error messages in the frames below, try opening this text in another browser.

      <iframe src="https://wisc.pb.unizin.org/wiwgrangerized/wp-admin/admin-ajax.php?action=h5p_embed&id=18" width="958" height="790" frameborder="0" allowfullscreen="allowfullscreen"></iframe><script src="https://wisc.pb.unizin.org/app/plugins/h5p/h5p-php-library/js/h5p-resizer.js" charset="UTF-8"></script>

      <iframe src="https://wisc.pb.unizin.org/wiwgrangerized/wp-admin/admin-ajax.php?action=h5p_embed&id=19" width="958" height="790" frameborder="0" allowfullscreen="allowfullscreen"></iframe><script src="https://wisc.pb.unizin.org/app/plugins/h5p/h5p-php-library/js/h5p-resizer.js" charset="UTF-8"></script>

      <iframe src="https://wisc.pb.unizin.org/wiwgrangerized/wp-admin/admin-ajax.php?action=h5p_embed&id=20" width="958" height="424" frameborder="0" allowfullscreen="allowfullscreen"></iframe><script src="https://wisc.pb.unizin.org/app/plugins/h5p/h5p-php-library/js/h5p-resizer.js" charset="UTF-8"></script>

      <iframe src="https://wisc.pb.unizin.org/wiwgrangerized/wp-admin/admin-ajax.php?action=h5p_embed&id=17" width="958" height="790" frameborder="0" allowfullscreen="allowfullscreen"></iframe><script src="https://wisc.pb.unizin.org/app/plugins/h5p/h5p-php-library/js/h5p-resizer.js" charset="UTF-8"></script>

  5. Jun 2019
    1. obstacles and challenges that were preventing faculty from adopting OERs in their courses—primarily a lack of time, lack of availability of relevant OERs, and a lack of ancillary materials to accompany existing open texts (test banks, PowerPoint files, and other materials often provided by commercial publishers).

      Survey these objections? Is there a way to put $$ to them?

    1. How many students are currently impacted by OERs at your institution? How many faculty are currently using OERs at your institution?

      Two good questions to add to surveys.

  6. May 2019
    1. general history journals, or in books or digital forums

      My beef is more with historians who don't even know they're doing it, and do things like put coded markers to interpretive structures into narratives in textbooks. Undergrads from other majors taking survey courses, who will never read historiography, miss these markers and don't realize they're reading a story told through a particular lens.

  7. Apr 2019
    1. statistically significant difference between the two groups in terms of the number of credits they took

      Validates the other survey claim that students facing expensive textbooks tend to take fewer credits.

    2. no significant differences between the overall results

      UCD study finds no statistical difference btw textbooks and OER. Baseline result suggesting no negative impact.

    3. what might be most notable about the OER adoption was its use as a catalyst for deeper pedagogical change and professional growth.

      Another reason for admin. to support shift toward OER.

    4. students that had been taught by the same teacher

      Even when taught by same instructor!

    5. only 7 % of that group were ‘very familiar’ with open access textbooks, while 52 % were ‘not at all familiar’ with open access textbooks

      Check this against our faculty.

  8. Mar 2019
  9. Feb 2019
    1. Due to our emotional distress measure having little prior validation, and our physical distress measure being entirely new, we first provide data to support the appropriateness of the two measures.

      An example of survey validation using Cronbach's alpha.
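For context on the technique the note mentions: Cronbach's alpha estimates the internal consistency of a multi-item survey scale as k/(k−1) · (1 − Σ item variances / variance of respondent totals). A minimal sketch, using made-up Likert responses (not data from the cited study):

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per survey item, all covering
    the same respondents in the same order."""
    k = len(items)
    respondents = list(zip(*items))             # rows = respondents
    totals = [sum(row) for row in respondents]  # each respondent's scale total
    item_var_sum = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Three hypothetical 5-point Likert items answered by five respondents:
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))  # → 0.89
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the kind of evidence the quoted study offers for its distress measures.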

  10. Nov 2018
    1. SurveyMonkey

      SurveyMonkey is a free survey platform for collecting responses from targeted individuals and turning them into reports and quantified results. Surveys can be delivered via email, mobile, chat, web, and social media. The platform is easy to use and works as an add-on for large CRMs such as Salesforce. It offers over 100 templates, plus the ability to develop customized templates to suit your needs. www.surveymonkey.com

      RATING: 5/5 (ratings use a 1-to-5 scale, 1 = lowest, 5 = highest, in terms of content, veracity, ease of use, etc.)

  11. Oct 2018
  12. www.projectinfolit.org
    1. telephone interviews with 37 participants

      I have to wonder at telephone samples of this age group given the propensity of youth to not communicate via voice phone.

  13. Sep 2018
  14. Feb 2018
  15. Sep 2017
  16. Feb 2017
  17. Apr 2016
  18. Feb 2016
    1. Zoomerang

      zoomerang survey software option

    2. While the display is appealing and easy to read, it is not customizable

      Polldaddy: survey software selection. List of cons.

    3. Polldaddy for iOS was a great option for this type of assessment. The layout and design are optimized for the iPad’s screen, and survey results are loaded offline. Because an Internet connection is not required to administer a survey, the researcher has more flexibility in location of survey administration.

      Polldaddy did not require wireless internet access, making it a more flexible survey-software option

    4. Polldaddy software chosen for second iteration of survey offered at GSU for assessment.

    5. Google Forms

      Chosen survey-taking software for iPad surveys given to users at GSU.

    6. Over the past few years, the market research literature has reflected a concern about the quality of face-to-face market research as compared to online surveys and polls. Manfreda, Bosnjak, Berzelak, Haas, and Vehovar (2008) analyzed other studies that compared response rates from Web-based surveys to response rates of at least one other survey delivery method. Web survey responses were, on average, eleven percent lower than the other methods investigated. In their study of face-to-face survey responses as compared to online survey responses, Heerwegh and Looseveldt (2008) concluded that responses to Web surveys were of poorer quality and, overall, less sufficient than responses to surveys conducted face-to-face.

      Face-to-face surveying yields higher response rates and better-quality responses than web-based surveys.

  19. Oct 2015
    1. The Coming of OER: Related to the enthusiasm for digital instructional resources, four-fifths (81 percent) of the survey participants agree that “Open Source textbooks/Open Education Resource (OER) content will be an important source for instructional resources in five years.”
  20. Jan 2014
    1. Less than half (45%) of the respondents are satisfied with their ability to integrate data from disparate sources to address research questions

      The most important take-away I see in this whole section on reasons for not making data electronically available is not mentioned here directly!

      Here are the raw numbers for I am satisfied with my ability to integrate data from disparate sources to address research questions:

      • 156 (12.2%) Agree Strongly
      • 419 (32.7%) Agree Somewhat
      • 363 (28.3%) Neither Agree nor Disagree
      • 275 (21.5%) Disagree Somewhat
      • 069 (05.4%) Disagree Strongly

      Of the people who are not satisfied in some way, how many of those think current data sharing mechanisms are sufficient for their needs?

      Of the ~5% of people who are strongly dissatisfied, how many of those are willing to spend time, energy, and money on new sharing mechanisms, especially ones that are not yet proven? If they are willing to do so, then what measurable result or impact will the new mechanism have over the status quo?

      How many feel that current sharing mechanisms stand in the way of publications, tenure, promotion, or being cited?

      Of those who are dissatisfied, how many have existing investment in infrastructure versus those who are new and will be investing versus those who cannot invest in old or new?

      10 years ago how would you have convinced someone they need an iPad or Android smartphone?
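As a quick arithmetic check, the raw counts in the note above do reproduce the article's "less than half (45%)" satisfied figure:

```python
# Raw response counts copied from the annotation above.
counts = {
    "Agree Strongly": 156,
    "Agree Somewhat": 419,
    "Neither Agree nor Disagree": 363,
    "Disagree Somewhat": 275,
    "Disagree Strongly": 69,
}
total = sum(counts.values())  # 1282 respondents
satisfied = counts["Agree Strongly"] + counts["Agree Somewhat"]
print(f"{satisfied}/{total} = {satisfied / total:.1%}")  # → 575/1282 = 44.9%
```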

    2. Reasons for not making data electronically available. Regarding their attitudes towards data sharing, most of the respondents (85%) are interested in using other researchers' datasets, if those datasets are easily accessible. Of course, since only half of the respondents report that they make some of their data available to others and only about a third of them (36%) report their data is easily accessible, there is a major gap evident between desire and current possibility. Seventy-eight percent of the respondents said they are willing to place at least some their data into a central data repository with no restrictions. Data repositories need to make accommodations for varying levels of security or access restrictions. When asked whether they were willing to place all of their data into a central data repository with no restrictions, 41% of the respondents were not willing to place all of their data. Nearly two thirds of the respondents (65%) reported that they would be more likely to make their data available if they could place conditions on access. Less than half (45%) of the respondents are satisfied with their ability to integrate data from disparate sources to address research questions, yet 81% of them are willing to share data across a broad group of researchers who use data in different ways. Along with the ability to place some restrictions on sharing for some of their data, the most important condition for sharing their data is to receive proper citation credit when others use their data. For 92% of the respondents, it is important that their data are cited when used by other researchers. Eighty-six percent of survey respondents also noted that it is appropriate to create new datasets from shared data. Most likely, this response relates directly to the overwhelming response for citing other researchers' data. The breakdown of this section is presented in Table 13.

      Categories of data sharing considered:

      • I would use other researchers' datasets if their datasets were easily accessible.
      • I would be willing to place at least some of my data into a central data repository with no restrictions.
      • I would be willing to place all of my data into a central data repository with no restrictions.
      • I would be more likely to make my data available if I could place conditions on access.
      • I am satisfied with my ability to integrate data from disparate sources to address research questions.
      • I would be willing to share data across a broad group of researchers who use data in different ways.
      • It is important that my data are cited when used by other researchers.
      • It is appropriate to create new datasets from shared data.
    1. In the course of your research or teaching, do you produce digital data that merits curation? 225 of 292 (77%) of respondents answered "yes" to this first question, which corresponds to 25% of the estimated population of 900 faculty and researchers who received the survey.

      For those who do not feel they have data that merits curation, I would at least like to hear a description of the kinds of data they have and why they feel it does not need to be curated.

      Some people may already be using well-curated data sets. Others may feel their data would not be useful to anyone outside their own research group, so there is no need to curate it for anyone else, even though, under some definitions of "curation," there may be important unmet internal-use curation needs visible only to the grad students or researchers who work with the data hands-on daily.

      UPDATE: My question is essentially answered here: https://hypothes.is/a/xBpqzIGTRaGCSmc_GaCsrw

    2. Responsibility, myself versus others. It may appear that responses to the question of responsibility are bifurcated between "Myself" and all other parties combined. However, respondents who identified themselves as being responsible were more likely than not to identify additional parties that share that responsibility. Thus, curatorial responsibility is seen as a collaborative effort. (The "Nobody" category is a slight misnomer here as it also includes non-responses to this question.)

      This answers my previous question about this survey item.


    3. Awareness of data and commitment to its preservation are two key preconditions for successful data curation.

      Great observation!

    4. Which parties do you believe have primary responsibility for the curation of your data? Almost all respondents identified themselves as being personally responsible.

      For those who identify themselves as personally responsible, would they identify themselves (or their group) as the only ones responsible for the data? Or is there a belief that the institution should also be responsible in some way, in addition to themselves?

    5. Availability of the raw survey data is subject to the approval of the UCSB Human Subjects Committee.
    6. Survey design The survey was intended to capture as broad and complete a view of data production activities and curation concerns on campus as possible, at the expense of gaining more in-depth knowledge.

      Summary of the survey design

    7. To summarize the survey's findings:

      • Curation of digital data is a concern for a significant proportion of UCSB faculty and researchers.
      • Curation of digital data is a concern for almost every department and unit on campus.
      • Researchers almost universally view themselves as personally responsible for the curation of their data.
      • Researchers view curation as a collaborative activity and collective responsibility.
      • Departments have different curation requirements, and therefore may require different amounts and types of campus support.
      • Researchers desire help with all data management activities related to curation, predominantly storage.
      • Researchers may be underestimating the need for help using archival storage systems and dealing with attendant metadata issues.
      • There are many sources of curation mandates, and researchers are increasingly under mandate to curate their data.
      • Researchers under curation mandate are more likely to collaborate with other parties in curating their data, including with their local labs and departments.
      • Researchers under curation mandate request more help with all curation-related activities; put another way, curation mandates are an effective means of raising curation awareness.
      • The survey reflects the concerns of a broad cross-section of campus.

      Summary of survey findings.