105 Matching Annotations
  1. Nov 2019
  2. Sep 2019
    1. “But then again,” a person who used information in this way might say, “it’s not like I would be deliberately discriminating against anyone. It’s just an unfortunate proxy variable for lack of privilege and proximity to state violence.”

      In the current universe, Twitter also makes a number of predictions about users that could be used as proxy variables for economic and cultural characteristics. It can display things like your audience's net worth as well as indicators commonly linked to political orientation. Triangulating some of this data could allow for other forms of intended or unintended discrimination.
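
      To make the proxy-variable risk concrete, here is a minimal audit sketch in Python (synthetic data; the 5,000-user audience and the column roles are invented) measuring how strongly an ad-targeting signal is associated with a protected characteristic:

      ```python
      import numpy as np
      from scipy.stats import chi2_contingency

      rng = np.random.default_rng(0)

      # Synthetic audience: a protected attribute, plus a targeting proxy
      # (a stand-in for "net worth bracket") correlated with it by construction.
      protected = rng.integers(0, 2, size=5_000)
      flip = rng.random(5_000) < 0.3
      proxy = np.where(flip, 1 - protected, protected)

      # Chi-squared test of association: a strong association means that
      # targeting on the proxy amounts to targeting on the protected group.
      table = np.array([[np.sum((proxy == i) & (protected == j)) for j in (0, 1)]
                        for i in (0, 1)])
      chi2, p, _, _ = chi2_contingency(table)
      print(f"chi2={chi2:.1f}, p={p:.2g}")  # a tiny p-value: the proxy leaks the attribute
      ```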

      I've already been able to view a wide range of (possibly spurious) information about my own reading audience through these analytics. On September 9th, 2019, I started a Twitter account for my 19th Century Open Pedagogy project and began serializing installments of a critical edition, The Woman in White: Grangerized. The @OPP19c Twitter account has 62 followers as of September 17th.

      Having followers means I have access to an audience analytics toolbar. Some of the account's followers are nineteenth-century studies or pedagogy organizations rather than individuals. Twitter tracks each account as an individual, however, and I was surprised to see some of the demographics Twitter broke them down into. (If you're one of these followers: thank you and sorry. I find this data a bit uncomfortable.)

      Within this dashboard, I have a "Consumer Buying Styles" display that identifies categories such as "quick and easy," "ethnic explorers," "value conscious," and "weight conscious." These categories strike me as equal parts confusing and problematic.

      I have a "Marital Status" toolbar alleging that 52% of my audience is married and 49% single.

      I also have a "Home Ownership" chart. (I'm presuming that the Elizabeth Gaskell House Museum's Twitter is counted as an owner...)

      …and more.

  3. Jul 2019
    1. We translate all patient measurements into statistics that are predictive of unsuccessful discharge

      An analytics pipeline, more or less what we will need to put together by the end as well.
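
      Here is a minimal sketch of such a pipeline (scikit-learn assumed; the file and column names are hypothetical), turning raw patient measurements into a cross-validated predictor of unsuccessful discharge:

      ```python
      import pandas as pd
      from sklearn.compose import ColumnTransformer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import OneHotEncoder, StandardScaler

      # Hypothetical per-patient export; the column names are invented.
      df = pd.read_csv("patients.csv")
      X = df[["age", "heart_rate", "ward"]]
      y = df["unsuccessful_discharge"]  # 1 = readmission/adverse outcome

      pipeline = Pipeline([
          ("prep", ColumnTransformer([
              ("num", StandardScaler(), ["age", "heart_rate"]),
              ("cat", OneHotEncoder(handle_unknown="ignore"), ["ward"]),
          ])),
          ("clf", LogisticRegression(max_iter=1000)),
      ])

      # Cross-validated AUC as a first sanity check of predictive value.
      print(cross_val_score(pipeline, X, y, cv=5, scoring="roc_auc").mean())
      ```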

  4. Apr 2019
    1. Annotation Profile Follow learners as they bookmark content, highlight selected text, and tag digital resources. Analyze annotations to better assess learner engagement, comprehension and satisfaction with the materials assigned.

      There is already a Caliper profile for "annotation." Do we have any suggestions about the model?
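
      For reference, the Caliper Annotation Profile models this kind of event roughly as follows; a hand-rolled Python sketch of a highlight event (the IDs and text are made up, and the exact envelope should be checked against the IMS Caliper spec):

      ```python
      import json
      from datetime import datetime, timezone

      # A Caliper-style AnnotationEvent for a highlight, sketched as a dict.
      # Types and actions follow the Caliper v1.1 Annotation Profile; the
      # identifiers and selection text are illustrative only.
      event = {
          "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
          "type": "AnnotationEvent",
          "action": "Highlighted",
          "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
          "object": {"id": "https://example.edu/etexts/201", "type": "Document"},
          "generated": {
              "id": "https://example.edu/annotations/888",
              "type": "HighlightAnnotation",
              "selectionText": "a learner-highlighted passage",
          },
          "eventTime": datetime.now(timezone.utc).isoformat(),
      }
      print(json.dumps(event, indent=2))
      ```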

  5. Mar 2019
  6. Feb 2019
    1. Which segments of text are being highlighted?

      Do we capture this data? Can we?
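
      Hypothesis annotations do carry the highlighted segments: each annotation's target includes W3C selectors, among them a TextQuoteSelector holding the exact highlighted text. A small sketch against the public search API (the URI is a placeholder):

      ```python
      import requests

      # Fetch annotations on a given page and print the exact highlighted text.
      resp = requests.get(
          "https://api.hypothes.is/api/search",
          params={"uri": "https://example.com/reading", "limit": 50},  # example URI
      )
      for row in resp.json()["rows"]:
          for target in row.get("target", []):
              for sel in target.get("selector", []):
                  if sel.get("type") == "TextQuoteSelector":
                      print(sel["exact"])
      ```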

    2. What types of annotations are being created?

      How is this defined?

    3. Who is posting most often? Which posts create the most replies?

      These apply to social annotation as well.
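
      For social annotation, both questions can be answered from the same search API: count posts per user, and count replies per parent annotation via each reply's references field (a sketch; pagination and authentication are omitted):

      ```python
      from collections import Counter

      import requests

      rows = requests.get(
          "https://api.hypothes.is/api/search",
          params={"group": "__world__", "limit": 200},  # example scope
      ).json()["rows"]

      posts_by_user = Counter(r["user"] for r in rows)
      replies_by_parent = Counter(
          r["references"][-1] for r in rows if r.get("references")
      )  # the last reference is the direct parent annotation

      print(posts_by_user.most_common(5))
      print(replies_by_parent.most_common(5))
      ```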

    4. Session Profile

      Are we capturing the right data/how can Hypothesis contribute to this profile?

    5. Does overall time spent reading correlate with assessment scores? Are particular viewing patterns/habits predictive of student success? What are the average viewing patterns of students? Do they differ between courses, course sections, instructors, or student demographics?

      Can H itself capture some of this data? Through the LMS?
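
      If the LMS can export per-student reading time alongside scores, the first question reduces to a simple correlation, and the section-level comparisons to group-bys (a sketch; the file and column names are hypothetical):

      ```python
      import pandas as pd
      from scipy.stats import pearsonr

      # Hypothetical export: columns student_id, section, minutes_reading, score.
      df = pd.read_csv("engagement.csv")

      # Does overall time spent reading correlate with assessment scores?
      r, p = pearsonr(df["minutes_reading"], df["score"])
      print(f"r={r:.2f}, p={p:.3g}")

      # Do viewing patterns differ between course sections?
      print(df.groupby("section")["minutes_reading"].describe())
      ```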

  7. Dec 2018
    1. And while content analytics tools (e.g., Chartbeat, Parsely, Content Insights) and feedback platforms (e.g., Hearken, GroundSource) have thankfully helped close the gap, the core content management experience remains, for most of us, little improved when it comes to including the audience in the process.
  8. Jul 2018
  9. May 2018
    1. Hi there, check out this SAS training and tutorial covering data analysis and forecasting methods for business analytics:

      https://www.youtube.com/watch?v=1QPRhVGCTRE

  10. Mar 2018
  11. Jan 2018
  12. Nov 2017
    1. Mount St. Mary’s use of predictive analytics to encourage at-risk students to drop out to elevate the retention rate reveals how analytics can be abused without student knowledge and consent

      Wow. Not that we need such an extreme case to shed light on the perverse incentives at stake in Learning Analytics, but this surely made readers react. On the other hand, there’s a lot more to be said about retention policies. People often act as though they were essential to learning. Retention is important to the institution, but are we treating drop-outs as escapees? One learner in my class (whose major is criminology) was describing the similarities between schools and prisons. It can be hard to dispel this notion when leaving an institution is perceived as a big failure of that institution. (Plus, Learning Analytics can really feel like the Panopticon.) Some comments about drop-outs make it sound like they got no learning done. Meanwhile, some entrepreneurs are encouraging students to leave institutions or not to enroll in the first place. Going back to that important question by @sarahfr: why do people go to university?

    1. Information from this will be used to develop learning analytics software features, which will have these functions: Description of learning engagement and progress, Diagnosis of learning engagement and progress, Prediction of learning progress, and Prescription (recommendations) for improvement of learning progress.

      As good a summary of Learning Analytics as any.

  13. Oct 2017
    1. Examples of such violence can be seen in the forms of an audit culture and empirically-driven teaching that dominates higher education. These educational projects amount to pedagogies of repression and serve primarily to numb the mind and produce what might be called dead zones of the imagination. These are pedagogies that are largely disciplinary and have little regard for contexts, history, making knowledge meaningful, or expanding what it means for students to be critically engaged agents.

      On audit culture in education. How do personalized/adaptive/competency-based learning and learning analytics support it?

    1. By giving student data to the students themselves, and encouraging active reflection on the relationship between behavior and outcomes, colleges and universities can encourage students to take active responsibility for their education in a way that not only affects their chances of academic success, but also cultivates the kind of mindset that will increase their chances of success in life and career after graduation.
    2. If students do not complete the courses they need to graduate, they can’t progress.

      The #retention perspective in Learning Analytics: learners succeed by completing courses. Can we think of learning success in other ways? Maybe through other forms of recognition than passing grades?

  14. Sep 2017
  15. Aug 2017
    1. This has much in common with a customer relationship management system and facilitates the workflow around interventions as well as various visualisations.  It’s unclear how the at risk metric is calculated but a more sophisticated predictive analytics engine might help in this regard.

      Have yet to notice much discussion of the relationships between SIS (Student Information Systems), CRM (Customer Relationship Management), ERP (Enterprise Resource Planning), and LMS (Learning Management Systems).

  16. Mar 2017
    1. The plan should also include a discussion about any possible unintended consequences and steps your institution and its partners (such as third-party vendors) can take to mitigate them.

      We need to create a risk-management plan for the use of predictive analytics. Talking about the risks as an organization is important; that way, we can hold each other accountable for using analytics responsibly.

    1. we think analytics is just now trying to get past the trough of disillusionment.
    2. Analytics isn’t a thing. Analytics help solve problems like retention, student success, operational efficiency, or engagement.
    3. Analytics doesn’t solve a problem. Analytics provides data and insight that can be leveraged to solve problems.
  17. Feb 2017
    1. this kind of assessment

      Which assessment? Analytics aren't measures. We need to be more forthcoming with faculty about their role in measuring student learning; see, for example, http://www.sheeo.org/msc

  18. Dec 2016
    1. your own website remains your single greatest advantage in convincing donors to stay loyal or in drawing new supporters to your cause. Do you know what donors are looking for when they land on your site? Are you talking about it?

      This underscores the importance of using Google Analytics, as well as website user surveys.

  19. Nov 2016
  20. Oct 2016
    1. Devices connected to the cloud allow professors to gather data on their students and then determine which ones need the most individual attention and care.
    1. For G Suite users in primary/secondary (K-12) schools, Google does not use any user personal information (or any information associated with a Google Account) to target ads.

      In other words, Google does use everyone’s information (data as the new oil) and, outside K-12, can use such things to target ads, including in Higher Education.

  21. Sep 2016
    1. Data sharing over open-source platforms can create ambiguous rules about data ownership and publication authorship, or raise concerns about data misuse by others, thus discouraging liberal sharing of data.

      Surprising mention of “open-source platforms” here; these issues are hardly absent from proprietary platforms. Maybe they mean non-institutional platforms (say, social media), where these issues are really pressing, but the wording is quite strange if that is the case.

    2. Activities such as time spent on task and discussion board interactions are at the forefront of research.

      Really? These aren’t uncontroversial, to say the least. For instance, discussion board interactions often call for careful, mixed-method work with an eye to preventing instructor effect and confirmation bias. “Time on task” is almost a codeword for distinctions between models of learning. Research in cognitive science gives very nuanced value to “time spent on task” while the Malcolm Gladwells of the world usurp some research results. A major insight behind Competency-Based Education is that it can allow for some variance in terms of “time on task”. So it’s kind of surprising that this summary puts those two things to the fore.

    3. Research: Student data are used to conduct empirical studies designed primarily to advance knowledge in the field, though with the potential to influence institutional practices and interventions. Application: Student data are used to inform changes in institutional practices, programs, or policies, in order to improve student learning and support. Representation: Student data are used to report on the educational experiences and achievements of students to internal and external audiences, in ways that are more extensive and nuanced than the traditional transcript.

      Ha! The Chronicle’s summary framed these categories somewhat differently. Interesting. To me, the “application” part is really about student retention. But maybe that’s a bit of a cynical reading, based on an over-emphasis in the Learning Analytics sphere on teleological, linear, and insular models of learning. Then, the “representation” part sounds closer to UDL than to learner-driven microcredentials. Both approaches are really interesting, and chances are that the report brings them together. Finally, the Chronicle made it sound as though the research implied here were less directed. The mention that it has “the potential to influence institutional practices and interventions” may be strategic, as applied research meant to influence “decision-makers” is more likely to sway them than the type of exploratory research we so badly need.

    1. often private companies whose technologies power the systems universities use for predictive analytics and adaptive courseware
    2. the use of data in scholarly research about student learning; the use of data in systems like the admissions process or predictive-analytics programs that colleges use to spot students who should be referred to an academic counselor; and the ways colleges should treat nontraditional transcript data, alternative credentials, and other forms of documentation about students’ activities, such as badges, that recognize them for nonacademic skills.

      Useful breakdown. Research, predictive models, and recognition are quite distinct from one another, and the approaches to data that they imply are quite different. In a way, the “personalized learning” model at the core of the second topic is close to the Big Data attitude (collect all the things and sense will come through eventually), with corresponding ethical problems. Though projects vary greatly, research has a much more solid base in both ethics and epistemology than the kind of Big Data approach used by technocentric outlets. The part about recognition, though, opens the most interesting door. Microcredentials and badges are a part of a broader picture. The data shared in those cases need not be so comprehensive, and learners have a lot of agency in the matter. In fact, when Charles Tsai (then with Ashoka) interviewed Mozilla executive director Mark Surman about badges, the message was quite clear: badges are a way to rethink education as a learner-driven “create your own path” adventure. The contrast between the three models reveals a lot: from the abstract world of research, to the top-down models of Minority Report-style predictive educating, all the way to a form of heutagogy. Lots to chew on.

  22. Jul 2016
    1. what do we do with that information?

      Interestingly enough, a lot of teachers either don’t know that such data might be available or perceive very little value in monitoring learners in such a way. But a lot of this can be negotiated with learners themselves.

    2. E-texts could record how much time is spent in textbook study. All such data could be accessed by the LMS or various other applications for use in analytics for faculty and students.”
    3. not as a way to monitor and regulate
    1. which applicants are most likely to matriculate
    2. Data collection on students should be considered a joint venture, with all parties — students, parents, instructors, administrators — on the same page about how the information is being used.
    3. "We know the day before the course starts which students are highly unlikely to succeed,"

      Easier to do with a strict model for success.

    1. improving teaching, not amplifying learning.

      Though it’s not exactly the same thing, you could call this “instrumental” or “pragmatic”. Of course, you could have something very practical to amplify learning, and #EdTech is predicated on that idea. But when you do, you make learning so goal-oriented that it shifts its meaning. Very hard to have a “solution” for open-ended learning, though it’s very easy to have tools which can enhance open approaches to learning. Teachers have a tough time and it doesn’t feel so strange to make teachers’ lives easier. Teachers typically don’t make big purchasing decisions but there’s a level of influence from teachers when a “solution” imposes itself. At least, based on the insistence of #BigEdTech on trying to influence teachers (who then pressure administrators to make purchases), one might think that teachers have a say in the matter. If something makes a teaching-related task easier, administrators are likely to perceive the value. Comes down to figures, dollars, expense, expenditures, supplies, HR, budgets… Pedagogy may not even come into play.

  23. Jun 2016
    1. nothing we did is visible to our analytics systems

      If it’s not counted, does it count?

    1. Massively scaling the reach and engagement of LinkedIn by using the network to power the social and identity layers of Microsoft's ecosystem of over one billion customers. Think about things like LinkedIn's graph interwoven throughout Outlook, Calendar, Active Directory, Office, Windows, Skype, Dynamics, Cortana, Bing and more.

      The integration of external social/collab data with internal enterprise data could be really powerful, and open up interesting opportunities for analytics.

    1. It shifted its work to faculty-driven initiatives.

      DIY, grassroots, bottom-up… but not learner-driven.

    2. learning agenda on learning analytics
    3. Learning analytics cannot be left to the researchers, IT leadership, the faculty, the provost or any other single sector alone.
    4. An executive at a large provider of digital learning tools pushed back against what he saw as Thille’s “complaint about capitalism.”

      Why so coy?

      R.G. Wilmot Lampros, chief product officer for Aleks, says the underlying ideas, referred to as Knowledge Space Theory, were developed by professors at the University of California at Irvine and are in the public domain. It's "there for anybody to vet," he says. But McGraw-Hill has no more plans to make its analytics algorithms public than Google would for its latest search algorithm.

      "I know that there are a few results that our customers have found counterintuitive," Mr. Lampros says, but the company's own analyses of its algebra products have found they are 97 percent accurate in predicting when a student is ready to learn the next topic.

      As for Ms. Thille's broader critique, he is unpersuaded. "It's a complaint about capitalism," he says. The original theoretical work behind Aleks was financed by the National Science Foundation, but after that, he says, "it would have been dead without business revenues."

      MS. THILLE stops short of decrying capitalism. But she does say that letting the market alone shape the future of learning analytics would be a mistake.
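
      As an aside, the Knowledge Space Theory idea behind Aleks fits in a few lines (a toy sketch of the public theory, not the company's algorithm): a student is "ready to learn" exactly the topics in the outer fringe of their current knowledge state.

      ```python
      # Toy knowledge structure over four algebra topics; each frozenset is a
      # feasible knowledge state. The structure is invented for illustration.
      items = set("abcd")
      states = {frozenset(s) for s in ["", "a", "b", "ab", "abc", "abd", "abcd"]}

      def outer_fringe(state: frozenset) -> set:
          """Topics the learner is 'ready to learn': adding any one of them
          to the current state yields another feasible state."""
          return {q for q in items - state if state | {q} in states}

      print(outer_fringe(frozenset("ab")))  # {'c', 'd'}: ready for either topic
      print(outer_fringe(frozenset("a")))   # {'b'}: must learn b before c or d
      ```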

    5. a debate over who should control the field of learning analytics

      Who Decides?

    1. What teachers want in a data dashboard

      Though much of it may sound trite and the writeup is somewhat awkward (diverse opinions strung together haphazardly), there’s something which can help us focus on somewhat distinct attitudes towards Learning Analytics. Much of it hinges on what may or may not be measured. One might argue that learning happens outside the measurement parameters.

    2. timely

      Time-sensitive, mission-critical, just-in-time, realtime, 24/7…

    3. Data “was something you would use as an autopsy when everything was over,” she said.

      The autopsy/biopsy distinction can indeed be useful, here. Leading to insight. Especially if it’s not about which one is better. A biopsy can help prevent something in an individual patient, but it’s also a dangerous, potentially life-threatening operation. An autopsy can famously identify a “cause of death” but, more broadly, it’s been the way we’ve learnt a lot about health, not just about individual patients. So, while Teamann frames it as a severe limitation, the “autopsy” part of Learning Analytics could do a lot to bring us beyond the individual focus.

    1. While generally misused today, analytics can (theoretically) be used to predict and personalize many facets of teaching & learning, inc. pace, complexity, content, and more.
  24. May 2016
    1. The entirely quantitative methods and variables employed by Academic Analytics -- a corporation intruding upon academic freedom, peer evaluation and shared governance -- hardly capture the range and quality of scholarly inquiry, while utterly ignoring the teaching, service and civic engagement that faculty perform,
  25. Apr 2016
    1. “fundamentally if we want to realize the potential of human networks to change how we work then we need analytics to transform information into insight otherwise we will be drowning in a sea of content and deafened by a cacophony of voices”

      Marie Wallace's perspective on the potential of big-data analytics, specifically analysis of human networks, in the context of creating a smarter workplace.

  26. Mar 2016
  27. Dec 2015
    1. focus groups where students self-report the effectiveness of the materials are common, particularly among textbook publishers

      Paving the way for learning analytics.

    2. It’s educators who come up with hypotheses and test them using a large data set.

      And we need an ever-larger data set, right?

    3. a good example of the kind of insight that big data is completely blind to

      Not sure it follows directly, but also important to point out.

    1. I will investigate the details on this, including the relevant contractual clauses, when I get the chance.
    2. taking a swipe at Knewton

      Snap!

    3. they are making a bet against the software as a replacement for the teacher and against big data
    4. a set of algorithms designed to optimize the commitment of knowledge to long-term memory
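
      The class of algorithms being gestured at is spaced repetition. A minimal SuperMemo-2-style scheduler (a generic sketch, not any vendor's actual algorithm) shows the core idea: successful recall stretches the review interval geometrically.

      ```python
      def sm2_review(interval_days: float, ease: float, quality: int):
          """One SM-2-style update: quality is 0-5 graded recall.
          Returns the next review interval and the updated ease factor."""
          if quality < 3:                      # failed recall: start over
              return 1.0, ease
          ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
          if interval_days < 1.5:
              return 6.0, ease                 # second successful review
          return interval_days * ease, ease    # later intervals grow geometrically

      # A card recalled well three times in a row: intervals stretch out.
      interval, ease = 1.0, 2.5
      for q in (5, 4, 5):
          interval, ease = sm2_review(interval, ease, q)
          print(f"next review in {interval:.0f} days (ease {ease:.2f})")
      ```
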
    1. The most popular project for the MUA to tackle was Learning Analytics

      Although Dougiamas claims Moodle already has what is needed in the form of logs and reports: no need for Caliper or xAPI.

    1. Lambda Solutions [Corrected.]

      Oh? They were quite present at MoodleMoot. Wonder what their ties are. Clearly, their solution isn’t free software. Nor is it pushing Open Standards for Learning Analytics.

  28. Nov 2015
    1. grows exponentially.

      As we get into “Big Data”, individual datapoints become less important.

    2. What is the correlation between levels of student responses to each other and outcomes?

      Levels and types of responses. Just read such an analysis, based on Brookfield and Preskill’s “Conversational Moves”.

    3. read by any Caliper-compliant system

      Or any Learning Record Store.
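
      For comparison, the Learning Record Store route: the same activity expressed as an xAPI statement and POSTed to an LRS (a sketch; the endpoint, credentials, and choice of verb are placeholders):

      ```python
      import requests

      # An xAPI statement for an annotation-style activity; names are illustrative.
      statement = {
          "actor": {"mbox": "mailto:learner@example.edu", "name": "Learner"},
          "verb": {
              "id": "http://adlnet.gov/expapi/verbs/commented",
              "display": {"en-US": "commented"},
          },
          "object": {
              "id": "https://example.edu/etexts/201",
              "definition": {"name": {"en-US": "Chapter 3"}},
          },
      }

      requests.post(
          "https://lrs.example.edu/xapi/statements",   # placeholder LRS endpoint
          json=statement,
          headers={"X-Experience-API-Version": "1.0.3"},
          auth=("key", "secret"),                      # placeholder credentials
      )
      ```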

    4. Caliper WordPress plugin

      How long before we get such a thing?

    5. most blogs have a feature called “pingbacks,”

      Annotations should have “pingbacks”, too. But the most important thing is how to process those later on. This gets us into the Activity Streams model behind much of Learning Analytics.
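
      Here is a sketch of what an annotation "pingback" might look like as an Activity Streams 2.0 payload (entirely hypothetical; AS2 underlies the actor-verb-object model that both xAPI and Caliper build on):

      ```python
      import json

      # Hypothetical AS2 notification that one annotation replied to another --
      # the payload an annotation server might POST to the original annotation's
      # inbox. All names and URLs are invented.
      notification = {
          "@context": "https://www.w3.org/ns/activitystreams",
          "type": "Announce",
          "actor": "https://annotations.example.edu/users/alice",
          "object": "https://annotations.example.edu/anno/999",  # the reply
          "inReplyTo": "https://other.example.org/anno/123",     # the original
      }
      print(json.dumps(notification, indent=2))
      ```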

    1. Personal Learning Record will define how to represent, capture and leverage user activity, including ratings, test results and performance measures in a distributed learning and work environment.
  29. Sep 2015
    1. Commercial publishers and content producers say there's reason to doubt the quality of open resources

      Have they demonstrated so clearly that their textbooks have enhanced learning? Oh, wait. They set the criteria by which we assess learning and push for their own brand of Learning Analytics, so…

  30. Aug 2015
  31. Apr 2015
    1. But as this escalated, local governments and police lost sight of their original purpose and focused more and more on punching up the number of arrests and punishments — rather than helping impoverished communities rise up.

      True in education as well.

  32. Jan 2015