1,653 Matching Annotations
  1. Feb 2019
    1. Average power consumption can be estimated from single points, although this results in increasing uncertainty. Manufacturers publish the maximum measured electricity (MME) value, which is the maximum observed power consumption by a server model. The MME can often be calculated with online tools, which may allow the specification of individual components for a particular server configuration. Based on these estimations of maximum power consumption, the average power consumption is commonly assumed to be 60 percent of MME for high-end servers and 40 percent for volume and mid-range servers.

      okay, this is a useful stat. I think
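The 60/40 rule of thumb above is easy to turn into a quick estimator. A minimal sketch (the function name and the 500 W example figure are my own, not from the guidance):

```python
def estimated_average_power(mme_watts, server_class):
    """Estimate average power draw from Maximum Measured Electricity (MME).

    Per the guidance quoted above: average consumption is assumed to be
    60% of MME for high-end servers, 40% for volume and mid-range servers.
    """
    factors = {"high-end": 0.60, "volume": 0.40, "mid-range": 0.40}
    return mme_watts * factors[server_class]

# e.g. a volume server whose configurator reports a 500 W MME
print(estimated_average_power(500, "volume"))  # 200.0
```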

    2. In most cases, the “embodied emissions” (all stages excluding the use stage) of software are not significant compared with the overall emissions of the ICT system, particularly when the embodied emissions caused by development of the software are amortized over a large number of copies. In these cases, it is not necessary to carry out a detailed life cycle assessment of the software as part of a wider system. An exception is where bespoke software has very high emissions associated with its development, and these emissions are all allocated to a small number of software copies.

      Smaller, internal software might count

    3. Currently, input-output (IO) tables are published every five years, a long time in IT product evolution. Consequently, EEIO is good at representing basic commodities / materials industries like plastics or metals manufacturing, but not high-tech industries like microprocessors and fiber optic lasers manufacturing.

      Every 5 years. So, when the iPhone 6 was the brand new hotness, compared to today.

    4. Calculating cradle-to-gate GHG emissions of IH by the component characterization method

      Ah! This is new to me

    5. EEIO tables are updated infrequently thus may not be up to date with ICT’s newest technologies and materials. EEIO tables have limited resolution at the aggregate sector level.

      Valuable problem to solve?

    6. rapidly with the onset of innovations, but lag in being included in EEIO databases available to the practitioner. More detail on EEIO data is provided in the calculation sections below

      Useful point. Because the top-down data is lagging, it'll give worse-than-expected figures for hardware

    7. It is interesting to note that the figures from GSMA and GeSI show that energy intensity per gigabyte is improving at about 24% per year for mobile networks, and at about 22% per year for fixed line networks. (The study by Aslan et al calculates a figure of 50% reduction in energy intensity every two years for fixed line networks, equivalent to 29% reduction per year.) Also the data shows that the energy intensity per gigabyte for mobile networks is about 50 times that for fixed line networks.

      Okay, this isn't that far from the 45x figure before

    8. Assuming that the reduction in energy intensity can be fitted to an exponentially decreasing curve (i.e. because it is more and more difficult to achieve the same reductions), then the data points can be extrapolated to give energy intensity factors for 2015 of 0.15 for fixed line networks, and 6.5 for mobile networks, with both factors measured in kWh/GB (kilowatt-hours per gigabyte).

      FORTY FIVE TIMES MORE ENERGY INTENSIVE THAN WIRED
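Treating those figures as an exponential improvement curve makes the extrapolation reproducible. A sketch, assuming the quoted 2015 baselines (0.15 and 6.5 kWh/GB) and annual reductions (22% fixed, 24% mobile); the function name and the 2020 target year are illustrative:

```python
def project_intensity(base_kwh_per_gb, base_year, target_year, annual_reduction):
    """Extrapolate energy intensity per GB assuming a constant annual
    percentage reduction (the exponentially decreasing fit in the quote)."""
    years = target_year - base_year
    return base_kwh_per_gb * (1 - annual_reduction) ** years

# 2015 baselines from the quote: 0.15 kWh/GB fixed line, 6.5 kWh/GB mobile
fixed_2020 = project_intensity(0.15, 2015, 2020, 0.22)
mobile_2020 = project_intensity(6.5, 2015, 2020, 0.24)
print(round(fixed_2020, 3), round(mobile_2020, 2))  # still roughly 40x apart
```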

    9. A simple energy intensity factor for the use of the internet would make calculating the emissions resulting from ICT simpler and more widely accessible. Whilst this has been attempted in the past, resulting estimates show huge disparities. Coroama and Hilty [6] review 10 studies that have attempted to estimate the average energy intensity of the internet where estimates varied from 0.0064 kWh/GB to 136 kWh/GB, a difference factor of more than 20,000.

      TWENTY THOUSAND TIMES DIFFERENCE.

      Did I drive across London? Or did I drive to the moon?

    10. Typically, for cloud and data center services the largest impacts are from the use stage emissions of the data center and the end user devices.

      End user devices?

    11. Simplified parameters for allocation of data center emissions include

      Useful for the screening stage

    12. Optional processes that are not attributable to the GHG impact of cloud and data center services are:

      ?

    13. The end-of-life stage typically represents only -0.5 to -2 percent of a service’s total GHG emissions. This is because of the high level of recycling of network equipment.

      Where would I check to learn this?

    14. For global average electricity emission factors across 63 countries where the points of presence were located, data from the Carbon Trust Footprint Expert Database was used.

      Is this database free?

    15. Operational activities and non-ICT support equipment covers people (labor)-activities and non-ICT support equipment and activities that are directly engaged and dedicated to the service being assessed.

      in addition to the other emissions

    16. For example, these measurements might involve running a series of traffic traces [16] over a period of time to build up statistics on network parameters. The measurements also need to include the energy consumption for the network’s ancillary equipment such as cooling, power conditioning, and back-up power. If this latter data is not attainable, then techniques described in Section 2.8.2, “Calculating GHG emissions for the customer domain use stage” (TPCF and PUE factors), can be used to provide an estimated value for this equipment.

      Validation!
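The PUE fallback mentioned here is just a multiplier on measured IT energy. A minimal sketch (the 10,000 kWh and PUE of 1.6 are hypothetical figures, not from the text):

```python
def total_facility_energy(it_equipment_kwh, pue):
    """Estimate total energy, including ancillary equipment (cooling, power
    conditioning, back-up power), from measured IT-equipment energy using a
    Power Usage Effectiveness (PUE) factor, as the quote suggests when direct
    measurements of the ancillary equipment aren't attainable."""
    return it_equipment_kwh * pue

# hypothetical: 10,000 kWh measured at the IT equipment, PUE of 1.6
print(total_facility_energy(10_000, 1.6))  # 16000.0
```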

    17. Equipment manufacturers may have estimates of TPCFs for their equipment based on defined operating conditions. In all cases, if a TPCF approach is selected, then the basis for selecting the factor should be fully noted and documented in the GHG inventory report.

      How much do these figures fluctuate for cloud boxen?

    18. Allocation of emissions among independent products that share the same process: for example, multiple products sharing the same transport process (vehicle); multiple telecommunication services sharing the same network; multiple cloud services (email, data storage, database applications) sharing the same data center

      k8s makes this a pain, if it's designed to co-mingle services on the same boxen

    19. A “screening assessment” is an initial assessment of a product to understand its significant and relevant sources of emissions. This assessment is described in the Product Standard in section 8.3.3.

      Entry level

    20. This chapter provides software developers and architects guidance to benchmark and report the GHG emissions from software use in a consistent manner and make informed choices to reduce greenhouse gas emissions. The chapter is in two parts. Part A provides guidance on the full life cycle assessment of software, while Part B relates specifically to the energy use of software, and covers the three categories of software: operating systems (OS), applications, and virtualization.

      actual formal guidance!

    21. 2015 GeSI published the SMARTer 2030 [8] report, extending the analysis out to 2030. This study predicted that the global emissions of the ICT sector will be 1.25 Gt CO2e in 2030 (or 1.97% of global emissions), and emissions avoided through the use of ICT will be 12 Gt CO2e, which is nearly 10 times higher than ICT’s own emissions.

      there's a 2030 report now. I did not know

    22. the total abatement potential from ICT solutions by 2020 is seven times its own emissions.

      increasing confidence in the potential then

    23. Rebound Effects

      🤯

    24. The Product Standard (sections 11.2 and 11.3.2) states that “avoided emissions shall not be deducted from the product’s total inventory results, but may be reported separately.”

      ah, so it's not like a magic offset. more transparent. good.

    25. The Product Standard defines products to be both goods and services, thus for the ICT sector it covers both physical ICT equipment and delivered ICT services. This Sector Guidance, however, focuses more on the assessment of ICT services. In this Sector Guidance the definition of products includes both networks and software as ICT services.

      this makes me think that services like e-commerce or ride sharing might not count on the first read-through

  2. Jan 2019
    1. Big Data is a buzzword which works on the platform of large data volume and aggregated data sets. The data sets can be structured or unstructured.  The data that is kept and stored at a global level keeps on growing so this big data has a big potential.  Your Big Data is generated from every little thing around us all the time. It has changed its way as the people are changing in the organization. New skills are being offered to prepare the new generated power of Big Data. Nowadays the organizations are focusing on new roles, new challenges and creating a new business.  
    1. People, branding your new website require extensive efforts and time for initiating the activities, which will help to build the solid base of your site.  There are so many tasks needed to complete the overall branding as we all know that first impression is your last impression.  Online branding is very important for service providers because as a marketing guy, I know that it would be like stress somewhere the majority of the people are not known about the product. 

      People, branding your new website require extensive efforts and time for initiating the activities, which will help to build the solid base of your site. There are so many tasks needed to complete the overall branding as we all know that first impression is your last impression. Online marketing is very important for service providers because as a marketing guy, I know that it would be like stress somewhere the majority of the people are not known about the product.

    1. Day by Day the amount of data and information are growing over the internet where new sites, new images are coming every second. So with handling this huge amount of data a major challenge was made to extract with relevant daily activities. So to overcome this context Web 3.0 were made and its tool became very valuable for users in an organised form.

      At last, we can conclude that 3.0 will be more connected, open and intelligent with using Web development technologies , distributed databases, machine learning, and also natural language processing.

    1. 2019 is here!! It’s a year which will bring the hope that 2019 will come with new digital Web designs with world’s greatest artists and performers. There are lots of questions in designers mind that what will be the design of 2019. Never rush in to complete the given task, stay focused and try out the new element to get something new every time. Here we have bought few of the web designs techniques that you can use in making the digital Web design.

      2019 is here!! It’s a year which will bring the hope that 2019 will come with new digital Web designs with world’s greatest artists and performers. There are lots of questions in designers mind that what will be the design of 2019. Never rush in to complete the given task, stay focused and try out the new element to get something new every time. Here we have bought few of the web designs techniques that you can use in making the digital responsive Web design.

  3. Dec 2018
    1. As the chief executive of the world’s biggest cement company observed, “we know how to make very low carbon cement – but why would we? There is no incentive.”

      FUCKING HELL

    2. In the growing trade war between China and the US, it seems the world is unwilling even to think about the entirely legitimate use of consumption-based or border carbon pricing either to encourage cleaner production in China, or to deter the Trump administration from using discriminatory trade measures to re-industrialize drawing partly on older and more carbon-intensive technologies.

      How would border carbon pricing work? You pay a tax on the CO2 emissions 'imported'?

    3. The European utilities that tried to ignore the energy transition are now economic zombies; some split their companies in two to try and isolate the assets that have already turned into liabilities in a decarbonizing system.

      E.ON as an example?

    4. Averaged over the full 35 years, a constant percentage reduction would require c = - 10%/yr to reach the same end-point – almost impossible at the starting point, but entirely feasible and easily observed in the latter stages of sunset industries.

      Okay, so the argument as I see it so far: change might average out to 3.68% per year, but assuming it's a straight line is a mistake, as substitution of high-carbon energy for low carbon looks more like an S-shaped curve
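The quote's c = -10%/yr claim is easy to sanity-check: a constant percentage reduction compounds, so over 35 years very little of the starting emissions remains.

```python
# Worked check of the quote: a constant c = -10%/yr reduction sustained
# for 35 years leaves (1 - 0.10)**35 of the starting emissions.
remaining = (1 - 0.10) ** 35
print(round(remaining, 3))  # 0.025, i.e. roughly a 97.5% cut
```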

    5. their analysis leads both teams to the – only slightly caveated – conclusion that the emission reductions required to deliver the Paris Aims (“well below 2 deg.C”) are implausible, by almost any standard of macroeconomic evidence – and still more so for the most ambitious “1.5 deg.C” end of the spectrum.

      Ah, so this is a response to the "we're doomed" papers from before

    1. Electricity Intensity of Internet Data Transmission: Untangling the Estimates

      This is the HTML version of the PDF I was referring to before.

  4. Nov 2018
    1. Today, we all use many parts of Engelbart’s prescient vision from fifty years ago – while some of the more profound parts remain still unrealized.

      Hmmm... I wonder which of the profound parts are yet unrealized?

    1. We need to learn to see the cumulative impact of a multitude of efforts, while simultaneously keeping all those efforts visible on their own. There exist so many initiatives I think that are great examples of how distributed digitalisation leads to transformation, but they are largely invisible outside their own context, and also not widely networked and connected enough to reach their own full potential. They are valuable on their own, but would be even more valuable to themselves and others when federated, but the federation part is mostly missing. We need to find a better way to see the big picture, while also seeing all pixels it consists of. A macroscope, a distributed digital transformation macroscope.

      This seems to be a related problem to the discovery questions that Kicks Condor and Brad Enslen have been thinking about.

    1. Learning needs analysis of collaborative e-classes in semi-formal settings: The REVIT example

      This article explores the importance of analysis of instructional design which seems to be often downplayed particularly in distance learning. ADDIE, REVIT have been considered when evaluating whether the training was meaningful or not and from that a central report was extracted and may prove useful in the development of similar e-learning situations for adult learning.

      RATING: 4/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

    1. List of web 2.0 applications

      EDUTECH wiki is a site that contains a variety of links to lists to help educators with web 2.0 applications improving productivity Caution: some of the links are not active!

      RATING: 4/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

    1. This means that software that deals with Internet must be actively maintained. If it is not it will become more and more useless in practice over time, however much it remains theoretically correct, not because it has bugs or security holes as such but because the environment it was designed to work in no longer exists and thus the assumptions it was built on are now incorrect.

      internet software decays

    1. Using Model Strategies forIntegrating Technology into Teaching

      In this pdf, there are many helpful tips and techniques in creating a foundation for technology. The introduction of model strategies are laid out with lots of supporting detail and examples and weblinks. It includes nearly 400 pages of peer-reviewed lessons, models and various strategies.

      RATING: 5/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

  5. Oct 2018
    1. But resistance must also be offered against the totalitarianism of the WWW itself, which is not as easy to describe because it presents itself as liberal.
    1. Why do people troll? Eight factors are given, which might boil down to:

      • Perceived lack of consequences.
      • Online mob mentality.
    1. Do neural networks dream of semantics?

      Neural networks in visual analysis, linguistics Knowledge graph applications

      1. Data integration,
      2. Visualization
      3. Exploratory search
      4. Question answering

      Future goals: neuro-symbolic integration (symbolic reasoning and machine learning)

    1. React is fast, thanks to the VirtualDOM. Using a diffing algorithm, the browser DOM nodes are manipulated only when there is a state change. This algorithm is computationally expensive. Using webworkers to perform the calculations can make React even faster.
    1. The Online Disinhibition Effect (John Suler, 2004) - the lack of restraint shown by some people when communicating online rather than in person. (It can be good as well as bad. How can we reduce the bad behavior?)

      https://en.wikipedia.org/wiki/Online_disinhibition_effect http://truecenterpublishing.com/psycyber/disinhibit.html

    1. Video streaming service Netflix is the world's most data-hungry application, consuming 15% of global net traffic, according to research from bandwidth management company Sandvine.

      Ah, there's a new Sandvine report for 2018

    1. Intelligent agents: the vision revisited

      Memex, 1945 (for storing individual memories) License + societal norms + interoperability

    1. Learning Expressive Ontological Concept Descriptions via Neural Networks. Marco Rospocher, “The Road Less Traveled”: transforming a sentence into an axiom

      Building ontology from text: transforming a sentence into an axiom.

    1. We will solve large analytical problems by turning computer power loose on the hard data of the Semantic Web.

      The idea of turning something loose has the connotation that it is no longer under our control and can therefore have unpredictable outcomes. To a degree, no one can really predict what would happen if we reprogrammed the web in this new way.

  6. Sep 2018
    1. code for transforming Annotator JSON into Web Annotation's JSON-LD in the most minimal, unsmart way possible (read: it doesn't understand graphs).
    1. All of these platforms are different and they focus on different needs. And yet, the foundation is all the same: people subscribing to receive posts from other people. And so, they are all compatible. From within Mastodon, Pleroma, Misskey, PixelFed and PeerTube users can be followed and interacted with all the same.
    1. ActivityPub is a decentralized social networking protocol based on the ActivityStreams 2.0 data format. ActivityPub is an official W3C recommended standard published by the W3C Social Web Working Group. It provides a client to server API for creating, updating and deleting content, as well as a federated server to server API for delivering notifications and subscribing to content.
    1. A URI identifies a resource either by location, or a name, or both. A URI has two specializations known as URL and URN. A Uniform Resource Locator (URL) is a subset of the Uniform Resource Identifier (URI) that specifies where an identified resource is available and the mechanism for retrieving it.URL defines how the resource can be obtained. It does not have to be HTTP URL (http://), a URL can also be (ftp://) or (smb://) A Uniform Resource Name (URN) is a Uniform Resource Identifier (URI) that uses the URN scheme, and does not imply availability of the identified resource. Both URNs (names) and URLs (locators) are URIs, and a particular URI may be both a name and a locator at the same time.
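The URI/URL/URN distinction in the quote can be seen directly with standard-library URI parsing; a small sketch (the example URIs are my own):

```python
from urllib.parse import urlparse

# URLs carry a retrieval mechanism in their scheme (http://, ftp://, smb://);
# a URN names a resource without implying how, or whether, it can be fetched.
for uri in ("http://example.org/page", "ftp://example.org/file", "urn:isbn:0451450523"):
    parts = urlparse(uri)
    print(parts.scheme, parts.netloc or parts.path)
```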
    1. The WARC (Web ARChive) file format offers a convention for concatenating multiple resource records (data objects), each consisting of a set of simple text headers and an arbitrary data block into one long file. The WARC format is an extension of the ARC file format (ARC) that has traditionally been used to store “web crawls” as sequences of content blocks harvested from the World Wide Web. Each capture in an ARC file is preceded by a one-line header that very briefly describes the harvested content and its length. This is directly followed by the retrieval protocol response messages and content.
    1. A peer-to-peer hypermedia protocol to make the web faster, safer, and more open.
  7. Jul 2018
    1. In theory, Calibre-web will automatically create a new database. If it can't, or it throws an error, you need to create an empty library in the desktop version of Calibre on your computer, then copy the metadata.db database file from that library's directory into the /books/calibre directory; after that there should be no problem

      When deploying this myself, creating a new blank metadata.db did indeed fail and the DB could not be created, so you do need to create an empty db with the desktop version and copy it into the directory

    1. Spider Web Discussion is an adaptation of the Socratic seminar in that it puts students squarely in the center of the learning process, with the teacher as a silent observer and recorder of what s/he sees students saying and doing during the discussion. Her method is used when the teacher wants students to collaboratively discuss and make meaning of a particular learning concept

      Spider web discussions for collaborative learning

    1. Alan poses a question in his TEDx talk that we should ask students: “Do you know how to use Google?” Of greater importance, the same question should be asked of teachers.

      Video: Alan November TEDx talk "Do you know how to use google?" We need web literacy for teachers as well

    1. The Teaching Tolerance Digital Literacy Framework offers seven key areas in which students need support developing digital and civic literacy skills. The numbered items represent the overarching knowledge and skills that make up the framework. The bullets represent more granular examples of student behaviors to help educators evaluate mastery.

      Digital Literacy Framework of Points

    1. When it comes to democracy and human rights, a Jeffersonian internet is clearly a safer choice. With Web 3.0 still in its infancy, the West at least will need to find other ways to rein in the online giants. The obvious alternative is regulation.
  8. Jun 2018
    1. Text is not going away, but if we really want it to be understood and remembered, we should integrate it better with physical and emotional experience. This convergence may happen with the “physicalization” of the digital world, where digital experiences become part of our physical life.

      As in the movies, will the Web one day go beyond the digital and into the physical world?

  9. May 2018
    1. On the stage, a very adept and confident speaker jokingly mentioned a web-related joke that feels decades old (which she was sarcastically referring to as being decades old) and the whole room fell about laughing; it was the first time they had heard this reference. It was in that moment I realised the web industry had changed as we knew it. I looked at the other speakers and they too, had a similar look of realisation on their faces.
  10. Apr 2018
    1. "The problem: the automated web browsing tools they want to use (commonly called “web scrapers”) are prohibited by the targeted websites’ terms of service, and the CFAA has been interpreted by some courts as making violations of terms of service a crime."

    2. Good news for anyone who uses the Internet as a source of information: A district court in Washington, D.C. has ruled that using automated tools to access publicly available information on the open web is not a computer crime
    1. Pingback: Legality of Extracting Publicly Available User-Generated Content – PromptCloud Pingback: How to Scrape Facebook Posts for Free Content Ideas Pingback: Facebook data harvesting—what you need to know (From Phys.org) – Peter Schwartz

      important readings

    2. This is an extremely important case to remember. It has implications for all Fb users who want to own their past.

    1. Need proof? In Linkedin v. Doe Defendants, Linkedin is suing between 1-100 people who anonymously scraped their website. And for what reasons are they suing those people? Let's see: Violation of the Computer Fraud and Abuse Act (CFAA). Violation of California Penal Code. Violation of the Digital Millennium Copyright Act (DMCA). Breach of contract. Trespass. Misappropriation.

      Linkedin lawsuit -- terrifying

  11. Mar 2018
    1. We worked with the industry to launch the open-source Accelerated Mobile Pages Project to improve the mobile web

      There was some collaborative outreach, but AMP is really a Google-driven spec without significant outside input.

      See also: http://ampletter.org/

    2. With AMP Stories, which is now in beta, publishers can combine the speed of AMP with the rich, immersive storytelling of the open web.

      "With AMP Stories, which is now in beta, publishers can combine the speed of AMP with the rich, immersive storytelling of the open web."

      Is this sentence's structure explicitly saying that AMP is not "open web"?!

    1. avoid digital redlining,[26] creating inequities (however unintentionally) through the use of technology.

      So many challenges here, and we really must address all of them. I'm also interested in learning how to make sure my websites and other affordances I use are accessible to people with disabilities.

    1. Tim Berners-Lee offers some broad suggestions for improving the web.

      expand access to the world’s poorest through public access solutions, such as community networks and public WiFi initiatives.

      . . .

      A legal or regulatory framework that accounts for social objectives

      Because we can't count on Google, Facebook, etc. to act in the public interest on their own initiative.

      . . .

      Two myths currently limit our collective imagination: the myth that advertising is the only possible business model for online companies, and the myth that it’s too late to change the way platforms operate. On both points, we need to be a little more creative.

      . . .

      Let’s assemble the brightest minds from business, technology, government, civil society, the arts and academia to tackle the threats to the web’s future.

    1. German credit agency to mine Facebook

      What's interesting here is digital autonomy and the adoption of the independent web: it's not only about having some control over the data produced, but also about the third parties that can profit from these platforms.

    2. Examples of algorithms by function

      An interesting classification of website typologies, according to their functions.

  12. Feb 2018
    1. AIS Technolabs is an IT consulting company which provides IT services to the clients all over the globe. It is the motto of the company to work for their clients and enhance their trade success by means of their services. Our company has been established 5 years back. We have been offering our services for nearly more than a half decade and are totally aware of all types of areas in which IT solutions can be provided to our customers. We have been working with industries from diverse sectors of a different magnitude from the start-ups to the colossal organizations.

  13. Jan 2018
    1. or OR

      Suggestion: avoid possible confusion between the boolean operator OR and the example state code.

    1. Similarly, the Club is not aiming for visibility at any price; which can be seen in the fact that it does not make use of Facebook or many other capital oriented and data hungry infrastructures.

      In the case of HackBo, we have a presence on Facebook and Twitter, but it is not very active and has not been used strategically to draw visitors to our own infrastructures, which moreover have not matured properly and could hardly support other people migrating to them or to variants aligned with the IndieWeb.

  14. Dec 2017
    1. Similarly, the desire to communicate and collaborate and to coordinate activities within and beyond the Club’s boundaries through decentralized infrastructures was the driving force behind the hackers’ efforts to establish these networks
    1. Now reload the usual URL in your browser and repeat the above procedure by modifying the message to be printed in the browser console. As before you'll see that as soon as you save the core.cljs file the CLJS recompilation is triggered. This time, thanks to the boot-reload task, the page is reloaded as well. You can confirm this by seeing if the new message is printed in the browser's console.

      Remember to properly reload your browser. I first tried a normal F5 refresh. My browser didn't reload all the javascript files, which didn't give me the right websocket-port. Refreshing with Ctrl+Shift+R fixed this for me (Vivaldi 1.13).

    1. 6. It should be possible to further qualify a reference to a "sublocation" within an object (which would have meaning only to the server that houses it). This is needed, for example, for hypertext-type links. Such a sublocation might be the 25th paragraph of a text, for a hypertext-type pointer.
  15. Nov 2017
    1. Back in 1993, when Eric Bina and I were first building Mosaic, it seemed obvious to us that users would want to annotate all text on the web – our idea was that each web page would be a launchpad for insight and debate about its own contents. So we built a feature called "group annotations" right into the browser – and it worked great – all users could comment on any page and discussions quickly ensued. Unfortunately, our implementation at that time required a server to host all the annotations, and we didn't have the time to properly build that server, which would obviously have had to scale to enormous size. And so we dropped the entire feature.
    1. Controlled navigation (kontrollü navigasyon)

      This concept was first used in this article. It describes the viewing behavior for web content and for content shared on social media.

    1. basic Web 2.0 premises of aggregation, openness, tagging, portability, reuse, multichannel distribution, syndication, and user-as-contributor
    2. recent promise of Web 2.0

      A bit surprised by this “recent”. By that time, much of what has been lumped under the “Web 2.0” umbrella had already shifted a few times. In fact, the “Web 3.0” hype cycle was probably in the “Trough of Disillusionment” if not the Gartner-called “Slope of Enlightenment”.

    1. An institution has implemented a learning management system (LMS). The LMS contains a learning object repository (LOR) that in some aspects is populated by all users across the world  who use the same LMS.  Each user is able to align his/her learning objects to the academic standards appropriate to that jurisdiction. Using CASE 1.0, the LMS is able to present the same learning objects to users in other jurisdictions while displaying the academic standards alignment for the other jurisdictions (associations).

      Sounds like part of the problem Vitrine technologie-éducation has been tackling with Ceres, a Learning Object Repository with a Semantic core.

    1. Figure 4: Typical diurnal cycle for traffic in the Internet. The scale on the vertical axis is the percentage of total users of the service that are on-line at the time indicated on the horizontal axis. (Source: [21])

      I can't see an easy way to link to this graph itself, but this reference should make it easier to get to this image in future

    1. The original vision for the Web according to its creator, Tim Berners-Lee, was a space with multilateral publishing and consumption of information. It was a peer-to-peer vision with no dependency on a single party. Tim himself claims the Web is dying: the Web he wanted and the Web he got are no longer the same.
  16. Oct 2017
    1. What is this again? What Google Drive should be. What Dropbox should be. What file systems can be. The way we unify our data access across companies, services, programs, and people. The way I want to live and work.

      I think that this is interesting, but idealistic. The code repo on GitHub is quite active, but how does a technology like this gain traction?

    1. One must also be able to annotate links, as well as nodes, privately.

      Tim Berners-Lee calls for annotation in his original proposal for the web.

    1. using the style tag and writing the CSS inside it or by using the link tag to link to a style sheet. Either of these tags go in the head portion of your HTML. 

      How to include CSS in a page/site. Goes in the head:

      1. Use the <style> tag, or
      2. Use a <link> tag that points to a style sheet.
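      A minimal sketch of the two approaches described in the excerpt (the file name styles.css is invented for the example):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Option 1: link to an external style sheet (hypothetical file name) -->
    <link rel="stylesheet" href="styles.css">

    <!-- Option 2: write the CSS inside a style tag in the head -->
    <style>
      h1 { color: navy; }
    </style>
  </head>
  <body>
    <h1>Hello</h1>
  </body>
</html>
```

      In practice the external-sheet option is preferred for anything shared across pages, since one file can be linked from every page of a site.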
    2. One of those themes was reusability. You could describe a style once in CSS and reuse it across multiple elements or even multiple web pages. Another of those themes was maintainability. Being able to efficiently change your web page in response to changing design requirements.

      Why CSS

      1. Reusability
      2. Maintainability
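      A two-line illustration of the reusability theme: the rule is written once and applies to every matching element on every page that links the sheet (the class name is invented for the example):

```css
/* Written once; reused by every element with class="note"
   on every page that includes this style sheet. */
.note { color: #b00; font-weight: bold; }
```

      Maintainability follows from the same property: changing this one rule restyles every matching element at once.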
    1. And outside the classroom, meetings with public officials, nonprofits, and other community members, where students are given a chance to present their findings and recommendations on an issue they’ve researched

      Public annotation of government documents/websites, newspaper articles, etc.

  17. Sep 2017
    1. Signposting is an approach to make the scholarly web more friendly to machines. It uses Typed Links as a means to clarify patterns that occur repeatedly in scholarly portals. For resources of any media type, these typed links are provided in HTTP Link headers. For HTML resources, they are additionally provided in HTML link elements. Throughout this site, examples use the former approach.

      A kind of light-weight linked data approach to connecting web pages?
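      As the excerpt describes, for resources of any media type the typed links travel in an HTTP Link header. A sketch of what such a response might look like (the URLs are made up, and the relations shown are only some of those the Signposting pattern uses):

```http
HTTP/1.1 200 OK
Content-Type: text/html
Link: <https://orcid.org/0000-0002-1825-0097> ; rel="author",
      <https://doi.org/10.1234/example> ; rel="cite-as",
      <https://example.org/article.bib> ; rel="describedby" ; type="application/x-bibtex"
```

      A crawler can then follow these links without parsing the HTML at all, which is the sense in which this makes the scholarly web "more friendly to machines."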

  18. Aug 2017
    1. The Web We Need to Give Students

      The title itself points to the fact that the educational system has been trying to come up with many ways to help students manage their understanding of the web in general.

      some 170 bills proposed so far ...

      It's no surprise that the schools can share data with companies and researchers for their own benefit. Some of these actions are violations of privacy laws.

      arguments that restrictions on data might hinder research or the development of learning analytics or data-driven educational software.

      Unbelievable! It is shocking that students, or anyone, wanting their privacy is treated as a problem, while companies and businesses keep invading others' privacy. Wanting some privacy seems to be treated as a threat.

      Is it crazy that this reminds me of how the government wants to control the human minds?

      All the proof is there with telephone records, where the NSA breaches computers and cellphones of the public in order to see who they communicate with.

      In countries like Ethiopia, the government controls what the people view on their TV screens. It has complete control of the internet, and everything is vetted. Privacy laws have passed! Regardless, no one is safe. For example, hackers have gained access to celebrity iCloud accounts and exposed everything.

      The Domain of One’s Own initiative

      Does it really protect our identities?

      Tumblr?

      Virginia Woolf in 1929 famously demanded in A Room of One’s Own — the necessity of a personal place to write.

      Great analogy! People sometimes need a room of their own to clear their minds and focus their thoughts on paper, just as they need a space of their own to express themselves on the web.

      ... the Domains initiative provides students and faculty with their own Web domain.

      So, the schools are promising complete privacy?

      ...the domain and all its content are the student’s to take with them.

      Sounds good!

      Cyberinfrastructure

      To be able to be oneself is great. Most people feel as if their best selves are expressed online rather than in real-life face-to-face interactions.

      Tumblr is a great example. Each page is unique to ones own self. That is what Tumblr sells, your own domain.

      Digital Portfolio

      Everyone is different. Sounds exciting to see what my domain would look like.

      High school...

      Kids under 13 already have iPhones, iPads, tablets and laptops. They are very aware of the technology world at a very young age. This domain would most likely help them control what they showcase online before they grow older. Leaving a trail of good data would benefit them in the future.

      Digital citizenship:

      It teaches students and instructors how to use technology the right way.

      What is appropriate, and what is not appropriate?

      Seldom include students' input...

      Students already developed rich social lives.

      Google Docs = easy access to share one's work.

      Leaving data trails behind.

      Understanding options on changes made?

      Being educated on what your privacy options are on the internet is a good way of protecting your work.

      Student own their own domain- learning portfolio can travel with them.

      If the students started using this new domain earlier in their lives, there should be less problems in schools coming up with positive research when it comes to the growth of the students on their data usages.

      School district IT is not the right steward for student work: the student is.

      So, to my understanding, while the student is at the school, they have to remember to move the files saved in the domain. The school is not responsible for any lost data, because the student is responsible for all their work.

      Much better position to control their work...

      If all of this is true and valid, then it should not be a big deal for the student to post whatever they want on their domain. No matter how extreme and excessive it seems, if that is how they view themselves, their domain would be as unique as their personalities.

  19. Jul 2017
    1. The Internet is this generation’s defining technology for literacy and learning within our global community. 2. The Internet and related technologies require new literacies to fully access their potential.

      Completely agree with this statement!

    2. The new literacies of online research and comprehension frames online reading comprehension as a process of problem-based inquiry involving the skills, strategies, dispositions, and social practices that take place as we use the Internet to conduct research, solve problems, and answer ques-tions.

      This is an essential part of PBL: internet research is the essential skill students need to obtain information and analyze their findings.

    3. How can we develop adequate understanding when the very object that we seek to study continuously changes?

      This can be seen as a problem or as an advantage: information is always changing, and ideas are constantly being created and developed. Students and teachers do not need to wait for books to be printed for materials to be accessible; it is right there, one click away. Knowing how to find and analyze information on the web is the skill our students need to become web literate.

    4. Consider, for example, just a few of these new technologies: Twitter, Facebook, Google+, Siri, Foursquare, Drop-box, Skype, Chrome, iMovie, Contribute, or any of many, many mobile “apps” and ebooks

      Students using these sites need to have web-literacy skills to obtain accurate and relevant information.

    1. It is the responsibility of educators in all grades and content areas to modify as needed for learners.

      Educators guiding these students should have the necessary skills to effectively adjust the route of the inquiry. Some may argue that pre-K students are too young for these projects, but with the right guidance even little ones can benefit from them.

  20. www.literacyandtechnology.org
    1. TPACK What knowledge do teachers need in order to facilitate student research? Understanding complex relationships among technology, pedagogy, and content with models like the TPACK framework may facilitate teacher growth in new literacies

      TPACK and web literacy have been shown to help students critically evaluate, organize, and synthesize information effectively.

    1. when students share what they have learned not only about the information they found, but the sources and strategies they used to uncover that information.

      Higher-level thinking! These skills will help students become leaders rather than merely recall information. At the end of the day, everyone can search and find information at any time, but can they find the "right" information?

    2. 1

      When directing students to Google searches, it is important to guide them to "get their web-literacy hat on": this means using their reading strategies to ensure the information is valid, important, and related to the search.

    1. Wikipedia is broadly misunderstood by faculty and students alike. While Wikipedia must be approached with caution, especially with articles that are covering contentious subjects or evolving events, it is often the best source to get a consensus viewpoint on a subject. Because the Wikipedia community has strict rules about sourcing facts to reliable sources, and because authors must adopt a neutral point of view, articles are often the best available introduction to a subject on the web.

      using Wikipedia as a source of information

    1. The habit is simple. When you feel strong emotion — happiness, anger, pride, vindication — and that emotion pushes you to share a “fact” with others, STOP. Above all, it’s these things that you must fact-check. Why? Because you’re already likely to check things you know are important to get right, and you’re predisposed to analyze things that put you an intellectual frame of mind. But things that make you angry or overjoyed, well… our record as humans are not good with these things. As an example, we might cite this tweet which recently crossed my Twitter feed: You don’t need to know that much of the background here to see the emotionally charged nature of this. President Trump had insulted Chuck Schumer, a Democratic Senator from New York, saying tears that Schumer shed during a statement about refugees were “fake tears”.  This tweet reminds us that that Senator Schumer’s great grandmother died at the hands of the Nazis, which could explain Schumer’s emotional connection to the issue of refugees. Or does it? Do we actually know that Schumer’s great-grandmother died at the hands of the Nazis? And if we are not sure this is true, should we really be retweeting it?

      Example of the importance of fact-checking: how to spot lies built on a truthful story.

    1. Check for previous work: Look around to see if someone else has already fact-checked the claim or provided a synthesis of research. Go upstream to the source: Go “upstream” to the source of the claim. Most web content is not original. Get to the original source to understand the trustworthiness of the information. Read laterally: Read laterally.[1] Once you get to the source of a claim, read what other people say about the source (publication, author, etc.). The truth is in the network. Circle back: If you get lost, or hit dead ends, or find yourself going down an increasingly confusing rabbit hole, back up and start over knowing what you know now. You’re likely to take a more informed path with different search terms and better decisions.

      Some ideas for fact-checking claims on the web.

    1. 2. Staying with a closed, proprietary system & not moving to the adoption of open standards.

      I concur. Diigo could have been a leader in the social annotation space, way ahead of Hypothesis. But now I think H is gaining more momentum than Diigo, because it adheres to open standards.

    1. generate fake FCC filings, or advance their big government agenda.

      Most evidence I've seen online indicates that there's been a fair amount of fake filings from everyone, with the majority of spam likely coming from the "against" side.

      This is (one of the reasons) why it's better to do controlled studies rather than asking people to voluntarily submit their own opinions. Most of the studies I have seen suggest that both Republicans and Democrats broadly support a data agnostic Internet.

    2. Under these regulations, government bureaucrats can decide what websites they can prioritize or punish and what broadband infrastructure investments are worth.

      That is quite literally the opposite of what Network Neutrality does. A common carrier, by definition, does not prioritize or punish any content.

      Net Neutrality advocates want the exact same thing you do - an Internet where no one, even the government, can arbitrarily decide that one website or service gets an artificial competitive advantage over another.

    1. When he read the Web address, http://pubweb.northwestern.edu/~abutz/di/intro.html, he assumed that the domain name “northwestern.edu” automatically meant it was a credible source. He did not understand that the “~” character, inserted after the domain name, should be read as a personal Web page and not an official document of the university.

      Even though I consider myself web literate enough to tell the difference between a personal and academic page, I honestly didn't know that the "~" denoted that. I really need to get better about thinking of web addresses and code as a language (which they are).
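      The "~" convention mentioned here comes from servers that map /~username paths to an individual user's directory, so a tilde segment is a useful (though not foolproof) hint that a page is personal rather than institutional. A minimal sketch of that heuristic (the function name is my own invention):

```python
from urllib.parse import urlparse

def looks_like_personal_page(url: str) -> bool:
    """Heuristic: a path segment starting with '~' usually maps to an
    individual user's home directory (e.g. Apache's per-user directories),
    not an official institutional page."""
    path = urlparse(url).path
    return any(seg.startswith("~") for seg in path.split("/") if seg)

print(looks_like_personal_page("http://pubweb.northwestern.edu/~abutz/di/intro.html"))  # True
print(looks_like_personal_page("http://www.northwestern.edu/about/index.html"))         # False
```

      Of course this is only a hint: plenty of personal pages have no tilde at all, which is exactly why reading URLs as a language matters.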

    1. An open letter from Tim Wu to Tim Berners-Lee, urging caution regarding a proposed DRM standard for the Web (Encrypted Media Extensions), and the possible abuse of anti-circumvention laws.

    1. “The only way to save a democracy is to explain the way things work,” says Linus Neumann, a CCC spokesman and information security consultant. “Understanding things is a good immunization.”

      democracy and web literacy

  21. Jun 2017
    1. The whole point of the newly-minted web annotation standard is to enable an ecosystem of interoperable annotation clients and servers, analogous to comparable ecosystems of email and web clients and servers.

      One of the ideas I'm struggling with here: is web annotation just about research, or about advancing conversation on the web? I sense this is part of decentralization too (thus, an ecosystem), but where does it fit?

    1. Quoting Media Theorist & philosopher Wolfgang Ernst on his concept of processual memory: “The web provides immediate feedback, turning all present data into archival entries and archival entries into data – a dynamic agency, with no delay between memory and the present. Archive and memory become metaphorical; a function of transfer processes.”, which Ernst describes as an economy of circulation – permanent transformations and updating. There are no places of memory, Ernst states, there are simply urls. In other words; digital memory is built from its architecture, it is embedded in the network and constituted from how it links from one to another.

      there are no places of memory, there are simply urls.

    1. You might think it’s hyperbole for Winer to say that Facebook is trying to kill the open web. But they are. I complain about Google AMP, but AMP is just a dangerous step toward a Google-owned walled garden — Facebook is designed from the ground up as an all-out attack on the open web.

  22. May 2017
    1. The web was supposed to open up higher ed. In a model like Antigonish 2.0, higher ed may be the lever needed to reopen the web to its participatory, democratic potential.
    1. Postmodernism requires human cognitive mapping; digital media require the orienting capacities of the human sensorimotor body

      What if we think of this in the way that a Wikipedia article is built, the web-like structure of hyperlinks...

  23. Apr 2017