130 Matching Annotations
  1. Jan 2021
    1. Prior to the adoption of the Cross-Origin Resource Sharing (CORS) standard, JSONP was the only option to get a JSON response from a server of a different origin.
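
      A minimal sketch of the JSONP technique the quote refers to, for contrast with CORS. The endpoint URL and callback-naming scheme here are illustrative assumptions, not taken from the annotated page:

      ```ts
      // JSONP: the cross-origin server wraps its JSON payload in a call to a callback
      // whose name we pass in the query string. Because <script> tags are not subject
      // to the same-origin policy, this worked across origins before CORS existed.
      function fetchJsonp<T>(url: string, callbackName = `cb_${Date.now()}`): Promise<T> {
        return new Promise<T>((resolve, reject) => {
          const script = document.createElement("script");
          (window as any)[callbackName] = (data: T) => {
            delete (window as any)[callbackName];
            script.remove();
            resolve(data);
          };
          script.src = `${url}${url.includes("?") ? "&" : "?"}callback=${callbackName}`;
          script.onerror = () => reject(new Error("JSONP request failed"));
          document.head.appendChild(script);
        });
      }

      // Hypothetical usage: the server must respond with `cb_123({...})`.
      // fetchJsonp<{ time: string }>("https://api.example.com/time").then(console.log);
      ```
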
  2. Dec 2020
    1. By understanding the definition of cloud computing, you must have guessed why healthcare institutes would be interested in upgrading their data management systems with this trending technology.

      Cloud computing in healthcare is bringing many big changes to the medical industry. Make sure you don't fall behind on this revolution.

  3. Nov 2020
    1. Portable... your .name address works with any email or web service. With our automatic forwarding service on third level domains, you can change email accounts, your ISP, or your job without changing your email address. Any mail sent to your .name address arrives in any email box you choose.
    1. In July 2010, Microsoft let go Jimmy Schementi, one of two remaining members of the IronRuby core team, and stopped funding the project.[19][20] In October 2010 Microsoft announced the Iron projects (IronRuby and IronPython) were being changed to "external" projects and enabling "community members to make contributions without Microsoft's involvement or sponsorship by a Microsoft employee".
    1. will only apply up the chain

      Should this "up the chain" be "down the chain"?

      In terms of a tree, I think of the caller/consumer/thing that imports this file as "up" and the things that I call/import as "down".

      That is more consistent with a tree, but not with a stack trace (or any stack), I suppose, which has the most recently called thing at the top ("up") and the consumer of that at the bottom ("down").

    1. React didn't support rendering arrays without a wrapper for most of its existence. From the 16.0.0 changelog (September 26, 2017): "Components can now return arrays and strings from render."
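
      A tiny illustration of the change described in that 16.0.0 note; the component and keys are invented:

      ```tsx
      import * as React from "react";

      // Since React 16, a component may return an array of elements (each needs a key)
      // or a plain string; earlier versions required a single wrapping element.
      const Columns = () => [
        <td key="name">Name</td>,
        <td key="email">Email</td>,
      ];

      const HeaderRow = () => (
        <tr>
          <Columns />
        </tr>
      );
      ```
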
  4. Oct 2020
    1. Final Form makes the assumption that your validation functions are "pure" or "idempotent", i.e. will always return the same result when given the same values.
    2. Final Form makes the assumption that your validation functions are "pure" or "idempotent", i.e. will always return the same result when given the same values. This is why it doesn't run the synchronous validation again (just to double check) before allowing the submission: because it's already stored the results of the last time it ran it.
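
      A hedged sketch of what a "pure" record-level validation function looks like under that assumption (values in, errors out, no side effects); the field names are invented:

      ```ts
      type Values = { email?: string; age?: number };
      type Errors = Partial<Record<keyof Values, string>>;

      // Pure/idempotent: the same values always produce the same errors, so a cached
      // result from the last run is safe to reuse at submission time.
      const validate = (values: Values): Errors => {
        const errors: Errors = {};
        if (!values.email) errors.email = "Required";
        if (values.age !== undefined && values.age < 18) errors.age = "Must be 18 or older";
        return errors;
      };

      // Breaks the assumption: the result depends on when it runs, so a cached
      // validation result could silently go stale.
      const badValidate = (_values: Values): Errors =>
        new Date().getHours() < 9 ? { email: "Come back after 9am" } : {};
      ```
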
    1. In agent-oriented programming the antonym is depender, though in general usage the common term dependent is used instead. There is no common language equivalent for dependee, however – other metaphors are used instead, such as parent/child. The circumlocutions “A depends on B” and “B is depended on by A” are much more common in general use than “A is the depender, B is the dependee”.
    1. I think it is still problematic since many people in the software industry use and understand "dependency" to mean the thing on which something depends (as indicated by this and other answers). So saying "being a dependency" indicates to those people the thing on which something depends, which is the opposite of the way I think of it (and what it means according to the dictionary).
    1. There may be times that required owned elements are missing, for example, while editing or while loading a data set. When a widget is missing required owned elements due to script execution or loading, authors MUST mark a containing element with aria-busy equal to true. For example, until a page is fully initialized and complete, an author could mark the document element as busy.

      "busy" here seems to = "loading" in most other programming contexts

    1. Longstanding controversy surrounds the meaning of the term "hacker". In this controversy, computer programmers reclaim the term hacker, arguing that it refers simply to someone with an advanced understanding of computers and computer networks[5] and that cracker is the more appropriate term for those who break into computers, whether computer criminals (black hats) or computer security experts (white hats).
    1. In the React 0.12 time frame we did a bunch of small changes to how key, ref and defaultProps work. Particularly, they get resolved early on in the React.createElement(...) call. This made sense when everything was classes, but since then, we've introduced function components. Hooks have also made function components more prevalent. It might be time to reevaluate some of those designs to simplify things (at least for function components).
    1. Node doesn't have a DOM available. So in order to render HTML we use string concatenation instead. This has the fun benefit of being quite efficient, which in turn means it's great for server rendering!
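
      A minimal sketch of DOM-less rendering by string concatenation, using Node's built-in http module; the Item shape and port are arbitrary:

      ```ts
      import { createServer } from "http";

      type Item = { id: number; title: string };

      // No DOM on the server: escape and concatenate strings instead.
      const escapeHtml = (s: string) =>
        s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

      const renderList = (items: Item[]): string =>
        "<ul>" + items.map(i => `<li>${escapeHtml(i.title)}</li>`).join("") + "</ul>";

      createServer((_req, res) => {
        const items: Item[] = [{ id: 1, title: "Hello <world>" }];
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end(`<!doctype html><html><body>${renderList(items)}</body></html>`);
      }).listen(3000);
      ```
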
    1. Facebook’s React has an optional language extension that enables you to embed HTML inside JavaScript. This extension can make your code more concise, but it also breaks compatibility with the rest of the JavaScript ecosystem. ECMAScript 6 will have template strings [1], which enable you to implement JSX (or something close to it) inside the language.
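
      A small sketch of the template-string direction the author points at; this is plain interpolation rather than a full JSX replacement, and the Greeting function is invented:

      ```ts
      // ES2015 template literals give markup-like interpolation without a compile
      // step; a tagged template could layer on escaping, nesting helpers, etc.
      const Greeting = (name: string): string => `
        <div class="greeting">
          <h1>Hello, ${name}!</h1>
        </div>
      `;

      console.log(Greeting("Ada"));
      ```
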
    1. But is overhead always bad? I believe no — otherwise the Svelte maintainers would have to write their compiler in Rust or C, because the garbage collector is the single biggest overhead of JavaScript.
    1. I don't understand the need for the name "Open–closed principle". It doesn't seem meaningful or clear to me.

      Can't we just call it "extensibility" or "easily extendable"? Doesn't "extensibility" already imply that we are extending it (adding new code on top of it, to interoperate with it) rather than modifying its source code?

    1. State changes flow from the roots of this graph (which we call atoms) through pure functions (which we call selectors) and into components.
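
      A short sketch of that flow using what I understand to be Recoil's atom/selector API; the temperature example is invented:

      ```tsx
      import * as React from "react";
      import { atom, selector, useRecoilValue } from "recoil";

      // Root of the graph: an atom holding raw state.
      const celsiusState = atom<number>({ key: "celsius", default: 20 });

      // A pure function over the graph: a selector derived from the atom.
      const fahrenheitState = selector<number>({
        key: "fahrenheit",
        get: ({ get }) => (get(celsiusState) * 9) / 5 + 32,
      });

      // A component sits at the end of the flow and re-renders when the value changes.
      const Thermometer = () => {
        const fahrenheit = useRecoilValue(fahrenheitState);
        return <span>{fahrenheit}°F</span>;
      };
      ```
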
    1. The misspelling of referrer originated in the original proposal by computer scientist Phillip Hallam-Baker to incorporate the field into the HTTP specification.[4] The misspelling was set in stone by the time of its incorporation into the Request for Comments standards document RFC 1945; document co-author Roy Fielding has remarked that neither "referrer" nor the misspelling "referer" were recognized by the standard Unix spell checker of the period.
  5. Sep 2020
    1. detach, as an api, should be declarative (ensure the node is detached) instead of imperative (detach the node), allowing it to be called multiple times by performing a noop if the node is already detached. This way, it won't matter if the node is removed from the DOM from outside of svelte.
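
      A minimal sketch of a declarative, idempotent detach in that spirit (not Svelte's actual internal code):

      ```ts
      // "Ensure the node is detached": a no-op if it already is, so repeated calls
      // are safe even when something outside the framework removed the node first.
      function detach(node: Node): void {
        if (node.parentNode) {
          node.parentNode.removeChild(node);
        }
      }

      const el = document.createElement("div");
      document.body.appendChild(el);
      detach(el);
      detach(el); // no-op, no error
      ```
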
    1. If you're using webpack with svelte-loader, make sure that you add "svelte" to resolve.mainFields in your webpack config. This ensures that webpack imports the uncompiled component (src/index.html) rather than the compiled version (index.mjs) — this is more efficient.
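
      A hedged webpack config sketch reflecting that advice; the extensions list and loader rule are assumptions beyond the quoted text:

      ```ts
      // webpack.config.ts
      import type { Configuration } from "webpack";

      const config: Configuration = {
        resolve: {
          // Putting "svelte" first makes webpack prefer a package's uncompiled
          // source entry over its precompiled one.
          mainFields: ["svelte", "browser", "module", "main"],
          extensions: [".mjs", ".js", ".ts", ".svelte"],
        },
        module: {
          rules: [{ test: /\.svelte$/, use: "svelte-loader" }],
        },
      };

      export default config;
      ```
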
    1. Why do we use bundlers again? Historically, bundlers have been used in order to support CommonJS files in the browser, by concatenating them all into a single file. Bundlers detected usages of require() and module.exports and wrapped them all with a lightweight CommonJS runtime. Other benefits included allowing you to serve your app as a single file, rather than having the user download several scripts, which can be more time-consuming.
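
      A toy sketch of the lightweight CommonJS runtime described above; the module ids and contents are invented:

      ```ts
      // Every module becomes a function; a tiny require() looks modules up by id and
      // caches their exports after the first call, roughly what a bundler emits.
      type ModuleFn = (require: (id: string) => any, module: { exports: any }) => void;

      const modules: Record<string, ModuleFn> = {
        "./math": (_require, module) => {
          module.exports = { double: (n: number) => n * 2 };
        },
        "./main": (require, _module) => {
          const { double } = require("./math");
          console.log(double(21)); // 42
        },
      };

      const cache: Record<string, { exports: any }> = {};

      function requireModule(id: string): any {
        if (!cache[id]) {
          cache[id] = { exports: {} };
          modules[id](requireModule, cache[id]);
        }
        return cache[id].exports;
      }

      requireModule("./main");
      ```
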
    1. The appeal of social networks is partly because they let us create documents without thinking about web technology,

      mirrors strongly another comment I made, that our appetites & expectations for computing have outstripped the personal, that we now expect computing to be connective. We want the digital matter we create to exist not just locally, but widely. https://hypothes.is/a/11-k1v7pEeqJ1qdf5kJahQ

    1. you may specify only the form state that you care about for rendering your gorgeous UI. You can think of it a little like GraphQL's feature of only fetching the data your component needs to render, and nothing else.
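
      A hedged react-final-form sketch of subscribing only to the slices of form state a component needs; the fields are invented:

      ```tsx
      import * as React from "react";
      import { Form, Field } from "react-final-form";

      const SignupForm = () => (
        <Form
          onSubmit={values => console.log(values)}
          // Only these slices trigger re-renders, a little like a GraphQL query
          // selecting just the fields a component renders.
          subscription={{ submitting: true, pristine: true }}
          render={({ handleSubmit, submitting, pristine }) => (
            <form onSubmit={handleSubmit}>
              <Field name="email" component="input" type="email" />
              <button type="submit" disabled={submitting || pristine}>
                Sign up
              </button>
            </form>
          )}
        />
      );
      ```
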
    1. It was called a "virtual DOM" library because it didn't start out as isomorphic, but actually tied to the DOM from the start. It was an afterthought to make it isomorphic.
  6. Aug 2020
    1. Let us quickly travel back in time to 2016. SWOOSH! We are there. JavaScript landscape looks like this: If you are using a JavaScript framework or want to use a framework, Angular.js is probably something you would choose. But, the news about Angular 2 that will make you rewrite almost everything is just around the corner. Also, this new kid on the block - React.js is coming up and getting ripe. Of course, Vanilla JS and no-framework-folks are there. Not using a framework is still a popular opinion in 2016, but is slowly fading.
  7. Jul 2020
    1. ruby-prof supports excluding specific methods and threads from profiling results. This is useful for reducing connectivity in the call graph, making it easier to identify the source of performance problems when using a graph printer. For example, consider Integer#times: it's hardly ever useful to know how much time is spent in the method itself. We are more interested in how much the passed-in block contributes to the time spent in the method which contains the Integer#times call. The effect on collected metrics is identical to eliminating methods from the profiling result in a post-process step.
    2. ruby-prof provides two options to specify which threads should be profiled: exclude_threads:: Array of threads which should not be profiled. include_threads:: Array of threads which should be profiled. All other threads will be ignored.
  8. Jun 2020
    1. If those comments are loaded outside of the blog_post association, then attempting to reference the blog_post association from within each comment will result in N blog_posts table queries even if they all belong to the same BlogPost!
    1. In systems engineering and requirements engineering, a non-functional requirement (NFR) is a requirement that specifies criteria that can be used to judge the operation of a system, rather than specific behaviors. They are contrasted with functional requirements that define specific behavior or functions

      This is a strange term because one might read "non-functional" and interpret in the sense of the word that means "does not function", when instead the intended sense is "not related to function". Seems like a somewhat unfortunate name for this concept. A less ambiguous term could have been picked instead, but I don't know what that would be.

  9. May 2020
    1. Also known as "serverless", "client-side", or "static" web apps, unhosted web apps do not send your user data to their server. Either you connect your own server at runtime, or your data stays within the browser.

      serverless has another meaning (that does actually use a server) so I prefer the term "unhosted" since it has no such ambiguity.

      See also:

    1. The Journal was a primitive hypertext-based groupware program, which can be seen as a predecessor (if not the direct ancestor) of all contemporary server software that supports collaborative document creation (like wikis). It was used by ARC members to discuss, debate, and refine concepts in the same way that wikis are being used today.
    1. quantum blockchain

      Do they really use a quantum blockchain? What exactly do they mean by that? Probably just a buzzword they're using to attract interest but aren't actually meaning literally.

    1. A quantum blockchain, the pair suggests, would take advantage of entanglement, which in most cases, applies to situations regarding space. But it could also be useful for situations involving time, such as blockchains. In such a blockchain, the pair explains, transaction records could be represented by pairs of entangled photons linked in chronological order. When transfers take place, photons would be created and absorbed by the hubs that comprise a network. But since entangled photons are linked across time, they can be caused to have never existed at the same time.
    1. The folks at Netlify created Netlify CMS to fill a gap in the static site generation pipeline. There were some great proprietary headless CMS options, but no real contenders that were open source and extensible—that could turn into a community-built ecosystem like WordPress or Drupal. For that reason, Netlify CMS is made to be community-driven, and has never been locked to the Netlify platform (despite the name).

      Kind of an unfortunate name...

  10. Apr 2020
    1. Just a subtle clarification here. Safe means no side-effects. Idempotent means the same side effect no matter how many time a service is called. All safe services are inherently idempotent because there are no side effects. Calling GET on a current-time resource multiple times would return a different result each time, but it's safe (and thus idempotent).
    1. In math, idempotence describes only unary functions that you can call on their own output. Math-idempotence is, “If you take the absolute value of a number, and then you take the absolute value of that, the result doesn’t change on the second (or subsequent) operations.” Math.abs is math-idempotent. Math-idempotence only applies to functions of one parameter where the parameter type and return type are the same. Not so useful in programming.
    2. Programming-idempotence is about side effects. It’s about stuff that happens to the outside world when you call a function. Idempotence says “If you’ve called me once, it doesn’t matter whether you called me again.”
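
      A small TypeScript restatement of the two senses of idempotence in these excerpts; the lights example is invented:

      ```ts
      // Math-idempotence: f(f(x)) === f(x) for a unary function whose input and
      // output types match.
      Math.abs(Math.abs(-5)) === Math.abs(-5); // true

      // Programming-idempotence is about side effects: one call or many calls
      // leave the outside world in the same state.
      const lights = { on: false };
      const turnOn = () => { lights.on = true; };        // idempotent: extra calls change nothing
      const toggle = () => { lights.on = !lights.on; };  // not idempotent: call count matters

      turnOn(); turnOn();  // lights.on is true either way
      toggle(); toggle();  // back where we started
      ```
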
    1. I am increasingly concerned when I hear my colleagues refer to themselves with computer metaphors—“I don’t have the bandwidth,” “I have to boot up,” or “I need to recharge.”
    1. In the early 1990s, the creators of Netscape apparently built a function that enabled each web page to be annotated by those visiting it, as a way for viewers to discuss the page’s content. But according to a document produced in 2013 by a nonprofit called Hypothesis, the feature was turned off.
    1. for-profit tech companies — most notably Google, Apple, Facebook, and Amazon (GAFA) — built software and services that rapidly outpaced the capabilities of open protocols
    2. Huge web properties were started during this era including Yahoo, Google, Amazon, Facebook, LinkedIn, and YouTube. In the process, the importance of centralized platforms like AOL greatly diminished.
  11. Feb 2020
  12. Jan 2020
    1. = (α∣0⟩ + β∣1⟩)(γ∣0⟩ + δ∣1⟩) = αγ∣00⟩ + αδ∣01⟩ + βγ∣10⟩ + βδ∣11⟩.

      Might be the answer to an above inquiry.

    2. we apply a Hadamard gate

      What is the method to evaluate whether the output of a Hadamard gate should invert the bottom qubit or not?

      is (0 + 1) / sqrt 2 high or low?

      I'm missing something fundamental here.
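
      For reference while puzzling over this, the standard statement of the single-qubit Hadamard gate and its action (textbook definition, not quoted from the essay):

      ```latex
      H = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix},
      \qquad
      H|0\rangle = \frac{|0\rangle + |1\rangle}{\sqrt{2}},
      \qquad
      H|1\rangle = \frac{|0\rangle - |1\rangle}{\sqrt{2}}.
      % So (|0> + |1>)/sqrt(2) is neither "high" nor "low": it is an equal
      % superposition, and measuring it gives 0 or 1 with probability 1/2 each.
      ```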

    3. equal

      Frustrating wording here for me... Why is the word "equal" here at all? It doesn't seem to clarify anything.

    4. the

      For Computer Scientists, Microsoft put together a primer to Quantum Computing for us here: https://www.youtube.com/watch?v=F_Riqjdh2oM

      I could understand some of it (through 40m), but think this series of articles will help immensely and I'll return to it after.

    5. What does it mean for a matrix U to be unitary? It’s easiest to answer this question algebraically, where it simply means that U†U = I, that is, the adjoint of U, denoted U†, times U, is equal to the identity matrix. That adjoint is, recall, the complex transpose of U:

      Starting to get a little bit more into linear algebra / complex numbers. I'd like to see this happen more gradually as I haven't used any of this since college.
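
      As a small worked reference (standard linear algebra, not quoted from the essay), here is the unitarity condition checked for the Hadamard matrix:

      ```latex
      U^\dagger = (U^T)^*, \qquad U \text{ is unitary} \iff U^\dagger U = I.

      % The Hadamard matrix is real and symmetric, so H^\dagger = H, and
      H^\dagger H
        = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}
          \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}
        = \frac{1}{2}\begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = I.
      ```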

  13. Dec 2019
  14. Nov 2019
    1. The language for writing React. Reason's creator also created ReactJS, whose first prototypes were written in SML, a distant cousin of OCaml. We've transcribed ReactML into ReactJS for wide adoption. A few years later, we're now iterating on the future of ReactJS through ReasonReact.
    1. the main reason we built a new multiprocess architecture is that Chromium's multiprocess support was never contributed to the WebKit project. It has always lived in the separate Chromium tree, making it pretty hard to use for non-Chrome purposes. Before we wrote a single line of what would become WebKit2 we directly asked Google folks if they would be willing to contribute their multiprocess support back to WebKit, so that we could build on it. They said no.
  15. Oct 2019
    1. I'd say that "dump" in the CS sense, both as noun and verb, is merely another application of its preexisting meanings even without the vulgar one, particularly the ones related to unloading/releasing contents. (For example, "dump truck".)
    2. For some geeky reason, the computer programming world has long maintained a tradition of using words in new ways, with a studied obliviousness to their prior, rude meanings: for example, 'dump'. 'Falsey' is merely another word in this long, and quite useful, tradition.
    1. the CMfg paradigm and concept provides a collaborative network environment (the Cloud) where users can select the suitable manufacturing services from the Cloud and dynamically assemble them into a virtual manufacturing solution to execute a selected manufacturing task

      Cloud Computing in SCM

    1. Doing something programmatically generally means that you can do it using source code, rather than via direct user interaction or a macro.
    2. The reason SO users explicitly say "programmatically" is to reaffirm that they're asking "programming code" questions and not "IT-style" questions
  16. Sep 2019
    1. Possibly place "under construction" text at the top, and refer to the button at the top right for the option to leave feedback.

  17. Jul 2019
  18. May 2019
  19. Apr 2019
  20. Feb 2019
    1. He thought that networked digital computing could release and channel neural power in the same way that physics had released and channeled nuclear power, but to far more beneficial effect.

      This is a very powerful idea.

  21. Dec 2018
  22. Nov 2018
    1. Holographic computing made possible

      Microsoft HoloLens is designed to enable a new dimension of future productivity with the introduction of this self-contained holographic tool. The tool allows for engagement with holograms in the world around you.

      Learning environments will gain ground as this future tool is implemented in learning programs and models.

      RATING: 5/5 (rating based upon a score system of 1 to 5, 1 = lowest, 5 = highest, in terms of content, veracity, ease of use, etc.)

  23. Oct 2018
    1. One is the linked list of lines you mention. I believe this is intended to solve a display problem that TECO (the original language in which Emacs was implemented) had solved differently using the "gap" data structure. The fundamental issue was that if you have a buffer represented as a single block of contiguous text, then insertion on a character-by-character basis can be O(n²): each time you insert a character, you have to copy the entire subsequent buffer over one space.

      implementation, performance of text entry
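
      A toy TypeScript gap buffer, sketching the structure TECO used so per-character insertion avoids shifting the whole tail of the buffer on every keystroke (names and sizes are arbitrary):

      ```ts
      class GapBuffer {
        private buf: string[];
        private gapStart: number;
        private gapEnd: number; // exclusive

        constructor(capacity = 16) {
          this.buf = new Array<string>(capacity).fill("");
          this.gapStart = 0;
          this.gapEnd = capacity;
        }

        // O(1) amortized: write into the gap instead of shifting the tail.
        insert(ch: string): void {
          if (this.gapStart === this.gapEnd) this.grow();
          this.buf[this.gapStart++] = ch;
        }

        // Slide the gap to `pos`; only the characters between the old and new
        // cursor positions move, not the whole buffer.
        moveCursor(pos: number): void {
          while (pos < this.gapStart) this.buf[--this.gapEnd] = this.buf[--this.gapStart];
          while (pos > this.gapStart) this.buf[this.gapStart++] = this.buf[this.gapEnd++];
        }

        toString(): string {
          return this.buf.slice(0, this.gapStart).join("") + this.buf.slice(this.gapEnd).join("");
        }

        private grow(): void {
          const tail = this.buf.slice(this.gapEnd);
          this.buf = this.buf
            .slice(0, this.gapStart)
            .concat(new Array<string>(this.buf.length).fill(""), tail);
          this.gapEnd = this.buf.length - tail.length;
        }
      }

      const b = new GapBuffer();
      for (const ch of "hello") b.insert(ch);
      b.moveCursor(0);
      b.insert(">");
      console.log(b.toString()); // ">hello"
      ```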

    2. Lisp macros were also useful for the definition of new control structures, as well as new data structures. In ZWEI, we created a new iterative control structure called charmap, which iterates over characters in an interval. Intervals are stored as doubly-linked lists of arrays, and the starting point might be in the middle of one array and the ending point might be in the middle of another array. The code to perform this iteration was not trivial, and someone reading it might easily not understand the function it was performing, even though that function was the conceptually simple one of iterating over characters. So we created a macro called charmap that expands into the double-loop code to iterate over the characters. It is simple and obvious, and is used in many places, greatly reducing the size of the code and making the functionality obvious at a glance.

      use of macros implementing data structures making things more readable!
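
      Not a Lisp macro, but a TypeScript sketch of the same shape: a generator that hides the double loop over an interval stored as a linked list of character arrays (the Chunk/Point shapes are invented):

      ```ts
      interface Chunk { chars: string[]; next: Chunk | null }
      interface Point { chunk: Chunk; index: number }

      // Yield every character from `start` to `end`, even when the endpoints fall in
      // the middle of different arrays; callers just see a flat stream of characters.
      function* charmap(start: Point, end: Point): Generator<string> {
        let chunk: Chunk | null = start.chunk;
        let i = start.index;
        while (chunk) {
          const stop = chunk === end.chunk ? end.index : chunk.chars.length;
          for (; i < stop; i++) yield chunk.chars[i];
          if (chunk === end.chunk) return;
          chunk = chunk.next;
          i = 0;
        }
      }

      const second: Chunk = { chars: [..."world"], next: null };
      const first: Chunk = { chars: [..."hello "], next: second };
      console.log([...charmap({ chunk: first, index: 2 }, { chunk: second, index: 3 })].join("")); // "llo wor"
      ```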

    3. It became policy to avoid abbreviations in most cases. In ZWEI, we made a list of several words that were used extremely often, and established 'official' abbreviations for them, and always used only those abbreviations. ... Words not on this list were always spelled out in full.

      abbreviations whitelist - good programming practice!

    4. The use of the mouse is still considered experimental. We know of several editors which depend highly on the use of a mouse for input, but we are not convinced that it is better than a keyboard; after more people start using ZWEI, it will be interesting to see how many of them make heavy use of the mouse and how many hardly use it at all.

      mouse considered experimental; is the mouse better than the keyboard?

    5. Since ZWEI is written in Lisp and lives in the Lisp environment of the Lisp machine, it is in a very good position to interface closely with other elements of that environment.

      living system interacting with a running lisp machine

    6. ZWEI is display-oriented: the text the user is editing is actually displayed (this is relevant because many editors of the time often showed out-of-date text due to efficiency and bandwidth restrictions, putting the burden on the user to imagine what their text looks like currently).

      bandwidth restrictions -> out-of-date text -> user has to imagine what it currently looks like

    7. Some paragraphs are devoted to what must have been a novel concept at the time for such a system: that the Lisp Machine was a personal system, not time-shared, and this gave rise to features not viable on time-sharing systems, due to the fact that the user was not contending with other users for resources.

      personal computers as novel concept (vs time sharing) and what it enables

  24. Aug 2018
    1. “... applications and services that facilitate collective action and social interaction online with rich exchange of multimedia information and evolution of aggregate knowledge...” [48]

      Social computing definition

      Humans perform a social role while communication is mediated by technology. The interaction between human social role and CMC is key here.

    1. In particular, the fact that most users are only now beginning to experience the ubicomp vision and integrate this new, unique class of technology into their work practices suggests that another change in focus may be on the horizon: “[T]he shift from user-centered design to context-based design corresponds with recent developments in pervasive, ubiquitous computing networks and in the appliances that connect with them, which are radically changing our relationships with personal computing devices” (Gay & Hembrooke, 2003)

      Influence of ubiquitous computing on HCI

    2. Activities also span place; that is, it is common for work to take place outside of the immediate office environment. However, current office technologies sometimes present a very different view of information across different physical and virtual settings.

      "Activities exist across places"

      Here the paper conceptualizes "place" as physical location as well as mobile environment.

    3. The idea that activities may exist at different levels of granularity is not a new one. Boer, van Baalen & Kumar (2002) provide a model explaining how an activity at one level of analysis may be modeled as an action—a component of an activity—at another. This holds true for individual users, as in the example provided above, but is even more pronounced when a single activity is viewed from multiple participants’ perspectives.

      "Activities exist at different levels of granularity"

      Hierarchical level of analysis; Action < Activity

      The idea of granularity also seems to have a temporal component. See examples before this passage.

    4. Additionally, activities need to be represented in such a way that their contents can be shared, with the caveats that individual participants in an activity may have very different perceptions of the activity, they may bring different resources to play over the course of the activity, and, particularly for large activities in which many individual users participate, users themselves may come and go over the life of the activity.

      Large group social coordination challenges are particularly salient to the SBTF studies.

    5. Recognizing the mediating role of the digital work environment in enabling users to meaningfully collaborate is a critical step to ensuring the success of these systems.

      "Activities are collaborative"

      Activity representations are also crucial here, as is the "mediating role of the digital work environment" for collaboration.

      Flag this to connect to the Goffman reading (Presentation of Self in Everyday Life) and crowdsourcing/collective intelligence readings.

    6. User studies and intuition both suggest that the activities that a knowledge worker engages in change—sometimes dramatically—over time. Projects and milestones come and go, and the tools and information resources used within an activity often change over time as well. Furthermore, activities completed in the past and their outcomes often impact activities in the present, and ongoing activities will, in turn, affect activities that will be undertaken in the future. Capturing activity over the course of time has long been a problem for desktop computing.

      "Activities are dynamic"

      This challenge features temporal relationships between work and worker, in the past/present sense, and work and goals, in the present/future sense.

      Evokes Reddy's T/R/H temporal organization of work and Bluedorn's work on polychronicity.

    7. Supporting the multifaceted aspects of activity in a ubicomp environment becomes a much more complex proposition. If activity is to be used as a unifying organizational structure across a wide variety of devices such as traditional desktop and laptop computers, PDAs, mobile telephones, personal-server style devices (Want et al., 2002), shared public displays, etc., then those devices must all be able to share a common set of activity representations and use those representations as the organizational cornerstone for the user experience they provide. Additionally, the activity representations must be versatile enough to encompass the kinds of work for which each of these kinds of devices are used

      "Activities are multifaceted"

      This challenge is premised on having a single unit of analysis -- activity -- and that representations of the activity are both valid (to the user) and versatile (to the work/task type)

    8. The challenges exist due in large part to the inherent complexity of human activity, the technical affordances of the computing tools used in work practice, and the nature of (and culture surrounding) knowledge work.

      Reasons behind the knowledge work challenges.

    9. We describe five challenges for matching computation to activity. These are: •Activities are multifaceted, involving a heterogeneous collection of work artifacts; •Activities are dynamic, emphasizing the continuation and evolution of work artifacts in contrast to closure and archiving; •Activities are collaborative, in the creation, communication, and dissemination of work artifacts; •Activities exist at different levels of granularity, due to varying durations, complexity and ownership; and •Activities exist across places, including physical boundaries, virtual boundaries of information security and access, and fixed and mobile settings.

      These challenges also have temporal qualities, e.g., tempo/speed, duration, timeline, etc.

  25. May 2018
    1. One of the largest-scale studies exploring this problem was undertaken at the University of Washington (Fidel et al., 2000), where researchers investigated the information-seeking behavior of teams from two different companies, Boeing and Microsoft (Poltrock et al., 2003). They found that each team had different communication and information-seeking practices, and that current information systems are oriented toward individual rather than collaborative information-seeking activities. In practice, though, information seeking is often embedded in collaboration

      SBTF uses Google Sheets and Docs for information collection and shared documentation. Though Google products are billed as cloud-computing collaboration tools, it would be interesting to know if these systems remain oriented in individual information-seeking activities rather than collaborative.

  26. Apr 2018
    1. Is edge computing actually a technology? Or is it just marketing language that packages up various existing technologies?

    2. More and more data is produced at the edge, not in the cloud. Although the processing power of data centers far exceeds that of the edge, bandwidth growth cannot keep up with the speed at which edge data is produced. The cost of sending it all to the cloud for processing is extremely high.

      The article derives the necessity of edge computing from the information industry as a whole: data production, bandwidth growth, and the demands of IoT. But viewed from specific industries, this does not seem to be the case.

    3. Some IoT applications might require very short response time, some might involve private data, and some might produce a large quantity of data which could be a heavy load for networks

      The biggest driver of edge computing is the enormous volume of data generated by IoT, for which the cloud computing architecture is not entirely suitable. IoT's key requirements include response latency, network bandwidth, privacy, and data storage.

  27. Mar 2018
  28. Nov 2017
    1. As in Slow Food—with its unhygienic soil, disorderly farmers’ markets, and inconvenient seasons—the annoyances of Slow Computing have become pleasures. With community-made software, there’s no one to blame but us, the community. We’re not perfect, but we’re working on it.

      I really feel like the analogy works. I have, for example, begun to take pleasure in the messiness of vegetables bought at a farmers' market compared to the seeming perfection of those at a grocery store.

  29. Jul 2017
  30. Mar 2017
    1. Seco, although it is an implementation from 2004, has several ideas that are similar to those of today's Grafoscopio, including the persistence of an image (they use HyperGraphDB, but even mention Smalltalk), the fact of being a desktop application, the idea of p2p computing, the option of embedding a browser's rendering engine or the browser itself in a richer environment, and even the inspiration of Mathematica notebooks.

  31. Dec 2016
    1. Now, the suggested execution time for a BASIC programmatic solution to Puzzle 15 is 7 minutes, 4 seconds. That's on the Vectra. If you are programming on a Tandy 1000, you could expect the same program to execute in about 28 minutes. So, if your solution takes over an hour, you might try to speed it up somewhat.

      How times have changed! Project Euler suggested run times of less than a minute, but here the author blithely suggests that waiting an hour for your solution may be too much.

  32. Sep 2016
    1. The success of Arduino has had the perhaps retrograde effect of convincing an entire generation that the way to sense and actuate the physical world is through imperative method calls in C++, shuffling bits and writing to ports, instead of in an environment designed around signal processing, control theory, and rapid and visible exploration. As a result, software engineers find a fluid, responsive programming experience on the screen, and a crude and clumsy programming experience in the world.
  33. Aug 2016
    1. WISP (Wireless Identification and Sensing Platform), a computer chip powered by existing radio waves, developed by the U. of Washington Sensor Lab and Delft U. of Technology.

  34. Dec 2015
    1. this week’s announcement by Google that a machine made by a Canadian company, D-Wave Systems, which is marketed as “the world’s first commercial quantum computer”, had shown spectacular speed gains over conventional computers. “For a specific, carefully crafted proof-of-concept problem,” Google’s Hartmut Neven reported, “we achieved a 100-million-fold speed-up.”
  35. Oct 2015
    1. The Coming of OER: Related to the enthusiasm for digital instructional resources, four-fifths (81 percent) of the survey participants agree that “Open Source textbooks/Open Education Resource (OER) content will be an important source for instructional resources in five years.”
  36. Aug 2015
    1. Shared information

      The “social”, with an embedded emphasis on the data part of knowledge building and a nod to solidarity. Cloud computing does go well with collaboration and spelling out the difference can help lift some confusion.

  37. Apr 2014
    1. Over the last twenty years, the open source community has provided more and more software on which the world’s High Performance Computing (HPC) systems depend for performance and productivity. The community has invested millions of dollars and years of effort to build key components. But although the investments in these separate software elements have been tremendously valuable, a great deal of productivity has also been lost because of the lack of planning, coordination, and key integration of technologies necessary to make them work together smoothly and efficiently, both within individual PetaScale systems and between different systems. It seems clear that this completely uncoordinated development model will not provide the software needed to support the unprecedented parallelism required for peta/exascale computation on millions of cores, or the flexibility required to exploit new hardware models and features, such as transactional memory, speculative execution, and GPUs. This report describes the work of the community to prepare for the challenges of exascale computing, ultimately combining their efforts in a coordinated International Exascale Software Project.