- Jun 2019
-
www.greenbuildexpo.com
-
Hotels – based on average hotel water usage per occupied room (L) in Chicago, Illinois (Cornell Hotel Sustainability Benchmarking Index 2016: Energy, Water, Carbon).
Okay, so if we tracked water we would need some idea of who was in hotels
-
based on total gallons of water consumed to produce total gallons of gasoline consumed
Ah, so this is a bit like scope 3 - as in water that was needed to make the gasoline that the cars and vans used.
-
The exhibitor will not use individual waste containers in exhibit booths. The venue and show management will provide recycling stations throughout the exhibit area for attendee and exhibitor use during show hours. Each exhibitor is responsible for disposing of waste and recyclables at these stations.
Ah, to capture the waste, and not end up with a case of exhibitors sending their own crap to landfill untracked
-
COMPUSYSTEMS: Through months of testing badge material options, Greenbuild and CompuSystems found a paper-based badge solution that works for Greenbuild and can be used for any other show looking to eliminate the cost and waste associated with plastic name badge holders
A way to get rid of these awful plastic lanyard badge holders?
-
For the first time, a focused event sustainability education session was offered to event organizers, venues, and vendors. The session featured the Greenbuild sustainability team along with our partners, sharing how Greenbuild incorporates sustainability throughout the event as well as offering actions and information on how to include greener practices into any event management strategy.
So the sponsors, venue people and other folks involved got a training session, and a webinar was shared
-
By tweaking this objective to “Lead the Event Industry Through the Advancement of Sustainable Event Management Initiatives,” we focused on how Greenbuild could do better for the event industry as a whole.
This is pretty similar to "work openly and share what works", basically
-
McCormick Place was located a few miles from most of the hotels in our room block. Knowing this would impact our carbon footprint, Greenbuild partnered with ride-sharing operator Lyft to encourage event participants to share rides to and from the convention center and hotels. Lyft offset all the carbon emissions for rides taken to and from McCormick Place during the dates of the conference. Attendees received a 5% discount off their fare to encourage the use of Lyft. Almost 20% of Greenbuild attendees used our code.
20% of attendees used Lyft
-
Field to Table served as the caterer for the Celebration. They are committed to sourcing almost everything locally, finding ingredients from farmers in Illinois and bordering states. They were also a fantastic partner in managing waste. All waste leaving the kitchen was weighed, and they had front of house and back of house recycling and composting stations with volunteers to help us achieve waste diversion goals.
So, this is how they came up with the figures then.
-
Greenbuild 2018 Chicago Sustainability Pledges
Get attendees to consider how they got around while at the event, and where they stay
-
This year, Greenbuild collaborated with Waste Management to help us engage and train a team of 400+ volunteers about proper materials management. Through back-of-house tours and daily training, they worked together to drive as much material to recycling and compost
There was a training program needed for the volunteers at the conf
-
A significant portion of our reduction calculations came from the enforcement of Greenbuild’s 500 piece maximum limit on publications that media partners can bring to the show
So, they don't try to track it, but they have a maximum limit on swag they can bring.
-
attendees were asked to indicate if they wanted a printed guide during registration, significantly reducing overages in printing
Simple. Just ask if it's wanted.
-
-
www.cis.upenn.edu
-
Unlike the Gold Standard or CDM, VCS has no requirement that its carbon offset projects have additional social benefits, allowing for a wider range of projects.
Ah, so THAT'S the key difference
-
-
www.transportenvironment.org
-
We recommend to prioritise battery-electric and hydrogen (pure and/or in the form of ammonia) technologies from sustainable renewable sources to decarbonise shipping
So, hydrogen is less energy dense than ammonia as a fuel, but ammonia needs extra reformers to crack it back into hydrogen for the fuel cells, and ammonia is toxic. So it's a trade-off of energy density versus relative simplicity
-
-
www.climatechangenews.com
-
Chancellor Merkel was probably also the first world leader to recognize that fighting climate change at its roots in Africa was probably a better way to stem the flow of migrants than to invest in more patrol boats along the Mediterranean coast.
It's good that she's recognised for this, but oh god, it's depressing that this is even considered more than a basic level understanding.
-
-
www.carbonbrief.org
-
This said global greenhouse gas emissions should reach net-zero by 2070 to limit warming to 1.5C, with CO2 at net-zero by 2050.
Presumably this is because other non CO2 gases don't hang around for as long, I think
-
“This is a whole economy target…and we intend for it to apply to international aviation and shipping.”
🤯🤯🤯
-
- May 2019
-
www.theregister.co.uk
-
MIT professor Eric Von Hippel’s research into user-driven innovation shows that when the user is given direct access to the means for creating a solution, potentially at least, tremendous innovation can result. The "sticky information" that otherwise must be rendered into requirements documents or transferred from user to builder is difficult and never complete. When this step doesn’t have to take place because user and builder are the same person or same team, the outcome is much better. Amazon’s Away Team model embraces this concept and allows teams to create building blocks that have ideal fit to purpose.
This feels a bit like the system of theft from Simon Wardley's stuff
-
-
theshiftproject.org
-
588
It's not obvious to me what kind of server this is (2U? 4U? Some random commodity one?) but we at least have a figure.
-
-
doteveryone.org.uk
-
Every technology worker that leaves a company does so at a cost of £30,000. The cost of not addressing workers’ concerns is bad for business — especially when the market for skilled workers is so competitive.
30k per employee. What's the typical turnover in a given company per year? You lose 3% of your staff per annum?
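That £30k figure is easy to turn into a per-company number. A minimal sketch, where the headcount and turnover rate are invented, illustrative inputs:

```python
# Back-of-envelope attrition cost using the report's £30,000-per-leaver
# figure. Headcount and turnover rate are invented, illustrative inputs.
COST_PER_LEAVER_GBP = 30_000

def annual_attrition_cost(headcount, turnover_rate):
    """Estimated yearly cost (GBP) of staff leaving at the given rate."""
    leavers = headcount * turnover_rate
    return leavers * COST_PER_LEAVER_GBP

# A 200-person tech firm losing 10% of its staff a year:
print(annual_attrition_cost(200, 0.10))  # 600000.0 — £600k a year
```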
-
Although the idea of a ‘hippocratic oath’ for tech has often been discussed as a way to embed ethical practice in the tech industry, only 2% saw a voluntary commitment as the most effective way to mitigate potential harms.
Useful stats when referring to the pledge
-
In AI, 59% of people have experience of working on products that they felt might be harmful for society, compared to 28% of tech workers as a whole.
Cripes :0
-
-
www.inkandswitch.com
-
Local-first applications keep the primary copy of the data in files in each device’s local filesystem, so the user can read and write this data anytime, even while offline.
Local first is different to offline-first
-
-
www.theccc.org.uk
-
Aggregation as 'carbon dioxide equivalent' fails to capture this fundamental difference in how emissions of short-lived and long-lived GHGs affect global temperature. However, other constraints such as international comparability (Box 2.4) support the continued use of existing 'CO2 equivalence' metrics for now
So using CO2e doesn't make total sense here, as the time in the atmosphere means it's not quite compatible
-
Distribution of risks. The additional increase in climate risk between 1.5°C and 2°C warming would affect poor and vulnerable people most of all. Poverty and disadvantage have increased with recent warming and are expected to increase for many populations as average global temperature increases from 1°C to 1.5°C and higher.
Does this imply you would have much less climate migration between 1.5 and 2 degrees?
-
Climate change has affected crop yields, with more negative impacts than positive effects. Climate change has been acting to reduce global average yields of wheat and maize by around 1% per decade since 1960, preventing yields increasing as fast as they would otherwise have done due to other factors.
First case I've seen of it being cited as a factor already
-
Future climate risks to society are dependent on the interaction of hazard, exposure and vulnerability:
This is slightly different to Risk = impact X frequency
-
Around 420 million fewer people would be exposed to extreme heatwaves if warming was kept to 1.5°C than 2°C.
holy caw. Like the heatwaves that killed 90k people in Western Europe? Those heatwaves?
-
- Apr 2019
-
www.transportenvironment.org
-
Energy requirements under different technology pathways for a Calais-Dover journey
Those tradeoffs. Wow
-
-
render.com
-
Simple, fair, predictable pricing. Deploy your first app in minutes.
This is the other alternative to Heroku, offering a nice developer experience
-
-
www.carbonbrief.org
-
The rule, which was decided at the last minute and reportedly pushed by Saudi Arabia, means fuels with a 10% or higher reduction in lifecycle emissions compared to standard “Jet A1” can be counted as “clean oil” and eligible for in-sector offsets.
o_O
-
However, the requirements only apply for operators with international emissions above 10,000tCO2 per year. This means most of the world’s private jets are exempt.
Private jets, the most egregious of emitters, are exempt. Wow.
-
International aviation alone is responsible for around 1.3% of global CO2 emissions, according to ICAO.
So, international aviation is greater than domestic
-
Significantly, these estimates do not account for the impacts of aircraft emissions other than CO2, such as nitrogen oxides (NOx) and soot. These factors are thought to more than double the global warming impact of aviation.
ZOMG
-
-
-
The solution enables groups from across the company to view emissions data based on group-specific access parameters; on initial rollout, we restricted each group’s view to data that was immediately actionable by them. For example, our data center team has a global view of data center emissions data, whereas our subsidiary facilities teams can access data for facilities in their specific geographical region. By giving them access to just the data that is relevant to their part of the organization (and not the broader organization), we help eliminate distractions and keep the groups focused on opportunities to drive emissions-reducing initiatives within their areas.
Sacrificing transparency to avoid distractions. Would this also spare some high-emitting areas, like enterprise sales or exec travel, from embarrassment?
-
Vintage refers to the year that the green power is generated. Carbon accounting specifies that the green power purchases be generated in the reporting inventory year that the green power will be credited.
You can't buy offsets from 2014 to offset emissions in 2018
-
-
-
Across the seven countries surveyed by the CASPI team, a clear link emerged between people’s wellbeing and the extent to which they engaged in low-carbon behaviours: people who were ‘greener’ also tended to be happier, even taking into account their own personal income.50
oh, wow, not just because they're already rich?
-
There are plenty of campaigns which have been completely oblivious to the relationship between one pro-environmental behaviour and another. One striking and bizarre initiative saw consumers encouraged by a supermarket marketing campaign to “turn lights into flights” by earning “air miles” through the purchase of energy-efficient lightbulbs.
WAT
-
Many campaigns in the 1990s and 2000s focused on ordinary citizens ‘doing their bit’ through reducing their personal carbon-footprints.8 But there was then a concerted shift away from openly talking about the role of individual behaviours, in part reflecting the failure of most campaigns to bring about meaningful changes in people’s behaviours (and more importantly their carbon footprints
Ahhhhhhhhh
-
Carbon emissions increase sharply with income: the top 10% of emitters are responsible for close to half of all emissions; and much of this difference is underpinned by household income
ZOMG
-
-
extinctionrebellion.files.wordpress.com
-
Our focus is on creating systems of decision-making, like Citizens’ Assemblies, where ordinary people learn from each other and help us all to take collective decisions.
This doesn't come across so strongly in the protests yet, but it's an important point
-
-
www.theguardian.com
-
Air pollution contributed to nearly one in every 10 deaths in 2017, making it a bigger killer than malaria and road accidents and comparable to smoking, according to the State of Global Air (SOGA) 2019 study published on Wednesday. In south Asia, children can expect to have their lives cut short by 30 months, and in sub-Saharan Africa by 24 months, because of a combination of outdoor air pollution caused by traffic and industry, and dirty air indoors, largely from cooking fires. In east Asia, air pollution will shorten children’s lives by an estimated 23 months. However, the life expectancy burden is forecast to be less than five months for children in the developed world.
And we still SUBSIDISE fossil fuels
-
- Mar 2019
-
medium.com
-
The secret sauce to mitigating uninformed opinions is to explicitly state where you are in the decision cycle, and your confidence in the decision at this stage. Add in a dash of the decision’s importance and you’ve got a reliable, repeatable template for how to communicate where you’re at.
This is a really nice way to frame it.
-
-
theshiftproject.org
-
Despite the existence of these two contradictory trends, even the most optimistic studies express concerns regarding the capacity of technological progress to counter the growth in volumes by 2020. For example, this report from the American Department of Energy and the University of California on the energy consumption of data centers in 2016 in the United States, states: "The key levers for optimizing the energy efficiency [of data centers] identified in this report, better PUE, better rate of use of servers and more linear consumption all have theoretical and practical limits and the amount of progress already achieved suggests that these limits will be reached in the relatively near future." (Shehabi, A. et al., 2016)
Okay it was that same paper they referred to.
-
India plans to launch a massive program to deploy commercial 5G networks in 2020 to boost the performance and capacity of existing mobile networks, taking into account that the 4G networks (which only took off in 2017 due to a price war over data started by the telecommunications operator Reliance Jio) are making big advances towards general coverage.
Hello, so they do reference the massive increase in data and data plans
-
Scenario 3 – ideal case, where the exchanges are made exclusively with the platform.
This number seems super duper high
-
We have therefore sought to identify levers of action more closely related to the demand and consumption of digital services than on the energy efficiency of supply.
This is long overdue. Glad to see this
-
Spending 10 minutes watching a high definition video by streaming on a smartphone is equivalent to using a 2,000W electric oven at full power for 5 minutes
Waaaaaaaat
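Worth sanity-checking what the oven side of that comparison implies in plain energy terms (the streaming figure itself is the report's claim, not something checked here):

```python
# Sanity check: the energy implied by "a 2,000W oven at full power for
# 5 minutes" — the figure the report equates to 10 minutes of HD
# streaming on a smartphone.
OVEN_POWER_W = 2_000
OVEN_MINUTES = 5

oven_energy_wh = OVEN_POWER_W * OVEN_MINUTES / 60  # watt-hours
print(round(oven_energy_wh))  # 167 Wh for the 10-minute viewing session
```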
-
- Feb 2019
-
www.nature.com
-
Credibility-enhancing displays promote the provision of non-normative public goods
Not very jazzy title, but useful reference
-
-
-
Last year, Google quietly started an oil, gas, and energy division. It hired Darryl Willis, a 25-year veteran of BP, to head up what the Wall Street Journal described as “part of a new group Google has created to court the oil and gas industry.” As the VP of Google Cloud Oil, Gas, and Energy, Willis spent the year pitching energy companies on partnerships and lucrative deals. “If it has to do with heating, lighting or mobility for human beings on this planet, we’re interested in it,” Mr. Willis told the Journal. “Our plan is to be the partner of choice for the energy industry.”
Jeez. At what point do we grow a spine and take climate change seriously?
-
-
www.greenpeace.org
-
Salesforce was the first major internet company that exclusively leased data center space to adopt a 100 percent renewable energy commitment in 2013. Salesforce has multiple data center leases in Data Center Alley, totaling 46 megawatts, including a massive new lease with QTS in its new Manassas data center.
How to do green DCs when you don't own DCs
-
But despite recent creative claims of being “100 Percent Renewable Globally” from surplus supply of renewable credits in other markets,[66] Google has not yet taken steps to add renewable energy to meet the demand of its data centers in Virginia
Ah! So they do the "RECs in other markets" too!
-
In 2018, five major IT brands with long-term commitments to renewable energy[52] and who operate data centers or have significant colocation leases in Virginia sent a letter to the Virginia State Corporation Commision (SCC) asking that they not be used by Dominion to justify new fossil fuel growth, asking instead for a greater supply of renewable energy.[53] The SCC ultimately rejected Dominion’s Integrated Resource Plan for the first time in December 2018, providing an important opportunity for additional large corporate customers to tell regulators they need a greater supply of renewables, not more investment in fossil fuel generation assets or pipelines like the ACP.[54]
Wait, so these two things are related? The letter forced the SCC to respond?
-
The rapid deployment of renewable energy and the stagnation of mandatory renewable energy targets in many states has created a large surplus of “naked” or unbundled renewable credits available at the national level for purchase by the voluntary market, driving their price to record lows, less than $1/megawatt hour.
So, if you're a huge buyer of electricity, and you are opaque about your offsets, it's easy to imagine that you're just loading up on these.
-
AWS customers seeking to immediately reduce carbon emissions related to their cloud hosting could request to be hosted in Amazon’s California cloud, which is connected to a grid that is 50[33] to 70[34] percent powered by clean sources of electricity
Not Oregon?
-
Dominion’s projected demand for the pipeline ignores the fact that six of its 20 largest customers, five of which are data center operators, have made commitments to run on 100 percent renewable energy.
How can you publicly audit a commitment like this?
-
However, neither of these options improves the energy mix of Virginia or influences future direction and is therefore not ideal for those companies concerned with meaningfully reducing their operational carbon emissions. Of the 15 companies measured in this report, only Apple has invested in enough renewable energy procurement to match its demand in the region
Ok, this makes me think that companies are relying on RECs everywhere else, and crediting Apple with specifically investing directly in RE in Virginia.
-
Company Scorecard
OK, so this is the table used to create that wild chart above showing DC capacity, compared to renewables capacity in Virginia
-
If Amazon and other internet companies continue their rapid expansion of data centers in Virginia, but allow Dominion to continue with its strategy to use rising data center demand to justify significant new investment in fossil fuel infrastructure, they will be responsible for driving a massive new investment in fossil fuels that the planet cannot afford.
So this is interesting. This report seems to be more about Dominion than anything else, and basically pressuring Amazon to get Dominion to step away from fossil fuels
-
Dominion Energy, Virginia’s largest electricity provider and the primary electric utility for Data Center Alley, has strongly resisted any meaningful transition to renewable sources of electricity, currently representing only 4 percent of its generation mix, with plans to increase to only slightly over 10 percent by 2030.[1]
Wow, 10% by 2030? That it?
-
-
blog.gardeviance.org
-
The purpose of spend control is simply to challenge what we’re doing. It’s important to remember that it doesn’t control the budgets. Those budgets belong to other departments or business units. Spend control simply says that if we’re going to spend more than £25K on something then that has to be presented to us and exposed to a bit of challenge.
Easy enough to explain, and anchor stuff to.
-
-
realtimeboard.com
-
The ultimate guide to remote team meetings
This was pretty accurate - it covers a lot of useful ground.
-
-
www.cfr.org
-
Today the decline in deaths from the traditional diseases of density—tuberculosis and diarrheal and intestinal infectious diseases—in many large cities is due less to prevention through infrastructure improvements and public health oversight (see below). Instead the evidence suggests that treatment (e.g., antibiotics, childhood vaccines, oral rehydration solutions), rather than prevention (e.g., clean water, sanitation, and strong urban public health systems), has mattered the most
Did not expect
-
-
mcaffer.com
-
When I first started thinking/talking about this, I use the term maturity with people. That unfortunately implies a judgement and it turns out that people don’t want to think of themselves (or be thought of) as immature. Most of the terms that came up had that same characteristic: sophistication, evolution, depth, … The antonyms are off-putting. Fact is, there is no one right answer here — there is nothing inherently better or worse about being at a particular point on the spectrum. That led me to engagement. We really are just talking about what kind, what role, how much, … your engagement with open source plays in your organization. Naming is hard. I’d love to hear alternatives.
I really like this framing of it. Sure, maturity is a nice way to categorise, human frailty is worth taking into account
-
-
us9.campaign-archive.com
-
Despite the coal-friendly policies of the central government. A study showed that Australia is currently installing 250 watts of PV or wind for each inhabitant per year. The EU and US are about one fifth of this. If this rate of growth continues, Australia will reach 50% renewables by 2024 and 100% of electricity demand by 2032. Costs of new large scale PV and wind are now around US$35/MWh, lower than the running costs of older coal stations.
Wow, go Australia
-
-
-
We look at the different types of ‘dark matter’ at play in the system around complex disadvantage — law and guidance, strategies and policies, contracts and processes. We examine how what can seem to be the law may actually be someone’s interpretation of it. We look at how through investigation you can start to question rules, by discovering that the reason the rule was initially created may have changed, so that the rule is no longer needed.
This is a bit like the different kinds of inertia in Wardley mapping
-
-
www.ghgprotocol.org
-
Williams and Tang (2013) performed a rigorous and detailed energy consumption analysis of three cloud-based office productivity applications. They analyzed the power consumption of the data center, network, and user devices that access the cloud service. The study also performed an energy consumption analysis on “traditional” noncloud versions of the software to understand the overall impact of cloud services.
Are the findings accessible publicly?
-
Average power consumption can be estimated from single points, although this results in increasing uncertainty. Manufacturers publish the maximum measured electricity (MME) value, which is the maximum observed power consumption by a server model. The MME can often be calculated with online tools, which may allow the specification of individual components for a particular server configuration. Based on these estimations of maximum power consumption, the average power consumption is commonly assumed to be 60 percent of MME for high-end servers and 40 percent for volume and mid-range servers.
okay, this is a useful stat. I think
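The rule of thumb above is simple enough to sketch. The 60%/40% load factors come from the guidance; the example MME wattages are invented:

```python
# The guidance's rule of thumb: average draw ≈ 60% of the Maximum
# Measured Electricity (MME) for high-end servers, 40% for volume and
# mid-range. The example MME wattages below are invented.
LOAD_FACTOR = {"high-end": 0.60, "volume": 0.40, "mid-range": 0.40}

def average_power_w(mme_watts, server_class):
    """Estimated average power draw (W) from a published MME figure."""
    return mme_watts * LOAD_FACTOR[server_class]

print(average_power_w(500, "volume"))     # 200.0 W
print(average_power_w(1200, "high-end"))  # 720.0 W
```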
-
In most cases, the “embodied emissions” (all stages excluding the use stage) of software are not significant compared with the overall emissions of the ICT system, particularly when the embodied emissions caused by development of the software are amortized over a large number of copies. In these cases, it is not necessary to carry out a detailed life cycle assessment of the software as part of a wider system. An exception is where bespoke software has very high emissions associated with its development, and these emissions are all allocated to a small number of software copies.
Smaller, internal software might count
-
Currently, input-output (IO) tables are published every five years, a long time in IT product evolution. Consequently, EEIO is good at representing basic commodities / materials industries like plastics or metals manufacturing, but not high-tech industries like microprocessors and fiber optic lasers manufacturing.
Every 5 years. So, when the iPhone 6 was the brand new hotness, compared to today.
-
Calculating cradle-to-gate GHG emissions of IH by the component characterization method
Ah! This is new to me
-
EEIO tables are updated infrequently thus may not be up to date with ICT’s newest technologies and materials. EEIO tables have limited resolution at the aggregate sector level.
Valuable problem to solve?
-
rapidly with the onset of innovations, but lag in being included in EEIO databases available to the practitioner. More detail on EEIO data is provided in the calculation sections below
Useful point. Because the top-down data is lagging, it'll give worse than expected figures for hardware
-
It is interesting to note that the figures from GSMA and GeSI show that energy intensity per gigabyte is improving at about 24% per year for mobile networks, and at about 22% per year for fixed line networks.(The study by Aslan et al calculates a figure of 50% reduction in energy intensity every two years for fixed line networks, equivalent to 29% reduction per year).Also the data shows that the energy intensity per gigabyte for mobile networks is about 50 times that for fixed line networks.
Okay, this isn't that far from the 45x figure before
-
Assuming that the reduction in energy efficiency can be fitted to an exponentially decreasing curve (i.e. because it is more and more difficult to achieve the same reductions), then the data points can be extrapolated to give energy intensity factors for 2015 of 0.15 for fixed line networks, and 6.5 for mobile networks, with both factors measured in kWh/GB (kilowatt-hours per gigabyte).
FORTY FIVE TIMES MORE ENERGY INTENSIVE THAN WIRED
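A rough sketch of that extrapolation, rolling the 2015 factors forward with the yearly improvement rates quoted earlier (~22% fixed, ~24% mobile). The exponential-decay shape is the guidance's assumption; the choice of 2019 as the target year is arbitrary:

```python
# Rolling the 2015 factors forward with the earlier improvement rates
# (~22%/yr fixed, ~24%/yr mobile). The exponential-decay shape is the
# guidance's assumption; the choice of 2019 as target year is mine.
def intensity_kwh_per_gb(base_2015, yearly_improvement, year):
    return base_2015 * (1 - yearly_improvement) ** (year - 2015)

fixed_2019 = intensity_kwh_per_gb(0.15, 0.22, 2019)   # fixed line
mobile_2019 = intensity_kwh_per_gb(6.5, 0.24, 2019)   # mobile
print(round(mobile_2019 / fixed_2019, 1))  # 39.1 — mobile still ~40x fixed
```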
-
A simple energy intensity factor for the use of the internet would make calculating the emissions resulting from ICT simpler and more widely accessible. Whilst this has been attempted in the past, resulting estimates show huge disparities. Coroama and Hilty review 10 studies that have attempted to estimate the average energy intensity of the internet where estimates varied from 0.0064 kWh/GB to 136 kWh/GB, a difference factor of more than 20,000.
TWENTY THOUSAND TIMES DIFFERENCE.
Did I drive across London? Or did I drive to the moon?
-
Typically, for cloud and data center services the largest impacts are from the use stage emissions of the data center and the end user devices.
End user devices?
-
Simplified parameters for allocation of data center emissions include
Useful for the screening stage
-
Optional processes that are not attributable to the GHG impact of cloud and data center services are:
?
-
The end-of-life stage typically represents only -0.5 to -2 percent of a service’s total GHG emissions. This is because of the high level of recycling of network equipment.
Where would I check to learn this?
-
For global average electricity emission factors across 63 countries where the points of presence were located, data from the Carbon Trust Footprint Expert Database was used.
Is this database free?
-
Operational activities and non-ICT support equipment covers people (labor) activities and non-ICT support equipment and activities that are directly engaged and dedicated to the service being assessed.
in addition to the other emissions
-
For example, these measurements might involve running a series of traffic traces over a period of time to build up statistics on network parameters. The measurements also need to include the energy consumption for the network’s ancillary equipment such as cooling, power conditioning, and back-up power. If this latter data is not attainable, then techniques described in Section 2.8.2 “Calculating GHG emissions for the customer domain use stage,” (TPCF and PUE factors), can be used to provide an estimated value for this equipment.
Validation!
-
Equipment manufacturers may have estimates of TPCFs for their equipment based on defined operating conditions. In all cases, if a TPCF approach is selected, then the basis for selecting the factor should be fully noted and documented in the GHG inventory report
How much do these figures fluctuate for cloud boxen?
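The TPCF/PUE gross-up described here is basically one multiplication: measured IT energy, scaled up by an overhead factor, times grid carbon intensity. A hedged sketch, with all three input numbers illustrative rather than vendor or utility figures:

```python
# The gross-up in one step: measured IT energy × an overhead factor
# (PUE or a TPCF) × grid carbon intensity. All three numbers below are
# illustrative, not vendor or utility figures.
def use_stage_emissions_kg(it_energy_kwh, overhead_factor, grid_kgco2e_per_kwh):
    """kg CO2e for the use stage, including cooling/power conditioning."""
    total_energy_kwh = it_energy_kwh * overhead_factor
    return total_energy_kwh * grid_kgco2e_per_kwh

# 10 MWh of IT load, PUE of 1.5, a 0.4 kg CO2e/kWh grid:
print(use_stage_emissions_kg(10_000, 1.5, 0.4))  # 6000.0 kg CO2e
```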
-
Allocation of emissions among independent products that share the same process: for example, multiple products sharing the same transport process (vehicle); multiple telecommunication services sharing the same network; multiple cloud services (email, data storage, database applications) sharing the same data center
k8s makes this a pain, if it's designed to co-mingle services on the same boxen
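One plausible way to do the shared-process allocation the standard describes is to split a facility's emissions across co-hosted services in proportion to a usage metric. The metric choice here (CPU-hours) is an assumption, not something the standard mandates:

```python
# One plausible allocation: split the shared facility's emissions across
# co-hosted services in proportion to a usage metric. CPU-hours is my
# choice of metric here, not something the standard mandates.
def allocate(total_kgco2e, usage_by_service):
    total_usage = sum(usage_by_service.values())
    return {s: total_kgco2e * u / total_usage
            for s, u in usage_by_service.items()}

shares = allocate(1_000, {"email": 200, "storage": 500, "database": 300})
print(shares["storage"])  # 500.0 kg CO2e
```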
-
Depending on the goal and scope of the assessment, a rule of thumb may be used for assessing ICT products where the emissions from a specific life cycle stage or element are determined by the screening assessment to be less than 5 percent of the total emissions. In this case, a detailed assessment for that stage or element is not required. The emissions for that stage or element are then calculated using the percentage determined in the screening assessment. The sum of the emissions calculated in this way (i.e., based on the percentage from the screening estimate) should not exceed 20 percent of the total emissions.
Less than 5? skip it
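The 5%/20% rule of thumb as a sketch, assuming the screening assessment has already produced per-stage shares (the example shares below are invented):

```python
# Sketch of the rule of thumb: stages under 5% of the screening total can
# reuse the screening percentage, provided the skipped stages together
# stay under 20%. The example stage shares are invented.
def split_stages(screening_shares):
    """screening_shares: stage name -> fraction of total emissions."""
    skippable = {s: f for s, f in screening_shares.items() if f < 0.05}
    if sum(skippable.values()) > 0.20:
        raise ValueError("skipped stages exceed 20% of total emissions")
    detailed = {s: f for s, f in screening_shares.items() if f >= 0.05}
    return detailed, skippable

detailed, skipped = split_stages(
    {"use": 0.70, "embodied": 0.24, "transport": 0.04, "end-of-life": 0.02})
print(sorted(skipped))  # ['end-of-life', 'transport']
```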
-
A “screening assessment” is an initial assessment of a product to understand its significant and relevant sources of emissions. This assessment is described in the Product Standard in section 8.3.3.
Entry level
-
This chapter provides software developers and architects guidance to benchmark and report the GHG emissions from software use in a consistent manner and make informed choices to reduce greenhouse gas emissions. The chapter is in two parts. Part A provides guidance on the full life cycle assessment of software, while Part B relates specifically to the energy use of software, and covers the three categories of software: operating systems (OS), applications, and virtualization.
actual formal guidance!
-
2015 GeSI published the SMARTer 2030 report, extending the analysis out to 2030. This study predicted that the global emissions of the ICT sector will be 1.25 Gt CO2e in 2030 (or 1.97% of global emissions), and emissions avoided through the use of ICT will be 12 Gt CO2e, which is nearly 10 times higher than ICT’s own emissions.
There's a 2030 report now. I did not know
-
the total abatement potential from ICT solutions by 2020 is seven times its own emissions.
Increasing confidence in the potential then
-
Rebound Effects
🤯
-
The Product Standard (sections 11.2 and 11.3.2) states that “avoided emissions shall not be deducted from the product’s total inventory results, but may be reported separately.”
Ah, so it's not like a magic offset. More transparent. Good.
-
The Product Standard defines products to be both goods and services, thus for the ICT sector it covers both physical ICT equipment and delivered ICT services. This Sector Guidance, however, focuses more on the assessment of ICT services. In this Sector Guidance the definition of products includes both networks and software as ICT services.
This makes me think that services like e-commerce or ride sharing might not count, on the first read-through
-
- Dec 2018
-
www.ineteconomics.org
-
As the chief executive of the world’s biggest cement company observed, “we know how to make very low carbon cement – but why would we? There is no incentive.”
FUCKING HELL
-
In the growing trade war between China and the US, it seems the world is unwilling even to think about the entirely legitimate use of consumption-based or border carbon pricing either to encourage cleaner production in China, or to deter the Trump administration from using discriminatory trade measures to re-industrialize drawing partly on older and more carbon-intensive technologies.
How would border carbon pricing work? You pay a tax on the CO2 emissions 'imported'?
-
The European utilities that tried to ignore the energy transition are now economic zombies; some split their companies in two to try and isolate the assets that have already turned into liabilities in a decarbonizing system.
E-on as an example?
-
Averaged over the full 35 years, a constant percentage reduction would require c = - 10%/yr to reach the same end-point – almost impossible at the starting point, but entirely feasible and easily observed in the latter stages of sunset industries.
Okay, so the argument as I see it so far: while the change averaged out might be 3.68 per year, assuming it's a straight line is a mistake, as substitution of high-carbon energy for low-carbon looks more like an S-shaped curve
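Back-of-envelope check of the -10%/yr figure (my own arithmetic, not from the paper):

```python
# A constant -10%/yr cut compounds: after 35 years only (0.9)**35 of the
# starting emissions remain, i.e. roughly 2.5% -- the same end-point a
# linear reduction path would reach, but with a very different shape.
years = 35
annual_change = -0.10
remaining = (1 + annual_change) ** years
print(f"remaining after {years} years: {remaining:.1%}")  # ~2.5%
```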
-
their analysis leads both teams to the – only slightly caveated – conclusion that the emission reductions required to deliver the Paris Aims (“well below 2 deg. C”) are implausible, by almost any standard of macroeconomic evidence – and still more so for the most ambitious “1.5 deg. C” end of the spectrum.
Ah, so this is a response to the 'we're doomed' papers from before
-
-
onlinelibrary.wiley.com
-
Electricity Intensity of Internet Data Transmission: Untangling the Estimates
This is the HTML version of the PDF I was referring to before.
-
-
www.iea.org
-
Digitalization and Energy 2017
This is a report from Nov 2017.
-
- Oct 2018
-
www.bbc.com
-
Video streaming service Netflix is the world's most data-hungry application, consuming 15% of global net traffic, according to research from bandwidth management company Sandvine.
Ah, there's a new Sandvine report for 2018
-
- Sep 2018
-
-
Which report is this?
-
Oh, I didn't know M$ is part of the OCP, but it makes sense when you think about it
-
-
-
How do you measure up?
-
- Apr 2018
-
aws.amazon.com
-
By eliminating cold servers and cold containers with request-based pricing, we’ve also eliminated the high cost of idle capacity and helped our customers achieve dramatically higher utilization and better economics.
'Cold servers' and 'cold containers' are terms I haven't heard before, but they sum up the waste of excess capacity nicely
-
-
sonniesedge.co.uk
-
Robust: It is this flirty declarative nature that makes HTML so incredibly robust. Just look at this video. It shows me pulling chunks out of the Amazon homepage as I browse it, while the page continues to run. Let’s just stop and think about that, because we take it for granted. I’m pulling chunks of code out of a running computer application, AND IT IS STILL WORKING. Just how… INCREDIBLE is that? Can you imagine pulling random chunks of code out of the memory of your iPhone or Windows laptop, and still expecting it to work? Of course not! But with HTML, it’s a given.
-
- Mar 2018
-
-
For the five studies that satisfy our criteria, the electricity intensity of transmission networks has declined by a factor of 170 between 2000 and 2015
It's become 170x more energy efficient in 15 years
-
Example of daily variation of Internet traffic in 2012, based on number of page views per 15-minute interval for part of the Akamai network (Peill-Moelter 2012, reprinted with permission).
This looks similar to the curve in The Power of Wireless Cloud. I wonder if it's the same now?
-
A white paper released by Cisco (2015) predicts Internet traffic growth of 42% per year to 2020.
42% compounding, year on year?
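Quick arithmetic on what 42% annual growth compounds to over the five years to 2020 (my own check, not from the Cisco paper):

```python
growth = 1.42   # 42% per year, compounding
years = 5       # e.g. 2015 to 2020
print(f"{growth ** years:.1f}x traffic over {years} years")  # ~5.8x
```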
-
the broader trends identified by Koomey and colleagues (2011) and Koomey and Naffziger (2015, 2016) are suggestive of the rates of change we would expect to see in networking devices constructed from silicon microprocessors and related components.
So assumptions like Moore's law about increasing energy efficiency can be applied
-
Williams and Tang (2012)estimate the carbon intensity
Oh, so they've gone the other way here
-
Estimates based on specific or state-of-the-art equipment, such as Baliga and colleagues (2009), omit the less efficient legacy equipment (i.e., equipment with higher electricity use per GB of data transferred) in use within country-wide Internet networks, resulting in a substantial underestimate of electricity intensity at the lower end of the observed range (0.004 kWh/GB for 2008).
Ah, so that's why it's so low – they assumed all the network kit was new, shiny and frugal
-
Existing estimates for the electricity intensity of Internet data transmission, for 2000 to 2015, vary up to 5 orders of magnitude, ranging from between 136 kilowatt-hours (kWh)/GB in 2000 (Koomey et al. 2004) and 0.004 kWh/GB in 2008 (Baliga et al. 2009). While increased efficiency over time can account for 2 orders of magnitude of this variation (based on results presented below), alone it does not explain the spread of results.
-
For example, Mayers and colleagues (2014) applied electricity intensity estimates as part of an LCA study comparing different methods of games distribution, concluding that the carbon-equivalent emissions arising from an Internet game download (for an average 8.8-gigabyte [GB] game) were higher than those from Blu-ray Disc distribution in 2010
I still have a hard time getting my head around this
-
This article derives criteria to identify accurate estimates over time and provides a new estimate of 0.06 kWh/GB for 2015.
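Applying the intensity estimates quoted above to the 8.8 GB game download from the Mayers example makes the 5-orders-of-magnitude spread concrete (figures are from the quotes; pairing each estimate with this download is my own illustration):

```python
# kWh per GB, as quoted in the article
intensities = {
    "Koomey et al. 2004 (for 2000)": 136,
    "this article (for 2015)": 0.06,
    "Baliga et al. 2009 (for 2008)": 0.004,
}
game_gb = 8.8  # average game size in the Mayers et al. study
for source, kwh_per_gb in intensities.items():
    # same download, wildly different implied energy use
    print(f"{source}: {game_gb * kwh_per_gb:.3f} kWh per download")
```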
-
-
www.theguardian.com
-
However, the app also collected the information of the test-takers’ Facebook friends, leading to the accumulation of a data pool tens of millions-strong. Facebook’s “platform policy” allowed only collection of friends data to improve user experience in the app and barred it being sold on or used for advertising.
HOLY SHIT
-
-
nataliehanson.com
-
He uses the PURE Method for ease of use.
I like this idea; I'm curious about how they split the steps, and how each step is ranked
-
Let’s acknowledge the five purposes of research:
This is worth comparing to mydleton's list
-
-
martinfowler.com
-
One argument I've heard against this approach is that if everyone did this, then we would run out of pink, sparkly marbles. We'll know this is something to be worried about when women are paid significantly more than men for the same work.
Touché
-
- Feb 2018
-
www.smashingmagazine.com
-
The extraterritorial nature of these two frameworks — they protect the privacy rights of people in Europe regardless of where their data is collected — means that they will become the de facto standard for privacy around the world.
I'm not totally clear on how this would be enforced yet, but jeepers
-
Your privacy testing procedures should predict the ways unauthorized users would access actual data on your system. Would a suspicious search for user data, or an alteration to a record, be logged as a security vulnerability? Is data stored in login cookies? Could someone gain access to data by intentionally triggering an error?
This sounds a lot like threat modelling.
-
Data should be deleted, either automatically or through user actions, when it is no longer needed. Take care to think of how deleted data may still be present in archives and backups. You will also need to work with third parties whom you pass data to or receive it from, such as a SAAS or a cloud service, to ensure that a request for data deletion on your end also removes the data on their end, and to verify that this has been done.
I would love to see what an agreement for this looks like, when Postgres, Cassandra etc. essentially use an append-only log to capture new data
-
The European term “personal data” differs from the American term “personally identifiable information.” The latter pertains to a much more limited set of information than the European model. It also does not see information as contextual, whereas the European framework emphasizes the risks inherent in data aggregation.
Important distinction. This is a useful article
-
-
increment.com
-
Pollution, broadly, is the number one source of unrest and citizen dissatisfaction, and it’s actually an essential threat to the rule of the Chinese Communist Party because they have to do something about it to keep their people content.
Source of unrest too?
-
In the late 1970s, it cost $100 a watt for solar panel material, but the price has dropped 300-fold over the last 40 years. The first 100x price drop didn’t matter because solar was still more expensive than coal or gas. So all through that incredible price drop, people could say, “It’s a toy. It’s never going to make sense.”
TODO: Find the source for this quote
-
-
-
The AutoGrid Flex platform interfaces with a wide variety of IoT devices, from residential to industrial-scale energy applications. In addition to energy-consumption data, typical residential appliances may also provide telemetry about air temperature, humidity, water temperature, and occupancy. Industrial devices often generate a variety of interesting process-specific data, but some of the most common and useful measurements include wind speed, solar irradiance, and thermal limits. These data streams can be leveraged by the AutoGrid machine learning algorithms to enhance forecasting and optimization of flexible energy resources throughout the network.
-
If, for example, an OhmConnect consumer saves one kilowatt hour (kWh) of electricity, the California ISO will reward OhmConnect as if that consumer generated one kWh. OhmConnect in turn passes a significant portion of that savings to its end user.
-
Winn said that solar plant operators can also attach thermal cameras to drones to help identify solar cells that are less efficient, perhaps even broken: A solar cell that’s absorbing all the energy and producing electricity is going to be much cooler than one that is not.
-
The Heila IQ box runs powerful software that presents an abstract view to the operator. Instead of directly controlling the individual assets, the operator describes higher-level goals and constraints such as “reduce emissions” or “avoid using gas-based generators because they are expensive.” Then, as the microgrid is operating, the Heila IQ automatically controls the assets to try to optimize for these goals and satisfy the constraints. Later, if the operator adds new assets to the microgrid, they don’t need to configure the individual assets or try to rebalance the system. As long as they specify the higher-level goals and constraints, the Heila IQ-based microgrid continues to control the assets appropriately.
wow, this is possible now?
-
-
increment.com
-
Smaller data centers—servers stashed in closets or rooms in office buildings under 5,000 square feet—barely apply these efficiency strategies. That’s how small and medium-sized data centers end up consuming 49 percent of the electricity used in U.S. data centers each year, despite owning just 40 percent of the total number of servers, according to a 2014 report by the nonprofit Natural Resources Defence Council (NRDC).
The other argument for cloud. It's like running your own power station in a closet, when you could just pay for it on a meter
-
- Jan 2018
-
www.cennydd.com
-
No more retention scams that allow online signups but demand users phone a call centre to delete their accounts.
Holy caw, this covers opt-out after subscriptions too? Eeeenteresting...
-
-
-
Douglas Hofstadterobserved that, no matter how much work went into developing computer programs to play chess against Grand Masters, the winning program always seemed to be 10 years away
I didn't know this came from chess games
-
Slicing Features into separate functional parts helps us actively manage the scope by creating different implementation options that are often implicit and non-negotiable when we have larger Features in the backlog
On a second read-through, this appears to be key for this to work: decomposing larger units of work into smaller things, but not diving down to the deliver-in-a-day scale of what he's referring to as user stories here
-
At the heart of the rolling wave forecast is the acceptance of uncertainty. This, in turn, allows us to keep our options open. To be flexible and able to respond to change –just like the Agile Manifesto says.
okay, this is a nice way to present it
-
what did you say about rolling wave forecast
Moar new terminology for me…
okay, so it feels a bit like a weird cross between a release plan and a roadmap
-
By forecasting and showing progress (or the lack thereof) very early on, Carmen is bringing the scope discussion to the first few days of the project.
So basically, acknowledge the self-deception both parties played along with to get here, as soon as you can, because it will come up either way
-
Move to Story Points. Even if this is just another way of estimating, getting rid of ‘hours’ and ‘days’ has too many benefits to ignore. We already discussed previously in this book the problems of using time-based metrics, which are an indication of cost, to measure project progress. Even if it’s just another proxy metric for productivity, Story Point-based estimation gives a better understanding of things like risk, complexity, and expected dependencies for each Story. Given that a large amount of the time it takes to deliver one Story is spent waiting, Story Point estimation is more likely to help you assess the true impact of one Story in your project.
Surely when you have story points it's now really hard to compare across teams and projects though, right? A 3 pointer for one team is not a 3 pointer for another.
-
Mandating the maximum calendar duration for an item is also used for User Stories. In my practice I advise teams to have 1-day User Stories. The reason is simple. If you were wrong about the time it takes to develop your User Story you will know it already to
So this is similar to the idea in Reinertsen's book, when he describes the round robin approach if you can't reliably estimate work
-
Both these metrics will help you forecast the progress for your project. While the User Story velocity metric will help you assess when a certain Feature might be ready; the Feature velocity will help you assess when the project might be ready.
This seems to assume that Carmen understands all the technology and the problem domain well enough to split a big feature into meaningful stories of more or less uniform size for devs to deliver. This feels like a different skill set to project management
-
In my research I’ve normally used the progress data from 3 to 5 iterations (or weeks if you are using Kanban/flow based software development) in order to define the initial progress rate. Many expect that you need many more data points before you can make a useful prediction, but that is not the case. Three to 5 iterations are typically enough
The German tank problem referenced as a justification for this is fascinating
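A minimal sketch of forecasting from 3-5 iterations of progress data, as I understand the idea (illustrative only, not the book's actual method):

```python
import math

def forecast_iterations(completed_per_iteration, remaining_items):
    """Project how many more iterations the remaining backlog needs,
    using the average throughput of the iterations observed so far."""
    rate = sum(completed_per_iteration) / len(completed_per_iteration)
    return math.ceil(remaining_items / rate)

# three iterations of observed throughput, 30 stories still in the backlog
print(forecast_iterations([4, 6, 5], 30))  # 6, at an average of 5/iteration
```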
-
“Absolutely correct! In fact you will not know how long the whole project will take until you have either the whole backlog of INVEST Stories sliced up (a bad idea) or until you have enough historical information that you can infer the cycle time for every backlog item, independently of size,” Herman explained
-
Early in each project, your top priority is not to ship something meaningful to your customer, but to obtain information on capacity, throughput, and bac
Okay, this is interesting; there's lots around about optimising for learning, but this is the first time I've seen it explicitly phrased like this
-
Even if each Story may not be “sellable”, it must be testable and final, i.e. the team can make sure that a particular User Story has been successfully completed according to a Definition of Done. This Definition of Done is a litmus test that will allow you to classify tiny parts of the whole project as completed, before the whole project is done.
-
Each Story can be dropped from the project without affecting the overall project delivery.
This seems to contradict the earlier point about E meaning 'essential'. If I can drop a story then surely, it wasn't essential, right?
-
Essential, meaning that every story is absolutely required for the product to be viable. To be Essential, a story must not only be valuable, but its removal must make the product unusable or unsellable. Earlier INVEST definitions included ‘Estimatable’ in the sense that there would be some understanding and specific definition of the story that allowed us to cast an estimate if we wanted to. #NoEstimates focuses on value instead. The goal is to do only what is essential to the project’s success.
I'm struggling with this, as when you're making trade-offs between stories to work on in a given timebox, you'd be deliberately deciding not to have certain things that you've just deemed essential.
-
Gedanken or Gedankenexperiment. Ángel Medinilla, this book’s fantastic illustrator,
Ah, THAT'S where they came from
-
At Toyota, the production engineers would simultaneously start to design the production line and prepare manufacturing long before the new car designs were finished (hence, concurrent engineering), instead of waiting until all decisions about the design were closed. This, in the end, provided Japanese manufacturers with an astonishing competitive advantage that let them design and produce the Toyota Prius in about 3 years, from idea to first sale!
Only 3 years? Cripes
-
Project Management Body of Knowledge (PMBO
AH, this is the PM Book he was mentioning last night
-
But for complex environments, where estimates come mostly from personal experience and knowledge, these estimates will be different for every person. Experts might estimate some work as easy and fast, while novices might estimate the same work as difficult and long lasting. Some team members may see risks and estimate the impact on the schedule, while others may ignore those risks. Hence, in most environments estimates are personal.
And presumably not comparable across teams then, if you're managing a portfolio of projects or products, and trying to work out where to focus your efforts?
-
So, if h(a) is much larger than g(e), the cost of a feature cannot be determined by relative estimation. In turn, this means that the most common estimation approach, Story Point estimation, cannot work reliably.
If this is the second 'social' complexity analysis, and it's a much larger factor, then I missed this part in the talk. Then again telling people to factor in how dysfunctional their org is might be a hard sell in an evening
-
Some researchers have already proposed what a “good” estimate should be. In 1986, they proposed that a good estimation approach would provide estimates “within 25% of the actual result, 75% of the time”.
Okay, this figure is what we need to beat, with Reinertsen's cost of delay question, tracking the cost of the project being 60 days late
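The 1986 criterion is easy to turn into a check over past projects (hypothetical data; my own sketch):

```python
def is_good_estimator(history, tolerance=0.25, hit_rate=0.75):
    """history: list of (estimated, actual) pairs, e.g. in days.
    True if enough estimates landed within tolerance of the actuals."""
    hits = sum(abs(est - act) / act <= tolerance for est, act in history)
    return hits / len(history) >= hit_rate

projects = [(10, 12), (20, 19), (5, 9), (30, 33)]  # (estimated, actual)
print(is_good_estimator(projects))  # True: 3 of 4 within 25%
```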
-
-
www.theguardian.com
-
Among the options available are two for missile alerts, according to the Washington Post. One is labelled “test missile alert”, which will test the notification system is working without actually sending an alert to the public.
Microcopy matters, yo.
-
-
inclusive-components.design
-
For a consistent experience between users, we need to be deliberate and focus() an appropriate element
Deliberate decisions about where focus goes next provide a nicer UX
-
<use xlink:href="#bin-icon">
Ah… so THAT's what the hidden SVG at the beginning of the piece was for
-
Many kinds of users often feel the need to scale/zoom interfaces, including the short-sighted and those with motor impairments who are looking to create larger touch or click targets.
Nice argument for leveling up in SVG
-
In this example, × is used to represent a cross symbol. Were it not for the aria-label overriding it, the label would be announced as “times” or “multiplication” depending on the screen reader in question.
So aria-labels overrule clever submit typography. Useful to know
-
In my version, I just add a minor enhancement: a line-through style for checked items. This is applied to the <label> via the :checked state using an adjacent sibling combinator.
Clever CSS tricks abound in this piece
-
It’s quite valid in HTML to provide an <input> control outside of a <form> element. The <input> will not succeed in providing data to the server without the help of JavaScript, but that’s not a problem in an application using XHR.
Did not know this.
-
all the state information we need is actually already in the DOM, meaning all we need in order to switch between showing the list and showing the empty-state is CSS.
Wow, I never thought of this. It's not as obvious as the approach above though, if you were working on the code base – how expensive is a check for todos.length?
-
If you do use a <section> element, you still need to provide a heading to it, otherwise it is an unlabeled section.
unexpected accessibility gotcha!
-
-
pwdless.github.io
-
Cierge sends a magic link as well as a magic code that a user can manually enter into the login screen to continue as an alternative to clicking the link. Magic codes are short, volatile, & memorable (eg. 443 863). For example, you can look up the code on your phone then enter it into your browser on desktop.
This is the use case for magic codes
-
-
cloudinary.com
-
Background CSS video & Responsive Video can now be a “thing”.
Urk
-
-
spectrum.chat
-
This means our problem with 1% of requests, could affect 20% of pageviews (20 requests x 1% = 20% = ⅕). And 60% of users (3 pages x 20 objects x 1% = 60% ≈ ⅔).
This is one of the counter-intuitive things about large numbers.
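The quote multiplies straight through (20 requests x 1% = 20%), which is a linear upper bound; the exact probability that at least one of n independent requests hits the problem is 1 - (1 - p)**n (my arithmetic, not from the thread):

```python
p = 0.01  # 1% of requests affected
for n, label in [(20, "pageview (20 requests)"),
                 (60, "user (3 pages x 20 objects)")]:
    exact = 1 - (1 - p) ** n
    print(f"{label}: {exact:.1%} chance of being affected")
# ~18.2% per pageview and ~45.3% per user; the quote's linear
# 20% and 60% are upper bounds, close only while n*p stays small
```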
-
-
www.fairphone.com
-
For example, it’s easy to repair, we have a take back program and we’ve researched the best recycling methods. Some time back we also discussed alternative business models for consumers to incentivize take back.
When you place the incentives here, it's in Fairphone's interest to make it easy to service and fix. This is smart. I like it.
-
-
citeseerx.ist.psu.edu
-
Wireless communications has been recognized as a key enabler to the growth of the future economy. There is an unprecedented growth in data volume (10x in last 5 years) and associated energy consumption (20%) in the Information and Communications Technology (ICT) infrastructure. The challenge is how to: meet the exponential growth in data traffic, deliver high-speed wide-area coverage to rural areas, whilst reducing the energy consumed. This paper focuses on the cellular wireless communication aspect, which constitutes approximately 11% of the ICT energy consumption. The paper shows that with careful redesign of the cellular network architecture, up to 80% total energy can be saved. This is equivalent to saving 500 TWh globally and 1.4 TWh in the United Kingdom.
Where is the date for this paper?
-
-
www.aceee.org
-
Data usage on the internet is estimated to be 20,151 PetaBytes per month (Cisco 2011). This is equivalent to 241 billion GB per year. Applying these figures to the average power estimate yields a figure of 5.12 kWh per GB.
Okay, so this is a top down figure, essentially dividing one huge number by another
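Reproducing the top-down division (the traffic and intensity figures are from the quote; the implied total energy is my own back-calculation):

```python
pb_per_month = 20_151
gb_per_year = pb_per_month * 1_000_000 * 12  # 1 PB = 1e6 GB
print(f"{gb_per_year / 1e9:.0f} billion GB per year")  # matches the ~241 billion quoted

kwh_per_gb = 5.12  # their average intensity result
# back-calculating the total energy their figures imply
print(f"implied total: {gb_per_year * kwh_per_gb / 1e9:.0f} billion kWh/yr")
```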
-
An example transmission activity might begin on a desktop computer when an end user requests to download a song.
These next two paras explain pretty much the entire life cycle. Woot!
-
Many people are familiar with Moore’s law, which states that computational speeds are increasing at an exponential pace (Wikipedia 2012). There is also a corollary to this relationship known as Koomey’s law, which states that computational energy efficiency is also increasing at an exponential rate (Koomey 2009).
Koomey's law, the second new law I've come across this week after Wirth's law
-
Our major finding is that the Internet uses an average of about 5 kWh to support the utilization of every GB of data, which equates to about $0.51 of energy costs. Only 38% of those costs are borne by the end-user, while the remaining costs are thinly spread over the global Internet through which the data travels; in switches, routers, signal repeaters, servers, and data centers (See Figure 1 below). This creates a societal “tragedy of the commons,” where end users have little incentive to consider the other 62% of costs and associated resources.
5 kWh per GB in 2012 for the whole system
-
-
www.wildml.com
-
Competition may also come from China, where hardware makers specializing in Bitcoin mining want to enter the Artificial Intelligence focused GPU space.
Weird cryptocurrency dividend. ML will get massively cheaper after the crash?
-
-
www.cr-report.telekom.com
-
Indeed, our national companies further increased their shares of electricity from renewable energy, coming to a total group-wide average of almost 33 percent by the end of 2016.
Is there already a list of all the mobile providers and the energy mix they use?
-
-
www.airbus.com
-
The E-Fan X hybrid-electric technology demonstrator is anticipated to fly in 2020 following a comprehensive ground test campaign, provisionally on a BAe 146 flying testbed, with one of the aircraft’s four gas turbine engines replaced by a two megawatt electric motor. Provisions will be made to replace a second gas turbine with an electric motor once system maturity has been proven.
I wonder what kind of range this would offer
-
-
-
The lower the frequency of the band the further it can travel, so the 800MHz band is the most adept of the three at travelling over long distances, which means users can get a 4G signal even when they’re a long way from a mast. This becomes particularly useful in rural areas where masts are likely to be quite spread out.
Hmm? I assumed 4G was lower range than 3G. From what I read here, 4G can work at longer range with lower capacity, and at shorter range with higher capacity.
But only if the cell phone provider has both low and high frequencies
-
-
eta.lbl.gov
-
Infrastructure Electricity Use for All Scenarios
From ~32 to ~7 billion kWh per year for infra
-
The impact on the installed base in hyperscale data centers is smaller since, on average, one server in a hyperscale data center can replace 3.75 servers in non-hyperscale data centers. This is because servers in hyperscale data centers are assumed to run at roughly 3 times the utilization of non-hyperscale data centers and have no redundancy requirements.
-
The formulas in the “Redundancy” column represent the total number of servers needed for a data center containing N functional servers. For example, redundancy of “N+1” means that there is one redundant server present in each data center, while redundancy of “N+0.1N” means that there is one redundant server for every 10 functional servers. For data centers where the number of redundant servers scales with server count (i.e. closets, mid-tier, and high-end enterprise), consolidation of servers reduces the number of redundant servers required.
I'm not sure how the design-for-failure approach fits into this – it's an implicitly higher N, as you typically build in redundancy at the application level instead
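Unpacking the 3.75x replacement figure quoted above: it reads as 3x utilization times a redundancy factor. The 25% average redundancy overhead below is my inference to make the arithmetic line up (the report's actual formulas vary by space type, e.g. N+0.1N):

```python
utilization_ratio = 3.0      # hyperscale servers run ~3x the utilization
redundancy_overhead = 0.25   # assumed non-hyperscale average (inferred)
replacement = utilization_ratio * (1 + redundancy_overhead)
print(replacement)  # 3.75 servers replaced per hyperscale server
```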
-
The percent decrease in service provider data centers is assumed to be smaller because these data centers tend to have a lower rate of inactive servers due to better management practices that avoid the institutional problems of dispersed responsibility between IT and facility departments which often plagues internal data centers.
Basically, cloud gets better efficiency because they have a very good direct reason to do so
-
Infrastructure savings result from the reduced amount of IT equipment that require cooling and electrical services as well as the decrease in industry-wide average PUE, brought down by the growth in data centers with very low PUE values (i.e., hyperscale data centers).
Where would we be without late-stage surveillance capitalism industrialising servers?
-
Historical Data Center Total Electricity Use
So, in the US at least, and according to this report, it's not as gloomy as it looked before
-
Total Electricity Consumption by Technology Type
First graph I've seen showing the breakdown by tech type. Infra here presumably means HVAC and the like?
-
PUE by Space Type
Handy table
-
Consequently, smaller data centers are still being measured with PUE values greater than 2.0 while large hyperscale cloud data centers are beginning to record PUE values of 1.1 or less.
-
Total Server Installed Base by Data Center Space Category
Everything stable apart from explosive growth in cloud
-
Total U.S. Data Center Network Equipment Electricity Consumption
When I look at this graph, it looks like energy efficiency is outpacing network traffic growth - at least over wired connections
-
Total U.S. Data Center Storage Electricity Consumption
As disks get larger and larger, you need fewer of them, and because you have fewer drives to power, the total energy usage falls
-
The values shown in Table 1 represent the average of active servers, and therefore the inclusion of inactive servers (assumed to be 10% of internal and 5% of service provider and hyperscale data centers) slightly lowers the overall averages.
15-45% difference assumed based on how industrialised the data center is
-
Volume Server Installed Base 2000-2020
This graph shows the projected growth between cloud (non-branded) servers and non-cloud (branded) servers really well
-
Observation of data showed that for any given year, the number of servers in the installed base was more than the sum of the previous 4 years’ shipments, but less than the previous 5.
-
Similar to previous U.S. data center energy estimates, this study uses data provided by the market research firm International Data Corporation (IDC) to derive numbers of data center servers, as well as storage and network equipment, installed in the United States. Power draw assumptions are then applied to the estimated installed base of equipment to determine overall IT equipment energy consumption.
Does IDC publish this data anywhere or it is all private?
-
Figure ES-1 shows that these five scenarios yield an annual saving in 2020 up to 33 billion kWh, representing a 45% reduction in electricity demand when compared to current efficiency trends.
That graph shows the cumulative savings, and you can see the impact cloud (i.e. hyperscale DCs) has
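A quick sanity check on those two figures: if a 33 billion kWh saving is a 45% cut, the implied baseline demand under current trends works out to roughly 73 billion kWh in 2020.

```python
# If saving 33 billion kWh amounts to a 45% reduction versus the
# current-trends baseline, the baseline itself must be:
saving_bkwh = 33
reduction = 0.45

baseline_bkwh = saving_bkwh / reduction       # ~73.3 billion kWh
remaining_bkwh = baseline_bkwh - saving_bkwh  # ~40.3 billion kWh left

print(round(baseline_bkwh, 1))  # 73.3
```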
-
The resulting electricity demand, shown in Figure ES-1, indicates that more than 600 additional billion kWh would have been required across the decade.
How much electricity use has been avoided thanks to energy-saving measures since 2010
-
From 2000-2005, server shipments increased by 15% each year resulting in a near doubling of servers operating in data centers. From 2005-2010, the annual shipment increase fell to 5%, partially driven by a conspicuous drop in 2009 shipments (most likely from the economic recession), as well as from the emergence of server virtualization across that 5-year period. The annual growth in server shipments further dropped after 2010 to 3% and that growth rate is now expected to continue through 2020.
Virtualisation and the move to the cloud mean small-scale, inefficient DCs are less common now?
-
- Dec 2017
-
www.tech-pundit.com www.tech-pundit.com
-
The new high-speed LTE networks that accelerate the mobile Internet require up to three times more data per hour per task compared to the previous slower 3G networks, and thus more energy. And compared to 2G networks, LTE energy consumption is 60 times greater to offer the same coverage.
Holy biscuits. 4G uses 3 times as much energy as 3G per hour, which is in turn 20 times more than 2G for the same coverage.
And 5G is even shorter range than 4G, meaning you need many more transmitters 0_o
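The 20x figure for 3G-vs-2G follows directly from the two quoted ratios, if we treat them as comparable (the source mixes a per-task-hour basis and a same-coverage basis):

```python
# Quoted figures: LTE needs ~3x the energy of 3G per task-hour,
# and ~60x the energy of 2G for the same coverage.
lte_vs_3g = 3
lte_vs_2g = 60

# Implied 3G-vs-2G ratio (assuming the two bases can be chained):
implied_3g_vs_2g = lte_vs_2g / lte_vs_3g
print(implied_3g_vs_2g)  # 20.0
```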
-
-
firebase.google.com firebase.google.com
-
You can use the value event to read a static snapshot of the contents at a given path, as they existed at the time of the event. This method is triggered once when the listener is attached and again every time the data, including children, changes. The event callback is passed a snapshot containing all data at that location, including child data.
So adding a ref too close to the root means the entire snapshot is sent, not just the diff
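A toy model of that behaviour (NOT the Firebase SDK; all names here are invented): a value listener receives the full snapshot of everything under its path on every change, so a listener near the root re-receives the whole tree when one leaf changes.

```python
import copy

class ToyDatabase:
    """Toy stand-in for a tree database with Firebase-style 'value' listeners."""

    def __init__(self):
        self.data = {"users": {"alice": {"score": 1}, "bob": {"score": 2}}}
        self.listeners = []  # (path, callback) pairs

    def on_value(self, path, callback):
        # Like the 'value' event: fires once when the listener is attached...
        self.listeners.append((path, callback))
        callback(self._snapshot(path))

    def set(self, path, value):
        node = self.data
        *parents, leaf = path
        for key in parents:
            node = node[key]
        node[leaf] = value
        # ...and again on every change at or below the listener's path,
        # always with the FULL snapshot at that path, never a diff.
        for listener_path, callback in self.listeners:
            if path[: len(listener_path)] == listener_path:
                callback(self._snapshot(listener_path))

    def _snapshot(self, path):
        node = self.data
        for key in path:
            node = node[key]
        return copy.deepcopy(node)  # contents "as they existed at the time"

db = ToyDatabase()
received = []
db.on_value([], received.append)        # ref at the root
db.set(["users", "alice", "score"], 5)  # change a single leaf...
assert "bob" in received[-1]["users"]   # ...but the whole tree arrives again
```

Listening at a deeper path (e.g. `["users", "alice"]` in this toy model) would keep the re-sent payload down to that subtree.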
-
-
projectsbyif.com projectsbyif.com
-
Projects by IF is a limited company based in London, England. We run this website (projectsbyif.com) and its subdomains. We also use third party services to publish work, keep in touch with people and understand how we can do those things better. Many of those services collect some data about people who are interested in IF, come to our events or work with us. Here you can find out what those services are, how we use them and how we store the information they collect. If you’ve got any questions, or want to know more about data we might have collected about you, email hello@projectsbyif.com This page was published on 25 August 2017. You can see any revisions by visiting the repository on Github.
As you'd expect, IF's privacy page is fantastic
-
- Nov 2017
-
-
A Chinese company has built a 2,000 metric-ton (2,204 tons) all-electric cargo ship, which was launched from the southern Chinese city of Guangzhou in mid-November, according to state-run newspaper People’s Daily.
How does this compare to the ships of the kind produced by Maersk?
-
-
ourworldindata.org ourworldindata.org
-
Average land use area needed to produce one unit of protein by food type, measured in metres squared (m²) per gram of protein over a crop's annual cycle or the average animal's lifetime. Average values are based on a meta-analysis of studies across 742 agricultural systems and over 90 unique foods.
Beef is nearly 6 times the impact of Pork.
This is worth referring to in the background section to provide context on why you need more than just changes to the web
-
-
www.nytimes.com www.nytimes.com
-
We measured the mix of advertising and editorial on the mobile home pages of the top 50 news websites – including ours – and found that more than half of all data came from ads and other content filtered by ad blockers. Not all of the news websites were equal.
This has some good stats on different news pages
-
-
www.irishtimes.com www.irishtimes.com
-
It was also achieved with the support of some of Scotland's biggest industries, including the whisky industry.
The whiskey industry? Was there a campaign to get behind renewables?
-
-
-
This is not good news. It felt like we had reached a turning point, and it turns out not to be the case.
-