- Apr 2020
Once this securitization accelerates, ARR securities will become the next bond-like asset class for both institutions and individuals – irresponsible not to have some in your portfolio, as a fixed income product and a balance against equities. And who will make this market? Sand Hill Sachs.
"As Alex Danco highlighted in his recent article Debt is Coming, it is clear that recurring revenue securitization – the notion of selling your future ARR bookings at a discount – is the future."
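As a toy illustration of what "selling your future ARR bookings at a discount" means in cash terms (a sketch only: the 12% discount rate, the zero-churn assumption, and the $10m figure are my hypotheticals, not Danco's):

```python
def pv_of_arr(arr, annual_discount_rate, months=12):
    """Present value of the next `months` of recurring revenue,
    discounted monthly. Toy model: flat rate, zero churn."""
    monthly_rev = arr / 12
    r = annual_discount_rate / 12
    return sum(monthly_rev / (1 + r) ** m for m in range(1, months + 1))

# A company with $10m of booked ARR sells the next year of revenue,
# priced at a 12% annual discount rate:
print(round(pv_of_arr(10_000_000, 0.12)))
```

The buyer hands over roughly $9.4m today for $10m of contracted revenue; the gap is the financier's return, which is exactly why the instrument behaves like a bond-like fixed income product.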
Federalism => competition between states => one (small) state, S. Dakota, abolishing long-standing laws that prevented usury => growth of trillion dollar credit card debt industry
- But when state leaders, desperate to attract outside businesses during the economic recession of the early 1980s, changed South Dakota's usury laws to eliminate the cap on interest rates and fees, Citibank came calling...The deal was breathtakingly quid pro quo, with then-Gov. Bill Janklow's chief-of-staff leaving to become president and CEO of Citibank South Dakota.
And then the abolition of the rule against perpetuities
In 1983, he abolished the rule against perpetuities and, from that moment on, property placed in trust in South Dakota would stay there for ever. A rule created by English judges after centuries of consideration was erased by a law of just 19 words. Aristocracy was back in the game.
The Tax Justice Network (TJN) still ranks Switzerland as the most pernicious tax haven in the world in its Financial Secrecy Index, but the US is now in second place and climbing fast, having overtaken the Cayman Islands, Hong Kong and Luxembourg since Fatca was introduced. “While the United States has pioneered powerful ways to defend itself against foreign tax havens, it has not seriously addressed its own role in attracting illicit financial flows and supporting tax evasion,” said the TJN in the report accompanying the 2018 index. In just three years, the amount of money held via secretive structures in the US had increased by 14%, the TJN said.
Here is an example from one academic paper on South Dakotan trusts: after 200 years, $1m placed in trust and growing tax-free at an annual rate of 6% will have become $136bn. After 300 years, it will have grown to $50.4tn. That is more than twice the current size of the US economy, and this trust will last for ever, assuming that society doesn’t collapse altogether under the weight of this ever-swelling leech.
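A quick sanity check on that compounding arithmetic (a sketch: at exactly 6% the totals land a bit under the paper's quoted figures, which imply a rate closer to 6.1%, so treat the rate as the assumption here):

```python
def trust_value(principal, annual_rate, years):
    """Future value of a lump sum compounding tax-free once a year."""
    return principal * (1 + annual_rate) ** years

# $1m growing untaxed at a flat 6% per year
for years in (200, 300):
    print(f"after {years} years: ${trust_value(1_000_000, 0.06, years):,.0f}")
```

At a flat 6% this gives on the order of $115bn after 200 years and $39tn after 300; either way, the point stands that exemption from both tax and perpetuity limits turns a modest principal into a nation-scale fortune.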
If the richest members of society are able to pass on their wealth tax-free to their heirs, in perpetuity, then they will keep getting richer than those of us who can’t. In fact, the tax rate for everyone else will probably have to rise, to make up for the shortfall caused by the wealthiest members of societies opting out, which will just make the problem worse. Eric Kades, the law professor at William & Mary Law School, thinks that South Dakota’s decision to abolish the rule against perpetuities for the short term benefit of its economy will prove to have been a long-term catastrophe. “In 50 or 100 years, it will turn out to have been an absolute disaster,” said Kades. “Now we’re going to have a bunch of wealthy families, and no one will be able to piss away that wealth, it will stay in the family for ever. This just locks in advantage.”
All effected by one man
'Summary' (still 30m of reading LOL) of Sapiens; tl;dr: cooking + language + imagination + industrialization => progress/ society => collapse of the family
Some scholars believe there is a direct link between the advent of cooking, the shortening of the human intestinal tract, and the growth of the human brain. Since long intestines and large brains are both massive energy consumers, it’s hard to have both. By shortening the intestines and decreasing their energy consumption, cooking inadvertently opened the way to the jumbo brains.
Yet the truly unique feature of our language is not its ability to transmit information about men and lions. Rather, it’s the ability to transmit information about things that do not exist at all.
The first millennium BC witnessed the appearance of three potentially universal orders, whose devotees could for the first time imagine the entire world and the entire human race as a single unit governed by a single set of laws. Everyone was ‘us’, at least potentially. There was no longer ‘them’.
The first universal order to appear was economic: the monetary order.
The second universal order was political: the imperial order.
The third universal order was religious: the order of universal religions such as Buddhism, Christianity and Islam.
Since all social orders and hierarchies are imagined, they are all fragile, and the larger the society, the more fragile it is. The crucial historical role of religion has been to give superhuman legitimacy to these fragile structures. Religions assert that our laws are not the result of human caprice, but are ordained by an absolute and supreme authority. This helps place at least some fundamental laws beyond challenge, thereby ensuring social stability.
The Industrial Revolution turned the timetable and the assembly line into a template for almost all human activities. Shortly after factories imposed their time frames on human behaviour, schools too adopted precise timetables, followed by hospitals, government offices and grocery stores. Even in places devoid of assembly lines and machines, the timetable became king. If the shift at the factory ends at 5 p.m., the local pub had better be open for business by 5:02.
This modest beginning spawned a global network of timetables, synchronised down to the tiniest fractions of a second. When the broadcast media – first radio, then television – made their debut, they entered a world of timetables and became its main enforcers and evangelists.
The state and the market approached people with an offer that could not be refused. ‘Become individuals,’ they said. ‘Marry whomever you desire, without asking permission from your parents. Take up whatever job suits you, even if community elders frown. Live wherever you wish, even if you cannot make it every week to the family dinner. You are no longer dependent on your family or your community. We, the state and the market, will take care of you instead. We will provide food, shelter, education, health, welfare and employment. We will provide pensions, insurance and protection.’
"What if most rich assholes are made, not born?"
What if the cold-heartedness so often associated with the upper crust—let's call it Rich Asshole Syndrome—isn’t the result of having been raised by a parade of resentful nannies, too many sailing lessons, or repeated caviar overdoses, but the compounded disappointment of being lucky but still feeling unfulfilled? We’re told that those with the most toys are winning, that money represents points on the scoreboard of life. But what if that tired story is just another facet of a scam in which we’re all getting ripped off?
In New York, I’d developed psychological defenses against the desperation I saw in the streets. I told myself that there were social services for homeless people, that they would just use my money to buy drugs or booze, that they’d probably brought their situation on themselves. But none of that worked with these Indian kids. There were no shelters waiting to receive them. I saw them sleeping in the streets at night, huddled together for warmth, like puppies. They weren’t going to spend my money unwisely. They weren’t even asking for money. They were just staring at my food like the starving creatures they were.
The social distance separating rich and poor, like so many of the other distances that separate us from each other, only entered human experience after the advent of agriculture and the hierarchical civilizations that followed, which is why it’s so psychologically difficult to twist your soul into a shape that allows you to ignore starving children standing close enough to smell your plate of curry. You’ve got to silence the inner voice calling for justice and for fairness. But we silence this ancient, insistent voice at great cost to our own psychological well-being.
When volunteers in their studies placed the interests of others before their own, a primitive part of the brain normally associated with food or sex was activated. When researchers measured vagal tone (an indicator of feeling safe and calm) in 74 preschoolers, they found that children who’d donated tokens to help sick kids had much better readings than those who’d kept all their tokens for themselves. Jonas Miller, the lead investigator, said that the findings suggested “we might be wired from a young age to derive a sense of safety from providing care for others.”
Psychologists Dacher Keltner and Paul Piff monitored intersections with four-way stop signs and found that people in expensive cars were four times more likely to cut in front of other drivers, compared to folks in more modest vehicles. When the researchers posed as pedestrians waiting to cross a street, all the drivers in cheap cars respected their right of way, while those in expensive cars drove right on by 46.2 percent of the time, even when they’d made eye contact with the pedestrians waiting to cross. Other studies by the same team showed that wealthier subjects were more likely to cheat at an array of tasks and games. For example, Keltner reported that wealthier subjects were far more likely to claim they’d won a computer game—even though the game was rigged so that winning was impossible. Wealthy subjects were more likely to lie in negotiations and excuse unethical behavior at work, like lying to clients in order to make more money. When Keltner and Piff left a jar of candy in the entrance to their lab with a sign saying whatever was left over would be given to kids at a nearby school, they found that wealthier people stole more candy from the babies.
Books such as Snakes in Suits: When Psychopaths Go to Work and The Psychopath Test argue that many traits characteristic of psychopaths are celebrated in business: ruthlessness, a convenient absence of social conscience, a single-minded focus on “success.” But while psychopaths may be ideally suited to some of the most lucrative professions, I’m arguing something different here. It’s not just that heartless people are more likely to become rich. I’m saying that being rich tends to corrode whatever heart you’ve got left. I’m suggesting, in other words, that it’s likely the wealthy subjects who participated in Muscatell’s study learned to be less unsettled by the photos of sick kids by the experience of being rich—much as I learned to ignore starving children in Rajasthan so I could comfortably continue my vacation.
“What we’ve been finding across dozens of studies and thousands of participants across this country,” said Piff, “is that as a person’s levels of wealth increase, their feelings of compassion and empathy go down, and their feelings of entitlement, of deservingness, and their ideology of self-interest increases.”
Institutions seeking to justify a fundamentally anti-human economic system constantly rebroadcast the message that winning the money game will bring satisfaction and happiness. But we’ve got around 300,000 years of ancestral experience telling us it just isn’t so. Selfishness may be essential to civilization, but that only raises the question of whether a civilization so out of step with our evolved nature makes sense for the human beings within it.
An insight into how social networks work
Let's begin with two principles: (1) People are status-seeking monkeys; (2) people seek out the most efficient path to maximizing social capital.
Why do some large social networks suddenly fade away, or lose out to new tiny networks? What ties many of these explanations together is social capital theory. Classic network effects theory [that a network’s utility increases with the number of users who use it] still holds; I’m not discarding it. Instead, let's append some social capital theory. Together, those form the two axes on which I like to analyze social network health. When modeling how successful social networks create a status game worth playing, a useful metaphor is one of the trendiest technologies: cryptocurrency.
How is a new social network analogous to an ICO?
(1) Each new social network issues a new form of social capital, a token.
(2) You must show proof of work to earn the token.
(3) Over time it becomes harder and harder to mine new tokens on each social network, creating built-in scarcity.
(4) Many people, especially older folks, scoff at both social networks and cryptocurrencies.
Perhaps you've read a long and thoughtful response by a random person on Quora or Reddit, or watched YouTube vloggers publishing night after night, or heard about popular Vine stars living in houses together, helping each other shoot and edit 6-second videos. While you can outsource Bitcoin mining to a computer, people still mine for social capital on social networks largely through their own blood, sweat, and tears.
Almost every social network of note had an early signature proof of work hurdle. For Facebook it was posting some witty text-based status update. For Instagram, it was posting an interesting square photo. For Vine, an entertaining 6-second video. For Twitter, it was writing an amusing bit of text of 140 characters or fewer. Pinterest? Pinning a compelling photo. You can likely derive the proof of work for other networks like Quora and Reddit and Twitch and so on. Successful social networks don't pose trick questions at the start; it’s usually clear what they want from you.
If you've ever joined one of these social networks early enough, you know that, on a relative basis, getting ahead of others in terms of social capital (followers, likes, etc.) is easier in the early days. Some people who were featured on recommended follower lists in the early days of Twitter have follower counts in the 7-figures, just as early masters of Musical.ly and Vine accumulated massive and compounding follower counts. The more people who follow you, the more followers you gain because of leaderboards and recommended follower algorithms and other such common discovery mechanisms.
Young people, with their much higher usage rate on social media, are the most sensitive and attuned demographic to the payback period and ROI on their social media labor. So, for example, young people tend not to like Twitter but do enjoy Instagram.
It's not that Twitter doesn't dole out the occasional viral supernova; every so often someone composes a tweet that goes over 1K and then 10K likes or retweets (Twitter should allow people to buy a framed print of said tweet with a silver or gold 1K club or 10K club designation to supplement its monetization). But it’s not common, and most tweets are barely seen by anyone at all. Pair that with young people's bias towards, and skill advantage in, visual mediums over textual ones, and it's not surprising Instagram is their social battleground of preference (video games might be the most lucrative battleground for the young if you broaden your definition of social networks, and that's entirely reasonable, though that arena skews male).
The gradient of your network's social capital ROI can often govern your market share among different demographics. Young girls flocked to Musical.ly in its early days because they were uniquely good at the lip-sync dance routine videos that were its bread and butter. In this age of never-ending notifications, heavy social media users are hyper aware of differing status ROI among the apps they use.
TikTok is an interesting new player in social media because its default feed, For You, relies on a machine learning algorithm to determine what each user sees; the feed of content from creators you follow, in contrast, is hidden one pane over. If you are new to TikTok and have just uploaded a great video, the selection algorithm promises to distribute your post much more quickly than if you were sharing it on a network that relies on the size of your following, which most people have to build up over a long period of time. Conversely, if you come up with one great video but the rest of your work is mediocre, you can't count on continued distribution on TikTok since your followers live mostly in a feed driven by the TikTok algorithm, not their follow graph.
Why copying proof of work is lousy strategy for status-driven networks… Most clones have and will fail. The reason that matching the basic proof of work hurdle of a Status as a Service incumbent fails is that it generally duplicates the status game that already exists. By definition, if the proof of work is the same, you're not really creating a new status ladder game, and so there isn't a really compelling reason to switch when the new network has no one in it.
I once wrote about social networks that the network's the thing; that is, the composition of the graph once a social network reaches scale is its most unique quality. Copying some network's feature often isn’t sufficient if you can’t also copy its graph, but if you can apply the feature to some unique graph that you earned some other way, it can be a defensible advantage. Nothing illustrates this better than Facebook's attempts to win back the young from Snapchat by copying some of the network's ephemeral messaging features, or Facebook's attempt to copy TikTok with Lasso, or, well, Facebook's attempt to duplicate just about every social app with any traction anywhere. The problem with copying Snapchat is that, well, the reason young people left Facebook for Snapchat was in large part because their parents had invaded Facebook. You don't leave a party with your classmates to go back to one your parents are throwing just because your dad brings in a keg and offers to play beer pong.
I think the Stories format is a genuine innovation on the social modesty problem of social networks. That is, all but the most egregious showoffs feel squeamish about publishing too much to their followers. Stories, by putting the onus on the viewer to pull that content, allows everyone to publish away guilt-free, without regard for the craft that regular posts demand in the ever escalating game that is life publishing. In a world where algorithmic feeds break up your sequence of posts, Stories also allow gifted creators to create sequential narratives.

In the annals of tech, and perhaps the world, the event that created the greatest social capital boom in history was the launch of Facebook's News Feed. Before News Feed, if you were on, say, MySpace, or even on Facebook before News Feed launched, you had to browse around to find all the activity in your network. Only a demographic of a particular age will recall having to click from one profile to another on MySpace while stalking one’s friends. It almost seems comical in hindsight that we'd impose such a heavy UI burden on social media users. Can you imagine if, to see all the new photos posted in your Instagram network, you had to click through each profile one by one to see if they’d posted any new photos? I feel like my parents talking about how they had to walk miles to grade school through winter snow wearing moccasins of tree bark when I complain about the undue burden of social media browsing before the News Feed, but it truly was a monumental pain in the ass.
By merging all updates from all the accounts you followed into a single continuous surface and having that serve as the default screen, Facebook News Feed simultaneously increased the efficiency of distribution of new posts and pitted all such posts against each other in what was effectively a single giant attention arena, complete with live updating scoreboards on each post. It was as if the panopticon inverted itself overnight, as if a giant spotlight turned on and suddenly all of us performing on Facebook for approval realized we were all in the same auditorium, on one large, connected infinite stage, singing karaoke to the same audience at the same time.
It's difficult to overstate what a momentous sea change it was for hundreds of millions, and eventually billions, of humans who had grown up competing for status in small tribes, to suddenly be dropped into a talent show competing against EVERY PERSON THEY HAD EVER MET.
Incidentally, teens and twenty-somethings, more so than the middle-aged and elderly, tend to juggle more identities. In middle and high school, kids have to maintain an identity among classmates at school, then another identity at home with family. Twenty-somethings craft one identity among coworkers during the day, then another among their friends outside of work. Often those spheres have differing status games, and there is some penalty to merging those identities. Anyone who has ever sent a text meant for their schoolmates to their parents, or emailed a boss or coworker something meant for their happy hour crew knows the treacherous nature of context collapse.
Add to that this younger generation's preference for and facility with visual communication and it's clear why the preferred social network of the young is Instagram and the preferred messenger Snapchat, both preferable to Facebook. Instagram because of the ease of creating multiple accounts to match one's portfolio of identities, Snapchat for its best-in-class ease of visual messaging privately to particular recipients. The expiration of content, whether explicitly executed on Instagram (you can easily kill off a meme account after you've outgrown it, for example), or automatically handled on a service like Snapchat, is a must-have feature for those for whom multiple identity management is a fact of life.
Many types of social capital have qualities which render them fragile. Status relies on coordinated consensus to define the scarcity that determines its value. Consensus can shift in an instant. Recall the friend in Swingers, who, at every crowded LA party, quips, "This place is dead anyway." Or recall the wise words of noted sociologist Groucho Marx: "I don't care to belong to any club that will have me as a member."
The Groucho Marx effect doesn't take effect immediately. In the beginning, a status hierarchy requires lower status people to join so that the higher status people have a sense of just how far above the masses they reside. It's silly to order bottle service at Hakkasan in Las Vegas if no one is sitting on the opposite side of the velvet ropes; a leaderboard with just a single high score is meaningless.
Snapchat— Snapchat opens to a camera. If you want to text someone, it's extra work to swipe to the left pane to reach the text messaging screen. Remember Snapchat's original Best Friends list? I'm going to guess many of my readers don't, because, as noted earlier, old people probably didn't play that status game, if they'd even figured out how to use Snapchat by that point. This was just about as pure a status game feature as could be engineered for teens. Not only did it show the top three people you Snapped with most frequently, you could look at who the top three best friends were for any of your contacts. Essentially, it made the hierarchy of everyone's “friendships” public, making the popularity scoreboard explicit.
As with aggregate follower counts and likes, the Best Friends list was a mechanism for people to accumulate a very specific form of social capital. From a platform perspective, however, there's a big problem with this feature: each user could only have one best friend. It put an artificial ceiling on the amount of social capital one could compete for and accumulate. In a clever move to unbound social capital accumulation and to turn a zero-sum game into a positive sum game, broadening the number of users working hard or engaging, Snapchat deprecated the very popular Best Friends list and replaced it with streaks.
Social Arbitrage— Because social networks often attract different audiences, and because the configuration of graphs even when there are overlapping users often differ, opportunities exist to arbitrage social capital across apps. A prominent user of this tactic was @thefatjewish, the popular Instagram account (his real name was Josh Ostrovsky). He accumulated millions of followers on Instagram in large part by taking other people's jokes from Twitter and other social networks and then posting them as his own on Instagram. Not only did he rack up followers and likes by the millions, he even got signed with CAA! When he got called on it, he claimed it wasn't what he was about. He said, "Again, Instagram is just part of a larger thing I do. I have an army of interns working out of the back of a nail salon in Queens. We have so much stuff going on: I'm writing a book, I've got rosé. I need them to bathe me. I've got so many other things that I need them to do. It just didn't seem like something that was extremely dire." Which is really a long, bizarre way of saying, you caught me. Let he who does not have an army of interns bathing them throw the first stone.
Whenever I think of entrepreneurship, I'm drawn to a few of the slides (23, 24, 34) from Eric Schmidt's "How Google Works"
First you have to attract your smart creatives. They aren't easily fooled.
This starts with culture. Smart creatives need to care about the place they work.
Never forget that hiring is the most important thing you do.
I also think back to how Jack Welch (setting aside what he later became infamous for, and GE's final act) was famous for spending more than 50% of his time:
getting the right people in the right places and then helping them to thrive. He would involve himself in hiring decisions that most global CEOs would delegate.
Taken from a graduation address delivered at West Point, which is just so good and worth quoting many sections of at length
That’s really the great mystery about bureaucracies. Why is it so often that the best people are stuck in the middle and the people who are running things—the leaders—are the mediocrities? Because excellence isn’t usually what gets you up the greasy pole. What gets you up is a talent for maneuvering. Kissing up to the people above you, kicking down to the people below you. Pleasing your teachers, pleasing your superiors, picking a powerful mentor and riding his coattails until it’s time to stab him in the back. Jumping through hoops. Getting along by going along. Being whatever other people want you to be, so that it finally comes to seem that, like the manager of the Central Station, you have nothing inside you at all. Not taking stupid risks like trying to change how things are done or question why they’re done. Just keeping the routine going.
We have a crisis of leadership in America because our overwhelming power and wealth, earned under earlier generations of leaders, made us complacent, and for too long we have been training leaders who only know how to keep the routine going. Who can answer questions, but don’t know how to ask them. Who can fulfill goals, but don’t know how to set them. Who think about how to get things done, but not whether they’re worth doing in the first place. What we have now are the greatest technocrats the world has ever seen, people who have been trained to be incredibly good at one specific thing, but who have no interest in anything beyond their area of expertise. What we don’t have are leaders.
What we don’t have, in other words, are thinkers. People who can think for themselves. People who can formulate a new direction: for the country, for a corporation or a college, for the Army—a new way of doing things, a new way of looking at things. People, in other words, with vision.
That’s the first half of the lecture: the idea that true leadership means being able to think for yourself and act on your convictions. But how do you learn to do that? How do you learn to think? Let’s start with how you don’t learn to think. A study by a team of researchers at Stanford came out a couple of months ago. The investigators wanted to figure out how today’s college students were able to multitask so much more effectively than adults. How do they manage to do it, the researchers asked? The answer, they discovered—and this is by no means what they expected—is that they don’t. The enhanced cognitive abilities the investigators expected to find, the mental faculties that enable people to multitask effectively, were simply not there. In other words, people do not multitask effectively. And here’s the really surprising finding: the more people multitask, the worse they are, not just at other mental abilities, but at multitasking itself.
One thing that made the study different from others is that the researchers didn’t test people’s cognitive functions while they were multitasking. They separated the subject group into high multitaskers and low multitaskers and used a different set of tests to measure the kinds of cognitive abilities involved in multitasking. They found that in every case the high multitaskers scored worse. They were worse at distinguishing between relevant and irrelevant information and ignoring the latter. In other words, they were more distractible. They were worse at what you might call “mental filing”: keeping information in the right conceptual boxes and being able to retrieve it quickly. In other words, their minds were more disorganized. And they were even worse at the very thing that defines multitasking itself: switching between tasks.
Concentrating, focusing. You can just as easily consider this lecture to be about concentration as about solitude. Think about what the word means. It means gathering yourself together into a single point rather than letting yourself be dispersed everywhere into a cloud of electronic and social input. It seems to me that Facebook and Twitter and YouTube—and just so you don’t think this is a generational thing, TV and radio and magazines and even newspapers, too—are all ultimately just an elaborate excuse to run away from yourself. To avoid the difficult and troubling questions that being human throws in your way. Am I doing the right thing with my life? Do I believe the things I was taught as a child? What do the words I live by—words like duty, honor, and country—really mean? Am I happy?
So it’s perfectly natural to have doubts, or questions, or even just difficulties. The question is, what do you do with them? Do you suppress them, do you distract yourself from them, do you pretend they don’t exist? Or do you confront them directly, honestly, courageously? If you decide to do so, you will find that the answers to these dilemmas are not to be found on Twitter or Comedy Central or even in The New York Times. They can only be found within—without distractions, without peer pressure, in solitude.
“Your own reality—for yourself, not for others.” Thinking for yourself means finding yourself, finding your own reality. Here’s the other problem with Facebook and Twitter and even The New York Times. When you expose yourself to those things, especially in the constant way that people do now—older people as well as younger people—you are continuously bombarding yourself with a stream of other people’s thoughts. You are marinating yourself in the conventional wisdom. In other people’s reality: for others, not for yourself. You are creating a cacophony in which it is impossible to hear your own voice, whether it’s yourself you’re thinking about or anything else. That’s what Emerson meant when he said that “he who should inspire and lead his race must be defended from travelling with the souls of other men, from living, breathing, reading, and writing in the daily, time-worn yoke of their opinions.” Notice that he uses the word lead. Leadership means finding a new direction, not simply putting yourself at the front of the herd that’s heading toward the cliff.
So solitude can mean introspection, it can mean the concentration of focused work, and it can mean sustained reading. All of these help you to know yourself better. But there’s one more thing I’m going to include as a form of solitude, and it will seem counterintuitive: friendship. Of course friendship is the opposite of solitude; it means being with other people. But I’m talking about one kind of friendship in particular, the deep friendship of intimate conversation. Long, uninterrupted talk with one other person. Not Skyping with three people and texting with two others at the same time while you hang out in a friend’s room listening to music and studying. That’s what Emerson meant when he said that “the soul environs itself with friends, that it may enter into a grander self-acquaintance or solitude.”
Introspection means talking to yourself, and one of the best ways of talking to yourself is by talking to another person. One other person you can trust, one other person to whom you can unfold your soul. One other person you feel safe enough with to allow you to acknowledge things—to acknowledge things to yourself—that you otherwise can’t. Doubts you aren’t supposed to have, questions you aren’t supposed to ask. Feelings or opinions that would get you laughed at by the group or reprimanded by the authorities.
This is what we call thinking out loud, discovering what you believe in the course of articulating it. But it takes just as much time and just as much patience as solitude in the strict sense. And our new electronic world has disrupted it just as violently. Instead of having one or two true friends that we can sit and talk to for three hours at a time, we have 968 “friends” that we never actually talk to; instead we just bounce one-line messages off them a hundred times a day. This is not friendship, this is distraction.