The stories in Broken Stars span from short-shorts to novellas, evoking every hue on the emotional spectrum, and demonstrating the vibrancy and diversity of science fiction being written in China.
The anthology features works of hard science fiction, cyberpunk, science fantasy, and space opera, as well as genres with deeper ties to Chinese culture: alternate Chinese history, chuanyue time travel, and satire with historical and contemporary allusions.
Some of the included authors are already familiar to readers in the West (Liu Cixin and Hao Jingfang, both Hugo winners); some are publishing in English for the first time.
In addition, three essays at the end of the book explore the history of Chinese science fiction publishing, the state of contemporary Chinese fandom, and how the growing interest in science fiction in China has impacted writers who had long labored in obscurity.
“Goodnight, Melancholy” by Xia Jia
“The Snow of Jinyang” by Zhang Ran
“Broken Stars” by Tang Fei
“Submarines” by Han Song
“Salinger and the Koreans” by Han Song
“Under a Dangling Sky” by Cheng Jingbo
“What Has Passed Shall in Kinder Light Appear” by Baoshu
“The New Year Train” by Hao Jingfang
“The Robot Who Liked to Tell Tall Tales” by Fei Dao
“Moonlight” by Liu Cixin
“The Restaurant at the End of the Universe: Laba Porridge” by Anna Wu
“The First Emperor’s Games” by Ma Boyong
“Reflection” by Gu Shi
“The Brain Box” by Regina Kanyu Wang
“Coming of the Light” by Chen Qiufan
“A History of Future Illnesses” by Chen Qiufan
“A Brief Introduction to Chinese Science Fiction and Fandom” by Regina Kanyu Wang
“A New Continent for China Scholars: Chinese Science Fiction Studies” by Mingwei Song
“Science Fiction: Embarrassing No More” by Fei Dao
For more Chinese SF in translation, check out Invisible Planets.
Publisher: Tom Doherty Associates
Product dimensions: 5.40(w) × 8.30(h) × 1.80(d)
About the Author
As an undergraduate, Xia Jia (a pen name that should be treated as an indivisible unit) majored in Atmospheric Sciences at Peking University. She then entered the Film Studies Program at the Communication University of China, where she completed her master's thesis: "A Study on Female Figures in Science Fiction Films." Later, she obtained a Ph.D. in Comparative Literature and World Literature at Peking University, with "Fear and Hope in the Age of Globalization: Contemporary Chinese Science Fiction and Its Cultural Politics (1991–2012)" as the title of her dissertation. She now teaches at Xi'an Jiaotong University.
She has been publishing fiction since college in a variety of venues, including Science Fiction World and Jiuzhou Fantasy. Several of her stories have won the Yinhe ("Galaxy") Award and Xingyun ("Nebula") Award, China's most prestigious science fiction honors. In English translation, she has been published in Clarkesworld and Upgraded. Her first story written in English, "Let's Have a Talk," was published in Nature in 2015.
"Goodnight, Melancholy" won the 2016 Yinhe Award. Like much of Xia Jia's recent fiction, it belongs to a loosely connected series called "The Chinese Encyclopedia." These stories take place in the same near-future universe, where ubiquitous AI, VR, AR, and other technologies present age-old questions about how and why we remain human in new forms, and tradition and modernity are not simple binary opposites, but partners in a complicated dance.
More of Xia Jia's fiction and nonfiction may be found in Invisible Planets.

CHAPTER 2
I remember the first time Lindy walked into my home.
She lifted her tiny feet and set them down gingerly on the smooth, polished wooden floor, like a child venturing onto freshly fallen snow: trembling, hesitating, afraid to dirty the pure white blanket, terrified of sinking into and disappearing beneath the featureless fluff.
I held her hand. Her soft body was stuffed with cotton, and the stitches, my own handiwork, weren't very neat. I had also made her a scarlet felt cape, like the ones in the fairy tales I had read as a child. Her two ears were of different lengths, and the longer one drooped, as though dejected.
Seeing her, I couldn't help but remember all the experiences of failure in my life: eggshell puppets that I had ruined during crafts class; drawings that didn't look like what they were supposed to be; stiff, awkward smiles in photographs; chocolate pudding burnt to charcoal; failed exams; bitter fights and breakups; incoherent classroom reports; papers that were revised hundreds of times but ultimately were unpublishable ...
Nocko turned his fuzzy little head to regard us, his high-speed cameras scanning, analyzing Lindy's form. I could almost hear the computations churning in his body. His algorithms were designed to respond only to speaking subjects.
"Nocko, this is Lindy." I beckoned him over. "Come say hi."
Nocko opened his mouth; a yawn-like noise emerged.
"Behave." I raised my voice like a mother intent on discipline.
Reluctantly, Nocko muttered to himself. I knew that this was a display intended to attract my affection and attention. These complicated, pre-formulated behaviors were modeled on young children, but they were key to the success of language-learning robots. Without such interactive behavior feedback, Nocko would be like a child on the autistic spectrum who cannot communicate meaningfully with others despite mastering a whole grammar and vocabulary.
Nocko extended a furry flipper, gazed at me with his oversized eyes, and then turned to Lindy. The designer had given him the form of a baby white seal for a reason: anybody who saw his chubby cheeks and huge, dark eyes couldn't help but let down their guard and feel the impulse to give him a hug, pat his head, and tell him, "Awww, so good to meet you!" Had he been made to resemble a human baby, the uncanny valley would have filled viewers with dread at his smooth, synthetic body.
"Hel-lo," he said, enunciating carefully, the way I had taught him.
"That's better. Lindy, meet Nocko."
Lindy observed Nocko carefully. Her eyes were two black buttons, and the cameras were hidden behind them. I hadn't bothered to sew a mouth for her, which meant that her facial expressions were rather constrained, like a princess who had been cursed to neither smile nor speak. I knew, however, that Lindy could speak, but she was nervous because of the new environment. She was being overwhelmed by too much information and too many choices that had to be balanced, like a complicated board situation in weiqi in which every move led to thousands of cascading future shifts.
My palm sweated as I held Lindy's hand; I felt just as tense.
"Nocko, would you like Lindy to give you a hug?" I suggested.
Pushing off the floor with his flippers, Nocko hopped a few steps forward. Then he strained to keep his torso off the floor as he spread his foreflippers. The corners of his mouth stretched and lifted into a curious and friendly grin. What a perfect smile. I admired him silently. What a genius design. Artificial intelligence researchers in olden times had ignored these nonlinguistic interactive elements. They had thought that "conversation" involved nothing more than a programmer typing questions into a computer.
Lindy pondered my question. But this was a situation that did not require her to give a verbal answer, which made the computation much easier for her. "Yes" or "no" was binary, like tossing a coin.
She bent down and wrapped two floppy arms around Nocko.
Good, I said to myself silently. I know you crave to be hugged.
During the last days of his life, Alan Turing created a machine capable of conversing with people. He named it "Christopher."
Operating Christopher was a simple matter. The interlocutor typed what they wished to say on a typewriter, and simultaneously, mechanisms connected to the keys punched patterns of holes into a paper tape that was then fed into the machine. After computation, the machine gave its answer, which was converted by mechanisms connected to another typewriter back into English letters. Both typewriters had been modified to encode the output in a predetermined, systematic manner, e.g., "A" was replaced by "S," and "S" was replaced by "M," and so forth. For Turing, who had broken the Enigma code of the Third Reich, this seemed nothing more than a small linguistic game in his mystery-filled life.
No one ever saw the machine. After Turing's death, he left behind two boxes of the records of the conversations he had held with Christopher. The wrinkled sheets of paper were jumbled together in no apparent order, and it was at first impossible for anyone to decipher the content of the conversations.
In 1982, an Oxford mathematician, Andrew Hodges, who was also Turing's biographer, attempted to break the code. However, since the encryption code used for each conversation was different, and the pages weren't numbered or marked with the date, the difficulty of decryption was greatly increased. Hodges discovered some clues and left notes, but failed to decipher the contents.
Thirty years later, to commemorate the one hundredth anniversary of Turing's birth, a few MIT students decided to take up the challenge. Initially, they tried to brute force a solution by having the computer analyze every possible set of patterns on every page, but this required enormous resources. In this process, a woman named Joan Newman observed the original typescript closely and discovered subtle differences in the abrasion patterns of keys against paper on different pages. Taking this as a sign that the typescript was produced by two different typewriters, Newman came up with the bold hypothesis that the typescript represented a conversation between Turing and another interlocutor conducted in code.
These clues easily led many to think of the famous Turing test. But the students initially refused to believe that it was possible, in the 1950s, for anyone to create a computer program capable of holding a conversation with a person, even if the programmer was Alan Turing himself. They designated the hypothetical interlocutor "Spirit" and made up a series of absurd legends around it.
In any event, Newman's hypothesis suggested shortcuts for future code-breakers. For instance, by finding repetitions in letter patterns and grammatical structures, they attempted to match up pages in the typescript to find questions and their corresponding answers. They also attempted to use lists of Alan Turing's friends and family to guess the name of the interlocutor, and eventually, they found the ciphertext for the name "Christopher" — possibly a reference to Christopher Morcom, the boy Turing had loved when he was sixteen. The young Alan and Christopher had shared a love of science and observed a comet together on a cold winter night. In February of 1930, Christopher, aged only eighteen, died from tuberculosis.
Turing had said that code-breaking required not only clever logical deduction, but also intuitive leaps, which were sometimes more important. In other words, all scientific investigations could be understood to be a combination of the exercise of the dual faculties of intuition and ingenuity. In the end, it was Newman's intuition and the computer's cleverly programmed logic that solved the riddle left by Turing. From the deciphered conversations, we learned that "Christopher" was no spirit, but a machine, a conversation program written by Turing himself.
A new question soon presented itself — could Turing's machine truly respond like a human being? In other words, did Christopher pass the Turing test?
iWall was mostly dark, save for a few blinking numbers in the corner notifying me of missed calls and new messages, but I had no time to look at them. I was far too busy to bother with social obligations.
A small blue light lit up, accompanied by a thudding noise as though someone was knocking. I looked up and saw a bright line of large text across iWall.
5:00 PM. TIME TO TAKE A WALK WITH LINDY.
The therapist told me that Lindy needed sunlight. Her eyes were equipped with photoreceptors that precisely measured the daily dose of ultraviolet radiation she received. Staying cooped up in the house without outdoor activity wasn't good for recuperation.
I sighed. My head felt heavy, cold, like a lead ball. Taking care of Nocko was already taking a lot out of me, and now I had to deal with — no, no, I couldn't complain. Complaining resolved nothing. I had to approach this with a positive attitude. No mood was the simple result of external events, but the product of our understanding of external events at the deepest level. This cognitive process often happened subconsciously, like a habit, and was finished before we even realized it was happening. Often we would fall into the clutches of some mood but could not explain why. To change the mood then by an act of will was very difficult.
Take the same half-eaten apple: some would be delighted upon seeing it, but others would be depressed. Those who often felt despondent and helpless had become habituated to associating the remains of a whole apple with all other losses in life.
It was no big deal; just a stroll outside. We'd be back in an hour. Lindy needed sunlight, and I needed fresh air.
I could not summon up the energy to put on makeup, but I also didn't want everyone to stare at my slovenly appearance after staying cooped up at home for the last few days. As a compromise, I tied my hair into a ponytail, put on a baseball cap, pulled on a hoodie and a pair of sneakers. The hoodie I had bought at Fisherman's Wharf in San Francisco: "I ♥ SF." The texture and colors reminded me of that summer afternoon long ago: seagulls, cold wind, boxes of cherries for sale by the wharf, so ripe that the redness seemed to ooze.
I held Lindy's hand tightly, exited the apartment, rode the elevator down. The tubes and iCart made life easier. To go from one end of the city to the other, to go directly from one high-rise to another, required less than twenty minutes. In contrast, to get out of my building and walk outside required far more effort.
Overcast sky. Light breeze. Very quiet. I walked toward the park behind the building. It was May and the bright spring flowers had already wilted, leaving behind only pure green. The air was suffused with the faint fragrance of black locust trees.
Very few people were in the park. On a weekday afternoon, only the very old and very young would be outside. If one compared the city to an efficient, speedy machine, then they lived in the nooks and crannies of the machine, measuring space with their feet rather than the speed of information. I saw a little girl with pigtails learning to walk with the help of an iVatar nanny. She held the iVatar's thin, strong fingers with her chubby fists, looking at everything around her. Those dark, lively eyes reminded me of Nocko. As she toddled along, she lost her balance and fell forward. The iVatar nanny nimbly grabbed her and held her up. The girl squealed with delight, as though enjoying the new sensations. Everything in the world was new to her.
Opposite the little girl, an old woman in an electric wheelchair looked up, staring sleepily at the laughing figure for a few seconds. The corners of her mouth drooped, perhaps from moroseness, or perhaps from the weight of the years she had lived through. I couldn't tell her age — these days, practically everyone was long-lived. After a while, the woman lowered her eyes, her fingers gently cradling her head with its sparse crown of white hair, as though falling asleep.
I had the abrupt feeling that the old woman, myself, and the girl belonged to three distinct worlds. One of those worlds was speeding toward me while the other was receding farther and farther away. But from another perspective, I was the one slowly strolling toward that dark world from which no one ever returned.
Lindy shuffled her feet to keep up with me without saying anything, like a tiny shadow.
"The weather is nice, isn't it?" I whispered. "Not too hot, and not too cold. Look, dandelions."
Next to the path, numerous white fuzzy balls swayed in the breeze. I held Lindy's hand, and we stood there observing them for a while, as though trying to decipher the meaning of those repetitious movements.
Meaning was not reducible to language. But if it couldn't be spoken about, how could it exist?
"Lindy, do you know why you're unhappy?" I said. "It's because you think too much. Consider these wild seeds. They have souls also, but they don't think at all. All they care about is dancing with their companions in joy. They couldn't care less where they're blown by the wind."
Blaise Pascal said, "Man is only a reed, the weakest in nature, but he is a thinking reed." However, if reeds could think, what a terrifying existence that would be. A strong wind would fell all the reeds. If they were to worry about such a fate, how would they be able to dance?
Lindy said nothing.
A breeze swept through. I closed my eyes, and felt my hair flapping against my face. Afterward, the seed balls would be broken, but the dandelions would feel no sorrow. I opened my eyes. "Let's go home."
Lindy remained where she was. Her ear drooped. I bent down to pick her up and walked back toward the building. Her tiny body was far heavier than I imagined.
In a paper titled "Computing Machinery and Intelligence" published in the journal Mind in October of 1950, Turing considered the question that had long troubled humans: "Can machines think?" In essence, he transformed the question into a new question: "Can machines do what we (as thinking entities) can do?"
For a long time, many scientists firmly held to the belief that human cognition was distinguished by certain characteristics unattainable by machines. Behind the belief was a mixture of religious faith as well as theoretical support from mathematics, logic, and biology. Turing's approach bypassed unresolvable questions such as the nature of "thinking," "mind," "consciousness," "soul," and similar concepts. He pointed out that it is impossible for anyone to judge whether another is "thinking" except by comparison of the other with the self. Thus, he proposed a set of experimental criteria based on the principle of imitation.
Imagine a sealed room in which are seated a man (A) and a woman (B). A third person, C, sits outside the room and asks questions of the two respondents in the room with the purpose of determining who is the woman. The responses come back in the form of typed words on a tape. If A and B both attempt to convince C that they are the woman, it is quite likely that C will guess wrong.
If we replace the man and the woman inside the room with a human (B) and a machine (A), and if after multiple rounds of questions, C is unable to distinguish which of A and B is the machine, does that mean that we must admit that A has the same intelligence as B?

(Continues…)
Excerpted from "Broken Stars"
Copyright © 2019 Ken Liu.
Excerpted by permission of Tom Doherty Associates.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.