Pencil

Reader

Read the latest posts from Pencil.

from Yu-Gi-Oh! COLORS

The harbour was bustling with activity when Yukari descended from the ship. There was a mix of all kinds of people as far as the eye could see, but she seemed to be the only teenager disembarking from a fishing boat rather than one of the cruise ships. After taking in the scene for a few seconds, she sighed and sat on one of the rusty bollards near the ship, fishing in her bag for a deck of cards.

The deck box was made of dented, scratched brown leatherette. Inside was a mix of old cards with frayed, creased edges alongside a few brand new ones in seemingly pristine condition. She took out the entire stack and fished around for one of them, leaving the rest back inside as a faint tug rose at the corners of her lips. She then silently stared at it until her father came down.

“Well, Yuu,” he took a deep breath. “I want to wish you the best of luck, but I also want you to fail hard so you can come back home with us. Crazy, isn’t it?” he said, putting his hands on his hips. Yukari’s father was a brawny, tanned man with long black hair and a short, unkempt beard. He put a hand on Yukari’s shoulder as his grinning eyes softened.

“Don’t worry about good ol’ pops, come on. As much as it hurts, kids need to leave their parents’ nest one day. What matters is that you have the courage to put your own foot forward for yourself.”

Yukari shook her head. She opened her mouth, but then closed it without saying anything. Her father took her deck case, slid a bank note inside and put it back in her hands. “Here, go buy some energy drinks and a candy bar. You need to be pumped for the exam!”

He gave her a hug. “I’ll be back at 6pm. Now go show those freshwater toads how we do it in the south!” He winked and walked away. Once he had boarded the ship again, she waved at him one last time and left for the train station. The sky was still a bright shade of pink when she boarded the train.

Even at the earliest hours of the day, Yukari was impressed by how densely packed the trains became. While not many people had boarded at the harbour, it took no time for all the seats to be taken and for people to lean back against the doors of the train as they pulled out their smartphones to play a mobile game or watch a livestream. Still, the only thing she could hear inside the car was the muffled announcement of the upcoming stations.

At one point, she noticed a baby calico cat with a huge head crying as it wandered up and down the car. As the cat came to Yukari, she reflexively lifted it up, noticing a collar with the name Milano written on it. The back of the tag had a phone number on it, but no name.

She tried dialling, but there was no response. A few seconds later, a notification flashed on her screen.

«what ya cookin sis nobody makes calls nowadays www!!!» «u sure glad i was @ twitter cuz my phone is always on silence _('ω'_U⌒)シ»

The sender was listed as @YuunaruYuuna, and the profile picture depicted a middle schooler with blond hair and purple eyes.

«I'm sorry, I don't really text much,» she typed in response. «I found your cat! Are you on the train from Tokyo Harbour?»

The girl on the other side posted an emoticon with its hand on its chin. A second later, the typing notification flashed frantically.

«wait milano???» «you can see milano!!???» «im kettink there brb tont mohe!»

A few seconds later, the middle-schooler appeared in front of her. Before exchanging any words with Yukari, she looked the cat straight in the face, frowned, stuck her tongue out at it and then laughed, before taking it in her own arms. The cat quickly clung onto her shoulder, dangling from her back with a pair of tiny wings Yukari had failed to notice up until that moment.

“How come you can see Milano, anyway...!” she then spouted at Yukari. Seeing her shrink in fear, the girl laughed it off. “No, I mean, it's not like it's BAD that you can see him — it's just that you're, like, the first person that can see him, you know...?” “What do you mean?” Yukari blurted. “He's here, I can touch him. He has… Weight.” “Well, you see — almost nobody can see this bag of fluff right here. Sometimes people look at me like I'm crazy, and I can tell that they do, but you said it yourself. He's like very chonki,” Yuuna explained as she pretended to heave the cat, “and loves to cling everywhere. Which is why I had to give him a collar with my LINE ID — and the phone number, just in case.” “Hmm.” “But anyway, sis, where'cha goin'?”

Yukari's ears flushed a soft shade of pink against her tanned skin. She rubbed the back of her head and looked around in mild embarrassment.

“I'm- Well, you know this card game people play on TV for money and stuff-” “Yeah, Duel Monsters. Don't be shy, sis, everyone plays that game nowadays. I heard the mayor is like one of the big fish in the Duel Monsters industry and won't let you apply for residency unless you have an I-Three-D card!”

The Industrial Illusions ID card, usually abbreviated as I3D card in text form, was an ID created by Kaiba Corporation that allowed duelists to use their duel disks and sign up for tournaments. Whenever a duelist purchased a duel disk, they would be prompted to insert an existing I3D card into a certain slot, whose location varied across different iterations of the device. Yukari, having only sloppily read through the extensive rules and regulations of the school she was applying to, felt her heart sink as she realised she might be denied entrance to the building if she failed to show an I3D card.

“Wait, so I cannot take the entrance exam for Duel Academia without one of those IDs!? I-I thought you only needed your regular ID...!”

Despite her best efforts, Yuuna couldn't stop herself from snickering at Yukari's desolate eyes and half-gaping mouth.

“Oh, come now. You got a duel disk, don't ya?” “I, uh-”

Yukari took an old, battered duel disk from her bag and handed it to Yuuna. It had a metallic colour throughout and looked like a small UFO plate with a sharp wing on both sides. Being the first commercially available model to have been produced, its age showed through its burst LCD screen, dusty ventilation holes and severely scratched casing. Despite Yukari's best efforts, she barely managed to get it in a state where it was no longer revolting to the touch. Yuuna couldn't help but wonder whether the duel disk worked at all.

“Yeah… Even this fossil must have NFC, so don’t sweat it.” She handed it back to Yukari, who put it back in her bag. A muffled voice announced that the next stop would be Domino City. “So anyway!” Yuuna announced, as they walked through the station tunnels, “name’s Yuuna. What about ya?” “Y-Yukari Kajiki. Nice to meet you.” “Come on, sis, skip the formalities. Anyone who can see Milano is bestie material.”

Once the train had reached Domino City, most of the people who had boarded the early morning train got off. Everyone, Yuuna included, started setting their duel disks on their arms, revealing the wide variety of designs that had been released over the course of the years. Yuuna herself took what seemed like a red tablet PC out of her backpack and put it on top of her forearm, causing two metal arms to spring from beneath and tightly grip her arm. She then prompted Yukari to do the same with her UFO-shaped device.

The station was full of holographic advertisements featuring fantasy creatures, most of which seemed completely foreign to Yukari. Judging from the art style, she had a vague notion that they were Duel Monsters creatures, but she could only recognise a few of them. There were also posters hanging on the walls, depicting a pale young man wearing a black coat and announcing an upcoming tournament to be held at the Kaibaland amusement park.

“And what’s yer engine, bestie?” “I- Uh- I-It's a Ritual/Xyz WATER deck.” “That's not an engine,” Yuuna lamented. “Like, what are the core cards in yer deck?” “I… Well, I use A Legendary Ocean to boost my monsters’ attack because they are all WATER monsters. Also, I have Leviath Dragon to help me out of a tough spot...” “...so you gathered a buncha' cards with some loose synergy out of boosters to make your deck, didn't ya?” “Yes,” Yukari admitted in a lamenting tone. “But… That’s how you make decks, right? You open boosters!” “Yer gonna waste a ton o’ money by doing that, bestie. Yer meant to buy the cards you need! Like, on the internet, or in a card game shop!” Yukari’s face turned pale as Yuuna explained. “Okay, okay. Let's start from the start, right? Did ya get a structure deck when you were first starting?” “A what now? This is all stuff from back when my dad played the game. That is why the duel disk is so badly knackered...” she admitted, her voice gradually becoming quieter as she went on. “He told me to go to Absolute Zero because they had this event where you can open packs and keep whatever cards you wanted, to make a deck and stuff.”

Yuuna figured Absolute Zero must have been a local game shop.

“Yeah, that's draft. But imo you shoulda also got, like, three copies of a structure deck to jumpstart yourself.” “Three copies...!? But one of them alone costs ¥1500 already! I am just a beginner!” “Yer eyes say 'no', but ya heart says 'I been thinking about it so much I know exactly how much it costs'.” “Ok, fine! I may have been thinking about it! Not that I can do much to help it now.” She checked the time on her phone. It was a light blue feature phone with a clamshell design. “Anyway, I should get going to-”

Yuuna grabbed Yukari's arm to get her attention, pointing to a mom-and-pop shop where a middle-aged woman was sweeping the ground. The shop had a sign with a wooden turtle beside the entrance, showing the name «Kame Game» in stylised, upper-case English letters along with the opening hours. As the sweeping woman lifted her eyes, she waved at Yuuna with a smile on her face.

“Yukari, check out this place. Kame Game is part of Duel Monsters' history! I'm sure the King of Games himself's telling you to buy that structure deck you're longing after.” “What!? But Yuuna, I will be late to the entrance exam!” “It's ok, Yukari, the mayor is probably giving one of his boring speeches about how rizzsome and poggers studying is! That guy's always spouting lofty nonsense about being a smartypants and whatnot, so we got plenty a' time.” “Okay, fine, whatever…”

The walls of the shop were full of posters and prints promoting the newest Duel Monsters sets, along with framed newspaper cut-outs showcasing a young boy with spiky hair and a childlike grin on his face. The shelves and cabinets were chock-full of flat boxes displaying all manner of brands and logos Yukari had never heard of before. She also noticed sleeved Duel Monsters cards with gold lettering and a shiny finish, displayed like trophies in glass cabinets. As they advanced to the counter at the deep end of the room, she noticed a plastic display, not dissimilar to those used in newsstands to hold sweets, stuffed with booster packs, and an array of plastic packages on a wire frame rack. However, there was nobody on the other side of the counter. Yuuna, who stood only about a head above the counter, hopped in a hopeless attempt to see whether someone was crouching behind it.

“Should we be here?” Yukari asked, uncomfortable. “The owners are going to think we have stolen something...”

“It's fine, they're sick of seeing good ol' me. I may look like a crime in the making, but I'm a good girl very down below.” She flashed an exaggerated smile at Yukari.

“We're all good very down below, you know?” a deep voice called out from another room. A tall, blond man with a five o’clock shadow came into the shop through a back door and closed it behind him. “And you, Yuuna, what's with those manners!? No 'good morning', no 'excuse me'?” “I greeted Mrs. Mutou!” “What about good ol' me?” “Excuse us for coming in unprompted, and nice to meet you,” Yukari interjected, bowing deeply. The blond man behind the counter rubbed the back of his neck sheepishly, only now realising Yuuna had brought someone else with her. “See, Yuuna, this is more like it.” He crossed his arms. “Well, I guess not that formal, since you're Mai's kid and all... Anyway, nice to meet you. Name's Jonouchi.” He pointed at himself with his thumb. “Yukari Kajiki, Mr. Jonouchi. Nice to meet you.” “Please, don't be so uptight! Yuuna's friends are the shop's friends, so just call me Joey.” “Anyway, Joey! Yukari has no idea about deck-building, and she was trying to take the Duel Academia entrance exams today!” “Wait a second, Yuuna – the opening speech must have begun already! Why on Earth did you bring your friend here!?” “She said her engine is a Ritual/Xyz WATER deck! The examiners are going to eat her alive! Plus, who cares about that pompous guy with the white limo...?”

Jonouchi failed to hide a malicious grin after hearing the last remark.

“Yeah, well, but still! The entrance exam's not the world championship, ya know? She'll probably be fiiiine.”

Yuuna nudged Yukari with her elbow.

“Actually... There is something we have been discussing while riding the train.” She then remembered the words of her therapist, prepared the sentence in her mind and spat it out in one go: “do you have Ice Barrier of the Frozen Prison, by any chance?”

Yuuna and Jonouchi stared at one another, dumbfounded. Rumour had it that fans had asked Industrial Illusions to make an Ice Barrier structure deck as a joke, as the archetype had nothing good going for it besides its ace monsters. It was hard to deny the innocent enthusiasm Yukari showed as she mustered the courage to ask about a structure deck she had thought about long enough to recite its full name by heart, but they were uneasy about recommending it, as there was little one could salvage out of it.

Hesitant, Jonouchi took three ice-blue boxes with dragons plastered on the front cover from behind the counter, slapping them onto the desk. “Now, I don't know the contents of the structure decks by heart, but I can definitely give you a bunch of staples to go with them,” he muttered, taking a thick, white binder off a shelf behind him. “Wait a second, Mr Jono-” Yukari tried to interject. “Mr. Joey, I cannot just go and spend ¥4500 on cards like nothing!”

Jonouchi pressed the binder against his chest. “You should play Vanguard instead, it is even cheaper than Duel Monsters.” “But Joey, there must be something we can do to improve her deck on a budget! Yukari, show him your deck!”

Yukari handed Jonouchi the stack of cards. As he looked through them one by one, a wide smile blossomed across his face; he sometimes stopped at particular cards and shook his head. Once he had put the deck down on the counter, he wiped away a tear with the edge of his hand.

“Kajiki, huh… I knew that name was familiar from somewhere,” Jonouchi muttered. “Let’s make a deal, okay? I will take you to the entrance exam-” Yuuna lifted her fist in silence. “And I am going to watch your duel, Yukari. If you get admitted, these cards are yours, and we will talk about the Ice Barrier deck you’ve been dreaming about.” “I can see it now,” Yuuna interjected. “We gonna be roommates in no time flat!” “Wait, roommates!?” Yukari asked. “You are also going to take the entrance exam?”

Yuuna nodded energetically, a dazzling smile across her face.

“That's why I need to make sure you're in top shape, Yukari! We're going to be top students, and when we graduate, they'll have us face each other in an epic climax! It'll be so cool!” “It is a barebones beat-down full of singles,” Jonouchi explained, “but I guess she worked her ass off to come up with this strategy under such tight restrictions, so it should work.”

Jonouchi threw his car keys at Yuuna, who promptly left the shop. Before Yukari could follow her, Jonouchi patted her shoulder, and handed her a pack of purple card sleeves.

“Yukari, I know it wasn't in your plans to hit a small shop in the middle a' nowhere on such an important day, but I am very happy to have met Ryouta's kid. You sure have stirred the pot of old memories quite a bit by coming here.” “Ryouta? Do you know my dad?” “Yeah. How would you know, right? I guess I was still a third-rate duelist with a fourth-rate deck, back in the Battle City days.”

 
Read more...

from Bunker Labs

by Chloé VULQUIN

In recent times, as an industry, we've been building systems that are progressively less likely to fail: from Erlang's internal retries, to formally verified languages, to Rust's borrow checker. These all have a place. However, it does not follow that the projects and languages outside this category have no place.

Making systems that are resistant to failure (a fail-free system is physically impossible; everything can and will go wrong) is not free. The costs are many and come from various places. This type of software is harder and takes longer to write. The languages intended to make it easier require particular styles and restrictions to make it possible. Such systems require additional resources to run, and are much more complex to tune: since a perfect system is impossible, the specific tuning is left to configuration time. They demand more of the writer, the user, and the environment.

Furthermore, attempts to make systems fail less can actually make them fail more. The loss of a quorum is a common example in early HA (High Availability) SQL setups. There's nothing wrong anywhere; all that happened was a minor ping latency hiccup. A packet arrived late, or was retried a few times, and by the time it made it over, the nodes had decided that quorum was lost. What you end up with is a soft failure, where everything is running but needs manual intervention to run as intended. Were it a smaller, simpler system, this issue would not have been noticed in any way. The attempt to make something more failure-resistant has made it more sensitive to other types of environmental issues. This is another cost of these systems.

As with any tradeoff, then, it's important to make a cost-benefit analysis to determine if this particular tradeoff is worthwhile in the circumstances at hand. Let's start with the benefits.

Why would you want a system that doesn't fail, or at least fails less often? Well, for one, it's annoying when your system is down! Perhaps that's enough to justify the above costs if your system fails all the time. Then again, if your system fails all the time, you might have other problems. No, it would take something bigger, a real cost to failure.

These fail-resistant systems are obviously critical in applications like medicine, aerospace, and more, where failure is not an acceptable option. You definitely don't want a program ensuring your safety and ability to operate a rocket to fail due to unexpected user input (say, because it is built on Chromium and JavaScript), or to fail to deploy an automated parachute. The cost of failure in these cases is a human cost. As such, it's perfectly acceptable to sink a lot of effort into avoiding it.

Moving to somewhat less dramatic pastures, sometimes the cost is not human, but monetary. Software businesses and businesses relying on software for mission-critical operations can attribute a real and direct cost to every second of downtime. The Amazons and the Facebooks and the Fords of the world can perform an elementary analysis of dollars per second of downtime vs dollars per added safety feature. It is then no wonder that it is often these large companies that are searching for ever more developers for these languages, people experienced in building failure-resistant systems. A Rust developer likely won't even ask for more money than a C++ one, even if they probably should 🦀.

Finally, sometimes, the cost is legal. Sure, no one's going to die if the service dies randomly once every year. Sure, you're not losing tangible money out of it, or even intangible. But gosh darn, you signed that paper that promised a given SLA, so now you have to deliver it. Your hands are tied, and it hardly matters how you increase availability, but you've got to.

You'll notice that conspicuously missing from all of these scenarios is the small scale with low stakes. If you're hosting an RSS reader for your friends, whether it goes down once a year or three times a year doesn't matter – you're probably rebooting the single machine it's on more often than that. Oh, sure, it could go down even less if you set up HA, but why bother? No one will die, you won't lose any money, hell chances are no one will even notice, including you!

The cost-benefit analysis simply doesn't pan out. Most people already don't have HA set up for their personal or homelab services; the cost in resources and maintenance is higher than they're willing to accept for the little benefit gained. At those scales, performance, too, hardly matters. Or, more accurately, throughput is irrelevant, while latency is crucial. Have I mentioned that making systems failure-resistant incurs a latency penalty?

So why then are the people running their own services often running “production-ready” “professional” “HA” systems? There are a few answers. Firstly, the cost is not visible to the user. They can just not enable the HA features (well, sometimes, anyway). And since they're not contributing to the software, nor are they the author, they do not see the increased maintenance cost. Additionally, oftentimes they won't be aware of any alternatives, if they even exist at all. The personal cost of writing something new is much higher than the cost of dealing with whatever is already out there, while smaller already-written systems will often be personalized or too small to be easily discovered. This does not, however, mean that it's not worth doing.

In the meanwhile, what pushes people to write software that is failure-resistant? Here too, the answers are many, but none of them are difficult. The most obvious one is that failure conditions are bad; nobody likes them. At the same time, humans are notoriously bad at estimating costs, time requirements, and the like. It's a common joke that engineers will simply answer “it will be done once it is done”, and have no input to provide beyond this. An author is disproportionately likely to underestimate the cost that they are making themselves pay in advance, and therefore highly likely to never perform a cost-benefit analysis for their own use case.

Sometimes, though, there is an analysis going on, though it is of a different nature. Sometimes, people write software not because they want to use it, but because they hope to get a job, or build a portfolio, or for someone else's needs. The biggest demand for software comes from corporations, and for the reasons mentioned above, corporations like HA failure-resistant systems, and are the entities most likely to hire you. As such, if you're writing the software for any of the above reasons, you're also much more likely to make them failure-resistant.

Because of all of this, most pieces of infrastructural software, where the stakes are low for most, medium for some, but never of a human cost, tend to be failure-resistant, and an absolute pain to run. Authoritative DNS servers, email servers, collaboration software, and more: all of these suffer from this effect. Consequently, many people will never host their own authoritative DNS, or their own email, and so on. This in turn creates space for corporations to provide these as services for others. Since the name of the game is convenience, those aren't always performant, or even configured in a failure-resistant way, putting us back at square one. The price of this societal-level issue ends up being paid by those who do not have the time to maintain software that is, as a category, needlessly obtuse; they pay either monetarily, or by not having access to such functionality at all.

So in conclusion, and as advice – don't blindly make things failure-resistant. Perform a cost-benefit analysis, avoid underestimating the costs to yourself, and if it is justified, optimize, but not prematurely. It's ok for your software to fail sometimes; it's ok to focus on the happy code path; sometimes, failing might just make the world better.

This is precisely one of the reasons Bunker Labs posts are the way they are. In practice, failure-resistance is bolted on after the fact. When it isn't, you had better be getting paid well for it (again, Rust devs, ask for higher salaries, I am not kidding). The final advantage of building systems that fail is that you still built a system: you learned more about the subject and improved in your craft. That's what I hope to help achieve with what I write, at least on here. You can potentially use what I write in the real world, but maybe write your own instead. :)

 
Read more...

from Bunker Labs

by Chloe Kudryavtsev

People are terrified of parsers and parsing. To the point of using magical libraries with custom syntaxes you have to learn just to get started. In the hopes of completely shattering this preconception, I will write a parser for the “ini” file format in about 150 lines of pure and readable ISO C99. Furthermore, this parser will be something that's nice to use and has error-correcting features, such that it is actually useful and usable outside this example setting.

Design

There's no standard for the ini format, and all the existing implementations disagree, so let's take some liberties, XKCD 927 style. No newlines in keys, values, or section names. Empty values are not allowed. Comments only on their own lines (minus whitespace). Whitespace-insensitive (whitespace at the start of line, end of line, around the “=”, is all ignored). No need for a terminating newline either. Oh that's more than most C ini parsers do? Isn't that convenient.

To get the jargon out of the way, the parser will be recursive-descent LL(1).

The API will have a single entry point. You call parse_ini against a FILE*, with optional userdata (just any context the API user wants to have available in their callback), and a required callback. The callback will be called for every key-value pair, and it will receive the section, key, and value, alongside the userdata. All of the strings in question will be temporary private data, and must be copied over (for simplicity). If a given terminal does not fit inside the private data, the given value is truncated, but the parsing continues without error.
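
To make the shape of that API concrete, here's a minimal usage sketch (the header name ini.h and the callback name print_pair are placeholders for this example, not part of the library itself):

#include <stdio.h>
#include "ini.h" // placeholder single-header name

// called once per key-value pair; return non-zero to stop parsing early
int print_pair(const char *section, const char *key, const char *value, void *userdata) {
    (void)userdata; // no extra context needed for this example
    printf("[%s] %s = %s\n", section, key, value);
    return 0;
}

int main(void) {
    FILE *f = fopen("config.ini", "r");
    if (!f) return 1;
    int handled = parse_ini(f, NULL, print_pair);
    fclose(f);
    return handled < 0; // negative means the stream errored mid-parse
}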

Implementing

Let's start by writing the top-level function. We macro-define the max lengths. This is done firstly so that they can be changed (by surrounding them with ifdefs later), and secondly because we're allocating temporary memory. Typical parsers already operate in quadratic time; adding countless allocations and reallocations for a use-case where all values are typically small isn't reasonable.

#define INI_SEC_MAXLEN 64
#define INI_KEY_MAXLEN INI_SEC_MAXLEN
#define INI_VAL_MAXLEN (INI_KEY_MAXLEN * 16) // parenthesised so the expansion stays safe inside larger expressions

Now we can write the top-level parse_ini function. The effective PEG for it is ini <- expr*. We start from the top-level because we have a specific idea for the UX, so we need to write the rest of the functions in a way as to ensure we can deliver on it.

// if the callback returns non-zero, parsing will stop
typedef int (*callback)(const char*, const char*, const char*, void*);
// we return the number of bytes handled, sort of, you'll see
int parse_ini(FILE *src, void *userdata, callback cb) {
    char section[INI_SEC_MAXLEN] = {0};
    char key[INI_KEY_MAXLEN] = {0};
    char value[INI_VAL_MAXLEN] = {0};
    int status, out = 0;
    // we stop going whenever we fail to consume any data, explicitly error,
    // the stream errors, or the stream ends
    while ((status = parse_expr(src, userdata, section, key, value, cb)) >= 0) {
        out += status;
        if (feof(src) || ferror(src)) break;
    }
    return ferror(src) ? -out : out;
}

Nothing too complicated here, you just iterate the stream until something makes us stop. We return negative values in the case of an irrecoverable stream error that isn't EOF. We get into the real meat in parse_expr.

Note that all the future functions and definitions are static, but I'm omitting that keyword (as well as the occasional inline for brevity – you don't need to see it in this case).

In parse_expr, we want to parse an expression. In an ini file, you have comments, section declarations, and key-value pairs. We also want to skip whitespace. So let's do exactly that. The effective PEG is expr <- ws* (section / comment / kv).

int parse_expr(FILE *src, void *userdata, char *section, char *key, char *value, callback cb) {
    int len = parse_skipws(src);
    if (len) return len; // to avoid confusing byte counts, we're in a loop anyway

    int c;
    // figure out the next expression
    switch ((c = fgetc(src))) {
        case EOF:
            return 0; // let the outer loop figure out if this was an error or not
        case '[': // a section
            return parse_section(src, section);
        case '#':
        case ';':
            return parse_skipuntil(src, "\n");
        default: // key-value pair
            ungetc(c, src); // we need to conserve this one
            return parse_kv(src, userdata, section, key, value, cb);
    }
}

Note: we don't need to ungetc in sections or comments, since the only real use of the character is to identify the type of expression. Before we get into the implementations of parse_section and parse_kv, let's write the utility functions we'll need, such as parse_skipuntil.

Let's start with parse_skipwhile and parse_skipuntil, which will skip characters while a condition holds, or until a condition occurs. We'll define a condition as the next character being in a secondary argument (a string), a check we can perform using strchr. Neither of these have direct PEG alternatives, since this approach is closer to a lexer technique, which we can utilize precisely because we're writing a hand parser.

int parse_skipuntil(FILE *src, const char *s) {
    int out = 0;
    for (int c; (c = fgetc(src)) != EOF; out++) {
        if (strchr(s, c)) return out;
    }
    return ferror(src) ? -out : out;
}
int parse_skipwhile(FILE *src, const char *s) {
    int out = 0;
    for (int c; (c = fgetc(src)) != EOF; out++) {
        if (!strchr(s, c)) {
            ungetc(c, src);
            return out;
        }
    }
    return ferror(src) ? -out : out;
}

Note that they're almost the same. The primary notable difference is the negation of the strchr call, and that skipwhile does an ungetc (skipuntil consumes the terminator).

We can now define parse_skipws. It's actually quite simple.

// the characters we consider to be whitespace
const char wss[] = " \t\r\n";
#define parse_skipws(src) parse_skipwhile(src, wss)

Before we move on to the juicy stuff, we'll implement one more thing: parse_until. Semantically, it's the same as parse_skipuntil, but it will actually write what it reads into a string pointer, up to its maximum length. Once that maximum length is reached, it falls back to behaving like parse_skipuntil for the remainder. You could do the same thing with parse_skipwhile too.

int parse_until(FILE *src, char *ptr, ssize_t maxlen, const char *s) {
    int out = 0, c;
    while (out < maxlen) {
        c = fgetc(src);
        if (c == EOF) { // hit EOF or a stream error while scanning
            *ptr = 0;
            return ferror(src) ? -out : out;
        } else if (strchr(s, c)) {
            *ptr = 0;
            return out;
        }
        (*ptr++) = c; out++;
    }
    // we only make it here if we hit maxlen
    (*--ptr) = 0;
    int skipped = parse_skipuntil(src, s); // consume the rest of the oversized token, terminator included
    if (skipped > 0) {
        return out + skipped; // errors are negative, eof is ok
    }
    return ferror(src) ? (skipped - out) : (out - skipped);
}

We can now write parse_section and such!

#define parse_section(src, section) parse_until(src, section, INI_SEC_MAXLEN, "]\n")

Yeah, that's all there is to it. Note that we also allow terminating with a newline. This way, if someone forgot a “]”, we can still parse it. Though it does break comments on the same line.
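
As a quick illustration, a made-up input like the one below would still come out with the section recognised as general, since the newline stands in for the forgotten “]”; the key-value pair after it parses as usual:

[general
host = localhost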

Let's also write parse_key and parse_value, since we'll need them in parse_kv in a second.

int parse_key(FILE *src, char *key) {
    int out = parse_until(src, key, INI_KEY_MAXLEN, "=\n");
    return stripright(key, wss);
}
int parse_value(FILE *src, char *value) {
    int out = parse_until(src, value, INI_VAL_MAXLEN, "\n");
    return stripright(value, wss);
}

First of all, we do the same thing with the key that we did with the section: if a key is not terminated by an “=”, presume the newline is the “=”. This way, in the absolute worst-case scenario, two key-value pairs get corrupted, and the rest parses fine, though it may be a good idea to check the value of “out” too.

Secondly, you'll note the “stripright”. We already discard leading whitespace in parse_expr, but to strip trailing whitespace after the key (so before the “=”) and the value (so before the “\n”), we'll need to perform string brain surgery.

It's not too bad though, look:

// returns the number of output bytes
int stripright(char *c, const char *s) {
    ssize_t len = strlen(c); // length of the string being stripped
    if (!len) return len; // already empty
    while ((--len) >= 0 && strchr(s, c[len])) {}
    // either strchr failed or len is now -1
    if (len < 0) {
        *c = 0;
        return 0;
    }
    c[++len] = 0;
    return len;
}

We can now finish the parser out with the most complex function in it: parse_kv, with an incredible 12 lines of code, including a callback. The closest PEG analogue is kv <- key ws* value (where's the “=”? parse_key takes care of it, and of the whitespace before it, for us).

int parse_kv(FILE *src, void *userdata, const char *section, char *key, char *value, callback cb) {
    int len = 0, tmp;

    tmp = parse_key(src, key); // consumes the =
    // if the key doesn't have a value or errored, we can't continue
    if (tmp <= 0 || feof(src)) return 0;
    len += tmp;

    tmp = parse_skipws(src); // the whitespace after the "="
    // if the value would have been empty, it ends up using the next line as the value
    // it may not error or eof, but it may be empty
    if (tmp < 0 || feof(src)) return 0;
    len += tmp;

    tmp = parse_value(src, value);
    // any errors are fine, since we've finished parsing now
    len += tmp > 0 ? tmp : -tmp;

    // let callback request terminating the parse by returning non-zero
    if (cb(section, key, value, userdata)) len *= -1;
    return len;
}

And that's it, you have a functioning parser.

Discussion

To explain what's going on here on a less “here's the code, enjoy” level, we need to explain some of the jargon.

A grammar is the definition of the language being parsed. One may intuitively presume that the grammar for a given language or format is merely itself, i.e. that the BNF form is canonical. However, this is not the case. A grammar is actually dependent upon the parser in use. To discuss the platonic ideal of the language being recognized, we use the term “language”.

Any given language can have an arbitrary number of grammars describing it. These grammars are considered functionally equivalent. Since the language rarely if ever specifies expected resolutions to edge-conditions (if it did, it would be a “deterministic” language), grammars can be functionally equivalent while exhibiting different behaviors.

For example, let's consider the Lua language. The Lua language includes the following rules:

prefixexp <- var | functioncall | '(' exp ')'
functioncall <- prefixexp args | prefixexp ':' Name args
var <- Name | prefixexp '[' exp ']' | prefixexp '.' Name

This grammar is left-recursive. This means that a category of parsing algorithms cannot parse it! There are a number of reformulations that you can apply to achieve a different grammar that does not have the left-recursion problem, which will then recognize the Lua language successfully, but may have different edge-case behaviors (that are nowhere in the spec).
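
For illustration, one common reformulation (a sketch of the general technique, not the grammar any Lua implementation actually ships) folds the left-recursive rules into a primary expression followed by repeatable suffixes:

prefixexp <- primary suffix*
primary <- Name / '(' exp ')'
suffix <- args / ':' Name args / '[' exp ']' / '.' Name

A var is then a prefixexp whose last suffix is an index or field access, and a functioncall is one whose last suffix is a call; same language, different grammar, different edge-case behavior.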

In short, there are a few reasons that parsing is a mess, and none of those reasons are actually resolvable by parser generators. For example, you will need to modify your grammar to get it to run on, say, tree-sitter, or antlr4, or Janet's PEGs.

A lot of the problem and solution space actually exists in the grammar definition, language design, and implementation flexibility. In this project, I set out to demonstrate all three.

For instance, you may notice that there is essentially zero error-reporting in the entire library. This is because I built the grammar in such a way that most inputs that are likely to actually occur are valid! Those that aren't will typically occur at the end of the file, and thus not affect the overall utility. Similarly, LL(1) is typically lexed, but I built the grammar in such a way as to be able to do LL(1) scannerless.

To summarize, to be good at parsing, you do not need to be an academic wizard, or to know all of parsing theory. You need to be aware of the typical failure modes and make your own job easier by building in the flexibility you will then immediately use. The only exception is when you're implementing a strict specification that does specify what the edge-conditions do. Thankfully, those virtually do not exist, since specification writers are often aware of this. This is why the BNF forms of formal languages are so seemingly useless – they're there for your benefit, like undefined behavior in C, except actually beneficial.

Conclusion

Write your parsers, it can be fun and easy if you don't make it hard for yourself. The “true” (not edited for the article) sources of this tiny ini parsing library are available here in single-header form. It also contains an implementation of parse_while, just for you, in case you happen to be building a parser that can take advantage of the same things I did (i.e. that is an LL(1)-compatible grammar that doesn't need a lexing phase; though you'll likely want to pass around a context struct instead of three bare pointers). It can also do heap allocation, so you know.

Hopefully, this has taught you something, and you'll reach for parser generators less. I do find it amusing how I ended up publishing this one before irex (a from scratch “irregular” expression educational library). It may or may not be the next article, presuming I actually finish it.

P.S. This blog will eventually move to the bunkerlabs.net domain... somewhere. There's currently an OIDC identity provider in the works that should be trivial to self-host that will be used to authenticate various services. Long story short, hopefully sometime in the next year, this blog will be ported over to there. The front page will likely mention the new location once that's a thing (there is no front page currently).

P.P.S. The point of having written this in C is to show how small and useful an LL(1) recursive descent parser can be, even in C. It should not be taken to mean that you should use C for all your parsing needs (unless you need the single codebase to be available in every language, in which case you have no choice). These techniques are applicable to virtually every language, and can result in even smaller parsers.

It is also of note that LL(1) is generally sufficient for most things you may want to do. Ini was picked because, with all the features present here, it is not parsable with regular expressions or string splitting (it is not a regular language), and because the implementation could be kept sufficiently small; if I code golfed it, it could be under 100 LOC, but I didn't want to code golf it. Most programming languages, for instance, are designed in such a way as to be parsable with an LL(k) grammar, which are provably transformable into LL(1) grammars.

 
Read more...

from Cooking with Fire

Dietary alternatives listed as subitems.

Ingredients

  • 75g rice
  • 250g mushrooms, sliced thin
  • Soy sauce
  • 1 onion
    • ½ cup grated carrots
  • 50g canned tuna (optional)
  • 100g peas (optional)
  • 100g sweet corn (optional)

Instructions

  1. Boil the mushrooms in a saucepan until they turn brown. This should take 10~15 min at medium heat.

  2. If using, sauté the onions.

  3. Take the mushrooms out using a skimmer spoon. Cook the rice in the mushroom water.

  4. While the rice is cooking, sauté the mushrooms with the soy sauce for a few minutes. If the tuna is canned in oil, use the oil from the can for this. Note that mushrooms won't really overcook, but they can burn.

  5. Mix all the ingredients (be it in the frying pan, a dish or a bowl) and serve forth.

 
Read more...

from Disaster Drawer

The kerfuffle was starting to wane at the periphery of my hearing, which had been shrinking ever closer to my immediate surroundings after the long hours of travelling from one point to the next, always on a seat that rumbled and wobbled in the unstable weather. My seat neighbour told me it was a common occurrence and that frequent travellers eventually got used to it, but by that point I barely had the energy to nod along without further worsening my sleep-deprivation migraines, let alone say that I couldn't have cared less. All the same, I somehow managed to make it to the other side, barely hanging on as I walked behind what felt like an endless line of people traversing a small, square hallway along the longest path physically possible. I made a vow to myself never to travel again as I held back my tears, trying not to make a scene before dawn. As I stepped outside the border control office, I was greeted by a mellow scent of raspberry that awakened my senses. The sweet aroma filled my nostrils, wrought havoc on my stomach and moistened my aching palate. And then, I heard a voice.

“Thank you for making it all the way here.”

Although my brain was admittedly foggy, ransacking it bore no result; I was certain I had never heard that voice before, even if those words rang a distant bell of days past. While the more cautious side of me was tingling in alarm, such was my exhaustion that I could not bring myself to care about my safety anymore. Whatever happened beyond this point, I told myself as my vision drifted away, was up to the ever-advancing hands of the clock to decide.

The first thing that I felt after that was the faint rumble of a car. I felt cramped, my knees up to my chest and my thighs aching; if I tried to stretch my legs, my feet would hit something firm, giving me no respite. I took a deep breath, and the smell of fabric came to me. Confused, I opened my eyes and found myself lying across someone else’s lap. I tried to spring away from her, but I was firmly held in place.

“Don’t move so suddenly! The car’s moving.”

The car? Am I being kidnapped, after all? I tried to articulate that I had very little money in my possession, but all that came out of my mouth was incoherent blabber. A finger was laid on top of my lips.

“We’ll have time to talk once you’re a little better rested.” “Is it all okay back there, madam?” “Yes, it is. Please forgive me; my significant other just came off a day-long journey.” “Sounds exhausting.” “I didn’t even want to bother with public transport at this point. But it’s okay – because he’s safe now.”

Consciousness drifted away soon after.

 
Read more...

from Cooking with Fire

Dietary alternatives listed as subitems.

Ingredients

  • 75g fusilli
  • 200g spinach, boiled
  • 3 teaspoons aioli
    • Paprika
  • Turmeric and cumin
  • 1 small can of tuna (56g)
    • 100g mushrooms
    • Soy sauce

If mushrooms are used

  1. (Skip if you're using canned mushrooms) Cover the bottom of a frying pan with a thin layer of water.

  2. (Skip if you're using canned mushrooms) Boil the mushrooms in the frying pan until they turn brown. This should take 10~15 min at medium heat.

  3. Sauté the mushrooms with oil and soy sauce.

Instructions

  1. Boil the fusilli in a pot of water. Set aside once cooked.

  2. Sauté the spinach in a pan with salt, oil, turmeric and cumin. If paprika is used, add it now. Stir often. This process should take 5~10 min at medium heat.

  3. Add the canned tuna or the mushrooms to the spinach.

  4. If aioli is used, mix with the fusilli. Add the spinach and serve forth.

 
Read more...

from Disaster Drawer

Golden

It must have been six in the morning when we stopped in front of a roadside café on the outskirts of the city. The moonlight was faint that night (understandably so, with the new moon approaching), and there was no artificial light nearby beyond what our bikes projected (and that only came on while moving), so the café's lights gave off an intense aura reminiscent of a saving angel. It could have been a celestial messenger come to tell us that the whole war had been a bad dream, a tasteless joke that had already gone on too long, but no; it was just a hospitality business, as mundane and trapped in misery as we were.

At least, when I looked up at the sky, the light of the moon was still real. Her glow and her smile may not have been tangible, but that did not make her any less worthy a travelling companion. While I gazed at the sky, I couldn't stop wondering whether Red was looking at it too right now. It was very early, and she slept straight through until ten or eleven (you little tadpole, how I envy you), but I liked to think that at least she was lying facing the window, and that we could have watched the smiling moon together had she woken up.

The war had brought us pain, desolation and confusion, but it had not managed to separate us; it was in these harsh moments that so-called human nature surfaced to remind us once more that we are not made of stone, and that two people can stay together no matter where they are. If this had happened before the war, I would have taken for granted that I would see Red again the moment I walked through the front door of the house and slipped into her room. I could see her red woollen duvet, knitted by grandma, covering her from head to toe and matching her hair… just thinking about it, I feel like I can still see her even after all these years. But the times we live in have taught me not to take so much for granted, to rethink what a decision means, and to weigh once again how much a smile is worth.

That is why I carry a laminated photo of the two of us in my blazer pocket, so that the last thing I see before I die will be her angelic smile.

Purple

I told Golden to wait outside while I went in to take a look, just to make sure it wasn't a dangerous place. The moment I stepped inside, I could feel the presence of countless protective spirits watching over the safety of the owner and of all the customers present, who weren't many to begin with. There wasn't a single corner left unguarded nor a single cup of coffee left unblessed; it was, in short, a place where Golden needed to be.

It was clear who they were, even without using my dowsing tools: they were all the travellers who had had to stop by that café at some point in their journey, making sure with a silent smile that everyone received a good meal and a quiet evening. I have never believed in so-called gypsy signs, but you could say that the watchful spirits guarding a place are the sign of clairvoyants like me.

Have you ever felt a shiver, or a sense of unease, when entering a new place? It could be a warning: a dangerous place where you shouldn't linger for long. But this was not a dangerous place at all; rather, it was a place that protected you from the evils of the outside world.

Perhaps, if Golden stayed there long enough, the watchful souls would be able to give her back the peace her soul had lost. Perhaps, if she stayed long enough, she would even smile sincerely again. Perhaps this would do her more good than the getaways, the landscapes and the sunlight, now that the medical industry has become inaccessible to everyone and we cannot bring her before the hands of a professional.

“You can't even trust roadside bars these days,” I told her as we walked in, “but this place is very different! The big companies lack soul, they ring hollow and leave you feeling off when you walk out, but this place, so small and modest, is going to protect you. I'm sure of it.”

Golden tried to smile. She never believed in these things; even though she tried to take an interest in my hobbies, I always knew it wasn't a real interest. She asked me many, many questions about it, but none of them were interesting questions, and she never wanted to dig deeper into anything. I never said it to her face, since she already has enough problems to deal with, but it hurt me that she wasn't truly interested in these things.

Sometimes I felt we were destined to meet, but we always seemed like girls from different worlds. At the end of the day, I never knew what to think.

Golden

The first thing I ordered was a caffè doppio; not that I was tired, or even particularly fond of coffee, but six hours had passed since the last one I'd had, and I wasn't in the mood to feel any worse. Coffee has become one of the things I drink the most (along with water, to get that bad taste out of my mouth), but at least the day only has 24 hours and I spend 10 of them asleep, so I'm not going to overdose any time soon.

Purple, for her part, was a special specimen, capable of downing a beer at any time and place. Sometimes I wondered whether she was like a plant that swaps sunlight for alcohol, performing ethanosynthesis instead of conventional photosynthesis. I was about to say that to her, but it seemed like such a bad joke that I got embarrassed (and that's already saying a lot).

We moved to a table by a window. There was nothing to see, just night and dust with accents of light; still, we had to make sure nothing happened to our bicycles, because they had become the only means of transport we had left since the war broke out. There were no more buses, no more trains, and all the oil that could still be extracted went to feeding the machines, so finding a self-propelled means of transport was crucial. If everything ever goes back to normal, the world we'll be left with will be decidedly different from the one we have known, though I don't know if I want to live to see what will become of us all.

There were snippets of conversation floating everywhere. Although the only faces I could see were the shopkeeper's and her daughter's, I could count a dozen voices echoing throughout the building. In times past, I would have believed I had succumbed to madness. Now, nothing mattered to me anymore. I sipped my coffee and kept quiet.

“Purple, what do you think has become of our sisters?” “Why are you asking that all of a sudden?” “What do you mean, all of a sudden? I've been asking that the whole trip. Just because you've been ignoring me doesn't mean I haven't been asking, several times over.”

Purple looked down at the table, at the golden glow bouncing off the mahogany. It was hard to tell whether she was tired or melancholic.

“I try not to think about it while I'm awake, to be honest,” she confessed. “It's hard enough to stay sane when you see them in your dreams over and over. I want to go up to them, ask them what's become of them, where they are…” She took a swig. “On the one hand, I know they're not really them, that the real ones are out there; but on the other hand, I'm scared that what I'm seeing is all that's left of them.”

«All that's left of them.»

Purple had always been able to see the voices I can only hear, and she explained to me that they were the souls of people who had once been in this world. She would tell me the stories behind them, as if she could see their entire past at a snap of her fingers, so I would ask her whether she could see Red and Ginger. But she always told me the same thing: «in dreams, I always see them; awake, not yet».

“But if you see them in your dreams, does that mean they're looking for us?”

She said nothing.

 
Read more...

from HellOps

Let's set the scene. I'm taking over an existing, already set up operation. The topology, per-DC, is something like this:

  • Two redundant gateways, talking via heartbeat – if one dies, the other takes over. Both of them running raid1 mdadm ext4 everything.
  • Three database servers as a percona cluster. Data is on raid1 mdadm ext4, but the base OS is on regular ext4. Var is its own partition and is xfs.
  • A bunch of application servers running standalone ext4.
  • A “backup” server whose job it is to talk to the other machines and take backups. Backups stored locally on its own disk and on an external (usb) disk plugged into it at all times. This machine actually ran UEFI, and thus had a FAT32 partition. It had an experimental btrfs partition on top of the standard standalone ext4. External disk also ext4.

Note that this is a relatively low IO use-case (high on network and compute). The XFS var partition on the DBs probably has half the disk IO use of the rack.

There's several racks like this in different enterprise colocation datacenters. All of them have shared ventilation, air conditioning, UPSes, backup generators, etc. As such, for cost saving, all of them are plugged into standard electrical outlets (no in-rack UPS – there's already one handled by the colo!).

One day, there's a huge storm going around. A quarter of the city is already dark, but neither the office nor any of the colocations is. Slowly, more and more of the city's infrastructure goes down (it ended up being closer to ½ by the end of things). Eventually, everything goes dark in the office. As do two of the colos. We decide the power is dead and just go home – it'll come back up. The next day, we check. One of the colocations came back up just fine. The other, however, did not. So grabbing my winter coat (it is very cold in the DC), I head there to look at what's going on.

All of the application servers won't boot. I boot into a rescue system and check – the root partition is dead. On all of them. E2fsck won't even try to repair anything. Okay, let's check the gateways and database servers. Ext4 partitions are dead. Including the raid1 ones. The errors are different across the copies. Well what about the external backup disk? That's just completely dead. Actually fried. Will not even spin up. It was working fine the day before! Some of the drives in general are also fried, mind you – above I was talking about the ones that survived the situation.

I spend the week trying to manually recover data, since e2fsck refused. Things seem to be corrupted at random. For every file I recover, there's one I can't. What's weirder, some of the corrupted files are ones that should not have been experiencing writes at all! I was essentially flying blind (a lot of metadata blocks were also gone) so for every db file I recovered I also recovered something completely useless (like the local cat(1) binary).

At this point, I get curious, asking that DC's administration what even happened. They say a lightning bolt hit the top of the building. Wait, so the shock blew past the UPS, into the servers, frying a bunch of things? How are the motherboards ok? Why didn't the power supplies try to surge protect? I'll never have answers for these questions, though I do know that the PSUs are likely too old to have good protections in place, and the servers did not run on ECC ram (potentially explaining at least some of the corruptions, though far from all of them).

This wasn't that huge of a deal. Databases were recovered from backup (albeit a slightly older one, more on this in a second). The rest of everything just got a clean install from scratch.

What really stood out for me, however, was what survived. The FAT32 EFI partition did! The XFS partitions on the database servers either survived intact, or an fsck recovered them. The experimental btrfs partition on the backup host (the source for the database recovery, a bit older because it wasn't in active use yet) had zero issues whatsoever. If it hadn't survived, an even older copy would have been available from another DC's backup server (they inter-sync).

That day I learned a couple of lessons:

  1. Use logical backups of data that's important – the other kinds may make restoring a machine as a whole faster, but they actively get in the way in most other cases, while also making the backups slower, more cumbersome, and thus less likely to happen often.
  2. Ext4 will eat your data at the slightest provocation, in unpredictable ways.
  3. Lightning strikes will eat your data. Do not trust shared UPS.
  4. Btrfs can survive acts of god (something it has consistently done for me since!). XFS is resistant to acts of god. FAT32 is too dumb to realize that what's in front of it is an act of god, making it similarly resistant for all the wrong reasons.
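
Lesson 1 in practice – a minimal sketch, assuming PostgreSQL purely for illustration (database name and paths are made up; the actual engine in this story doesn't matter):

# a logical backup: a plain SQL dump, independent of the on-disk format,
# readable and restorable piece by piece even when the filesystem is toast
pg_dump mydb | gzip > /backups/mydb-$(date +%F).sql.gz

# versus a copy of the raw data directory: faster to restore wholesale,
# but tied to the exact on-disk layout (and not even consistent unless
# the database is stopped first)
rsync -a /var/lib/postgresql/ /backups/pgdata/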
 
Read more...

from Bunker Labs

I saw people being confused about message queues, so I figured I'd implement one in shell, just for fun. This would also let us explore the common pitfalls in implementing them.

What is a Message Queue?

A message queue is conceptually simple – it's a queue. You know, like the opposite of a stack: you put stuff in, and later you take it out in the same order. The “stuff” is messages. Of course, a single queue has a very limited use-case, so instead you want multiple “named” queues (i.e. “collections”).

Another part of a message queue is how you access it. There is a common type of MQ (Message Queue) called “pub/sub” – in this scheme, the MQ server keeps open connections to all the “subscribers” and sends all of them the message whenever it arrives. The other one is poll-based – the queue keeps each message until it gets explicitly read by a connection, via some sort of “reserve” keyword. This latter type is what we'll be implementing.

So, we have a few basic operations to implement:

  • Write a message to a named queue.
  • Read a message from a named queue.

That's really all there is to it! So let's get to implementing.

Storage Format

We can keep a queue in a flat file. We'll call it "$queue".queue. This allows us to have almost free writes – we just append to the file. Let's not worry about networking for now and write this down in ./lib/push.

# add message from stdin to queue
cat >> "$queue".queue

This has an obvious potential issue: the shell operates line-wise, so what if the message we're writing is longer than a one-liner? We'll run it through the base64 utility so every message is stored as a single line. Note that base64 isn't part of POSIX, but neither is nmap's ncat (which we're going to be using for networking later), with both being extremely common. One caveat: GNU base64 wraps its output at 76 columns by default, so for longer messages you'd want to disable wrapping (GNU's -w 0) while making sure each encoded message still ends with a newline.

We can now rewrite the above like this:

# add message from stdin to queue
base64 >> "$queue".queue

We're still assuming that we're going to get the message via stdin (somehow), and that the queue environment variable will be populated with the queue name.

Still, this storage format is pretty simple – the messages are listed, in order, oldest first, one per line. We can guarantee the “one per line” part because we base64-encode the messages.
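
A quick way to convince yourself of that (the encoded string in the comments is what a standard base64 produces for this input):

# a two-line message goes in and comes out as a single line:
printf 'two\nlines\n' | base64
# dHdvCmxpbmVzCg==

# decoding gets the original back, newlines and all:
printf 'dHdvCmxpbmVzCg==\n' | base64 -d
# two
# lines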

Reading Just One Message

We need a way to pop items off the queue. Since we can guarantee there's only one message per line, popping means taking the first line (the oldest message) and keeping everything else. Let's write a simple utility, ./lib/pop, that prints the topmost message (decoded) and removes it from the queue file.

# print message
head -n1 "$queue".queue | base64 -d
# get remaining messages
tail -n+2 "$queue".queue > "$queue".cut
# move post-cut queue into the true queue
mv "$queue".cut "$queue".queue

This has a few obvious disadvantages – it's full of crash-related race conditions. It does do the job, though, so we'll keep it for now.

Networking

We're going to use nmap's netcat implementation to handle networking for us. Initially, it'll look roughly like so, in ./launch:

#!/bin/sh
export PATH="$(dirname "$0")/lib:$PATH"
while ncat -l 1234 -c handler; do :; done

This will repeatedly make ncat listen on port 1234 (ncat handles a single connection and then exits, hence the loop). Once a connection arrives, it'll run the handler script found in PATH. Stdin of handler will be filled with data from the network pipe, and whatever handler prints to stdout will be sent back. Notably, stderr will not be sent over.

Let's write this handler, then: ./lib/handler:

#!/bin/sh
# first line of the request: a command, then an optional queue name
read -r cmd queue
[ -z "$queue" ] && queue=default
export queue

case "$cmd" in
pop) . pop ;;
push) . push ;;
esac

exit 0

This determines our wire format. The first line sent by the client contains the command, followed by spaces or tabs, followed by an optional queue name. If there is no queue name, we assume the name is “default”. Currently, valid commands are “pop” and “push”, which source our previously written scripts via the . command. Finally, after handling is done, we exit successfully.

If we want to add more commands, we can do it in the case "$cmd" section later.

Trying it Out

Let's launch it! ./launch

We can connect and see how things behave:

ncat localhost 1234
pop
^D # no output

ncat localhost 1234
push
package 1
^D

ncat localhost 1234
push
package 2
^D

ncat localhost 1234
pop
^D # package 1
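
The same exchange can be scripted, too – a sketch assuming nmap's ncat, which half-closes the connection once its stdin hits EOF, so the handler sees EOF as well:

# push "hello world" onto the default queue, then pop it back
printf 'push\nhello world\n' | ncat localhost 1234
printf 'pop\n' | ncat localhost 1234
# hello world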

Well, that's fun, it's already functional! We can improve it, however.

Crash Race Conditions

We're designing a serious MQ, so we need to think about potential failures. What if the process crashes during a transaction!? We should consider this.

If the launcher loop dies, the server is dead – not much surprise there – so we can ignore it. But what if we pop, and crash after sending the data back but before the truncated queue replaces the old one? This is actually relatively likely with larger queues, because we rewrite the entire file every time. Let's fix this by implementing a rudimentary rollback journal (first, without actually using it):

CUT="$queue".cut
OUT="$queue".out
QUEUE="$queue".queue

# determine message to send
head -n1 "$QUEUE" | base64 -d > "$OUT"

# calculate remaining messages
tail -n+2 "$QUEUE" > "$CUT"

# move post-queue into the true queue
cp "$CUT" "$QUEUE"

# send the message
cat "$OUT"

# delete rollback journal
rm "$CUT" "$OUT"

We now have a multipart rollback journal. Say we crashed before sending the message and wanted to manually roll back the transaction. We could do that: write a base64 encoding of $OUT to a file, concatenate that file with $CUT (in that order), and we'd have the old state back.
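
For example, a manual rollback could look roughly like this (a sketch, reusing the same names as ./lib/pop):

# re-encode the message we had taken out...
base64 "$OUT" > "$OUT".b64
# ...put it back in front of the remaining messages...
cat "$OUT".b64 "$CUT" > "$QUEUE"
# ...and clean up
rm "$OUT".b64 "$CUT" "$OUT"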

It bears noting that this is not how rollback journals are typically implemented – usually you make a copy of the data, then operate on the “real” dataset, with a rollback copying the journal back and a commit deleting it. Our non-traditional approach also lets us keep the last transaction around for potential repeating, since we want to avoid dropping any jobs.
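
For comparison, a traditional journal around our pop might look roughly like this (a hypothetical sketch – not what we're actually building):

# snapshot the real data first
cp "$QUEUE" "$QUEUE".journal
# then operate on the real dataset directly
tail -n+2 "$QUEUE".journal > "$QUEUE"
# rollback would be: cp "$QUEUE".journal "$QUEUE"
# commit would be:   rm "$QUEUE".journal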

Of course because we have a queue, the actual state never has to be rolled back. New writes can be added to the state with no problem, and new reads can simply use the rollback journal's data as-is. With this understanding, we can now utilize it:

CUT="$queue".cut
OUT="$queue".out
QUEUE="$queue".queue

# create rollback journal if we don't have one yet
if ! [ -f "$CUT" ]; then
    # this step is idempotent
    head -n1 "$QUEUE" | base64 -d > "$OUT"
    tail -n+2 "$QUEUE" > "$CUT"
fi

# we might have been interrupted last round
# but this is idempotent
# so always do it
cp "$CUT" "$QUEUE"

# finish transaction and delete the rollback journal
cat "$OUT"
rm "$CUT" "$OUT"

Now the operations that take a while (the head, tail, and cp invocations) are guarded by the rollback journal. The only place where corruption can occur is between sending the message over and deleting the rollback journal. Furthermore, the consequence of a crash there would simply be a repeat send of the message (a much less disastrous outcome than dropping a message).

We didn't eliminate the crash race condition per se – we dramatically reduced the odds of it triggering, with only a handful of additional lines of code.

Let's take a similar approach for the push operation, but with a copy-on-write (CoW) write-ahead log (WAL). The idea behind the write-ahead log is that doing a verbatim write is faster than an append with post-processing, and that we can resume the post-processing later if need be. Let's look at what kind of workflow we expect to have:

QUEUE="$queue".queue
WAL="$queue".wal

# we perform a fast write
cat > "$WAL".1

# then we do the processing
base64 "$WAL".1 > "$WAL".2

# and the appending
cat "$QUEUE" "$WAL".2 > "$WAL".3

# then we commit
cp "$WAL".3 "$QUEUE"
rm "$WAL".*

As far as the client is concerned, as long as we do those other steps later, the push is done as soon as $WAL.1 is created. The processing can be done “in the background”, between invocations. Let's write the processor wal first:

QUEUE="$queue".queue
WAL="$queue".wal

# there's a transaction to handle
if [ -f "$WAL".1 ]; then
    [ -f "$WAL".2 ] || base64 "$WAL".1 > "$WAL".2
    # we always repeat this step,
    # in case a read has already changed the queue
    cat "$QUEUE" "$WAL".2 > "$WAL".3
    cp "$WAL".3 "$QUEUE"
    rm "$WAL".*
fi
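
The push script itself then shrinks to just the fast write. A sketch of the new ./lib/push, under the assumption that a second push to the same queue doesn't land before the next checkpoint (it would overwrite the pending WAL – the “potentially missed write” trade-off discussed below):

# record the raw message; the wal processor encodes and
# appends it to the queue at the next checkpoint
WAL="$queue".wal
cat > "$WAL".1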

Now we can call the wal processor as a part of our launcher loop:

#!/bin/sh
export PATH="$PWD/lib:$PATH"
# process any remaining transactions
checkpoint() (
    for queue in *.queue; do
        # skip the literal glob when no queues exist yet
        [ -e "$queue" ] || continue
        queue=$(basename "$queue" .queue)
        . wal
    done
)
checkpoint
while ncat -l 1234 -c handler; do
    # if the handler crashed, we can catch it here
    checkpoint
done

Just as before, we didn't entirely eliminate the crash race condition. After all, the server could crash in the middle of a push, and if we added a notification for completed pushes, that notification could fail to arrive even though the push went through. However, we've significantly reduced the odds of queue corruption, to the point where we don't have to worry about it nearly as much.

Notably, this approach trades potential double-writes for potentially missed or corrupted writes – the opposite of the double-read philosophy we took with pop. It's here mostly to demonstrate how that side of the trade-off is done.

Parallelism

At this point, the server is done – if we're content with a single thread! What if two clients connect simultaneously? Right now, they can't.

I'm done messing around for today, though, so maybe a follow-up (in a branch) will be made to provide parallelism.

You can find this version of the server over on GitHub.

 
Read more...

from Hannah's Tip Corner

After getting lost in the UHS (Universal Hint System) trying to find out what I was missing, I decided to compile a few hints that seem to work for pretty much everything. I have played and watched a couple of point-and-clicks (P&Cs) myself, and I can say that Syberia 1 does things a little differently. For one, there's no “combination” mechanic; you can use everything as-is.

In fact, some people have gone as far as claiming that this is less of a puzzle game and more of an interactive novel. For this reason, if you have the patience to sit through endless conversations and read through documents in search of hints, this is probably what I'd consider the easiest P&C I've played.


Comparing different releases

PC (Steam) release (tested on Fedora 35):

  • Regular Proton crashes; use GE instead.
  • It will only run in 800x600 windowed mode. dgVoodoo seems to do nothing via Wine, and DxWnd crashes as soon as you try to set a higher window size.
  • Recommended to use accessibility tools (you need to read documents to get hints).
  • Includes a complete PDF walkthrough if you open the system files.
  • You need to manually hide the cursor, or it will overlap the in-game cursor. Furthermore, it seems that this can only be achieved on X11.

Switch (Syberia 1 & 2) release:

  • There are two alternative modes — touchscreen and controller. Touchscreen mode highlights all elements you can interact with at every moment. Controller mode scrolls through all elements you can interact with, but it does not show them all at once.
  • Phone will not let you use the number pad (phone numbers are added to your contact list automatically).
  • Long dialogues scroll very fast. There are several languages to choose from, so choose one you can understand by hearing. Languages available: German, English, Spanish, French, Italian, Russian, Polish and Japanese.
  • Autosave every time you do something or move from one screen to the next.
  • Bonus content you unlock as you progress (concept sketches, as far as I've seen so far).

DS release:

  • No voice acting.
  • Menu is displayed on the top screen. Touchscreen replaces the mouse.
  • Number pad is hidden.
  • Cinematics look mostly like they do on PC (it seems to be a direct port, after all).
  • Drag items to use them.
  • On the items menu, drag items below the Read sign (not onto).

Have you covered every conversation topic?

I am the kind of person who tries to get to know every NPC, but I know people who just skip dialogue and skim through. Here, you pretty much need to go out of your way to hit every conversation topic at least once. Not only do they give you hints (duh), but some events are only triggered after hitting the right conversation topics.

Listen to the phone calls, too!

Considering phone calls come from nowhere and disrupt the flow, I wouldn't be surprised if you wanted to ignore them. However, phone calls are also important in the game, as one character will give you a hint you need to use later on.

Maybe you need to check the leaflets again!

I also tend to read every item description and every book, trying to get as much of the lore as possible for extra flavour. This is a common trope (2x duh), but sometimes the tips really are in a leaflet (or a book, or a newspaper cutout, or a-) that you read a long while ago and have forgotten about since.

Talk with the NPCs again! (And listen to them again)

This game seems to be heavy on conversation; you really need to invest yourself in the characters for the plot to progress. Not only does this game lack combination items, it also seems to be lax about item usage altogether (at least early on). And remember – they're giving you hints; you need to listen (3x duh)!

A small puzzle spoiler, not saying which one

There is one puzzle you will only get through after having a back-and-forth conversation with three different parties. You don't need to hit every conversation topic every single time – you will know what you need to choose – but bear this in mind before trying to look for missing items for the umpteenth time.

 
Read more...