GraphQL Glance

This is a quick, simplified glance at the original document (see the references); you can treat it as a brief note. Hope it's helpful to you.

At its core, GraphQL enables declarative data fetching where a client can specify exactly what data it needs from an API. GraphQL is a query language for APIs - not databases.

REST vs GraphQL

• Data fetching: multiple endpoints (REST) vs. a single query (GraphQL)
• Over-fetching and under-fetching (including the n+1 problem): fixed data structures (REST) vs. exactly the data requested (GraphQL)
• Rapid product iteration on the frontend: endpoints must be adjusted whenever data needs change (REST) vs. flexible queries (GraphQL)
• Insightful analytics on the backend: queries reveal fine-grained insight into which data clients actually use
• Benefits of a schema and type system: the type system defines a schema, so frontend and backend teams can work independently without further communication
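To illustrate the "exact data" point above, a client declares precisely the fields it needs in a single query. This is a minimal sketch against a hypothetical schema (the `user` field and its sub-fields are assumptions, not from the original document):

```graphql
query {
  user(id: "42") {
    name
    posts {
      title
    }
  }
}
```

The server responds with only `name` and the `title` of each post, so there is no over-fetching, and nested data arrives in one round trip instead of one request per endpoint.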

GraphQL Elixir Glance

There are several problems with the original docs, so I reproduced this quick glance. It is much simpler and contains only the essentials. Hope it's helpful to you.

Getting Started

Schema-Driven Development

• Define your types and the appropriate queries and mutations for them.
• Implement functions called resolvers to handle these types and their fields.
• As new requirements arrive, go back to step 1 to update the schema, and continue through the other steps.
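As a sketch of step 1, here is what a minimal schema for a links example might look like in GraphQL SDL (the type and field names are illustrative assumptions, not taken from the original tutorial):

```graphql
type Link {
  id: ID!
  url: String!
  description: String
}

type Query {
  allLinks: [Link!]!
}

type Mutation {
  createLink(url: String!, description: String): Link
}
```

Step 2 then pairs each field (`allLinks`, `createLink`) with a resolver function, and step 3 extends this schema as new requirements arrive.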

You should see two items in the links table.

Bert, Pre-training of Deep Bidirectional Transformers for Language Understanding Note

Abstract

BERT obtains deep bidirectional representations of text by jointly conditioning on both left and right context in all layers.

Introduction

• feature-based: e.g. ELMo, which uses the pre-trained representations as additional features
• fine-tuning: e.g. OpenAI GPT, which introduces a small number of task-specific parameters and fine-tunes all parameters on the downstream task

Attention Is All You Need Note

Introduction

The inherently sequential nature of RNNs makes them hard to parallelize (even though factorization tricks and conditional computation bring some improvement). Attention models dependencies without regard to the distance between input and output positions. The Transformer proposed in this paper relies entirely on attention to model the global dependencies between input and output, and improves markedly on both training speed and quality.
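The core operation is the scaled dot-product attention defined in the paper, where $Q$, $K$, $V$ are the query, key, and value matrices and $d_k$ is the key dimension:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right) V
```

Because every position attends to every other position in one matrix multiplication, the path length between any two tokens is constant, which is what removes the sequential bottleneck of RNNs.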

Longest Palindromic Substring (LeetCode 5)

Given a string s, find the longest palindromic substring in s. You may assume that the maximum length of s is 1000.

Example 1:

Input: "babad"
Output: "bab"
Note: "aba" is also a valid answer.

Example 2:

Input: "cbbd"
Output: "bb"
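A common approach runs in O(n²) time and O(1) extra space by expanding around each of the 2n−1 possible palindrome centers. This is a sketch of that technique, not an official solution:

```python
def longest_palindrome(s: str) -> str:
    """Return the longest palindromic substring of s."""
    if not s:
        return ""

    def expand(left: int, right: int) -> tuple[int, int]:
        # Grow the window outward while it remains a palindrome.
        while left >= 0 and right < len(s) and s[left] == s[right]:
            left -= 1
            right += 1
        # Return the last valid bounds as a half-open interval.
        return left + 1, right

    best_lo, best_hi = 0, 1
    for i in range(len(s)):
        # Try an odd-length center (i, i) and an even-length center (i, i + 1).
        for lo, hi in (expand(i, i), expand(i, i + 1)):
            if hi - lo > best_hi - best_lo:
                best_lo, best_hi = lo, hi
    return s[best_lo:best_hi]
```

Each expansion stops as soon as the characters mismatch, so the inner work per center is bounded by the palindrome's length, giving the quadratic total.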

ERNIE Tutorial (Paper Notes + Practical Guide)

ERNIE 2.0 has been released, setting a new SOTA, with many optimizations for Chinese. As an NLP engineer, I think this powerful model is worth understanding in depth, and ideally worth finding a way to apply at work.

ERNIE 2.0 is a continual-learning-based pre-training framework for semantic understanding that builds pre-training tasks incrementally via multi-task learning. In ERNIE 2.0, newly constructed types of pre-training tasks can be seamlessly added to the training framework, enabling continual semantic-understanding learning. Through newly added semantic tasks such as entity prediction, sentence causality judgment, and document sentence-structure reconstruction, the ERNIE 2.0 pre-trained model acquires lexical, syntactic, and semantic information from the training data across multiple dimensions, greatly strengthening its general semantic representation capability.