A JavaScript helper function for you to use
3 points • by EGreg • 6 months ago
Got a function that fetches some values asynchronously from somewhere? Consider wrapping it in this and making it a super-function.<p>Here is what it does for every function you wrap with it. In my experience, these features are very helpful, and the wrapper also gives you a single place to hook into and extend later (for example, to handle batching transparently):<p>Memoizes async getters: Call a function with the same arguments and it returns the cached result instead of recomputing.<p>Handles in-flight deduping: If multiple parts of your app call the same getter while it's still working, only one request is sent. The rest wait on the same promise.<p>Throttles concurrency: You can limit how many calls to your getter run in parallel. Useful for APIs, disk I/O, or anything rate-sensitive.<p>Supports custom caching backends: Pass any object with get, set, delete, and has. Works with a Map, an LRU, or your own cache logic.<p>Optional LRU eviction: If you pass a plain Map, it is upgraded to an LRU with a maximum size. The least recently used items are evicted when it is full.<p>Handles callbacks and Promises: Wraps traditional callback-style async functions but gives you a modern Promise-based interface.<p>Smart-ish keying: Builds a cache key by stringifying non-function arguments. Works well for most everyday use cases.<p>Supports manual eviction: Call getter.forget(...args) to remove specific entries, or getter.force(...args) to bypass the cache for one call.<p>Allows custom preparation logic: You can pass a prepare() function to clone or process cached results before they are used.<p><pre><code> function createGetter(fn, {
cache = new Map(),
cacheSize = 100, // Used only if cache is a Map
throttleSize = Infinity,
prepare,
callbackIndex,
resolveWithFirstArgument = false
} = {}) {
const inFlight = new Map();
let activeCount = 0;
const queue = [];
// Wrap Map in a simple LRU if needed
if (cache instanceof Map) {
const rawMap = cache;
const lru = new Map();
cache = {
get(key) {
if (!rawMap.has(key)) return undefined;
const value = rawMap.get(key);
lru.delete(key);
lru.set(key, true); // Mark as most recently used
return value;
},
set(key, value) {
rawMap.set(key, value);
        lru.delete(key); // Map.set keeps an existing key's position, so delete first to refresh recency
        lru.set(key, true);
if (rawMap.size > cacheSize) {
const oldest = lru.keys().next().value;
rawMap.delete(oldest);
lru.delete(oldest);
}
},
      delete(key) {
        lru.delete(key);
        return rawMap.delete(key); // report whether an entry was actually removed
      },
has(key) {
return rawMap.has(key);
}
};
}
function makeKey(args) {
return JSON.stringify(args.map(arg => (typeof arg === 'function' ? 'ƒ' : arg)));
}
  function execute(context, args, key, resolve, reject) {
    let skipCache = false;
    // A regular function, not an arrow: `arguments` must refer to the
    // callback's own arguments (err, result, ...) so they can be cached
    // and replayed on later hits.
    const callback = function (err, result) {
      processNext(); // free the concurrency slot on error as well as success
      if (err) return reject(err);
      const cbArgs = Array.from(arguments);
      if (!skipCache) cache.set(key, [context, cbArgs]);
      if (prepare) prepare.call(null, context, cbArgs);
      resolve(resolveWithFirstArgument && context !== undefined ? context : result);
    };
    if (callbackIndex != null) args.splice(callbackIndex, 0, callback);
    else args.push(callback);
    if (fn.apply(context, args) === false) {
      skipCache = true; // returning false opts this call out of the cache
      cache.delete(key);
    }
  }
function processNext() {
activeCount--;
if (queue.length && activeCount < throttleSize) {
const next = queue.shift();
activeCount++;
execute(...next);
}
}
const getter = function (...args) {
return new Promise((resolve, reject) => {
const context = this;
const key = makeKey(args);
if (cache.has(key)) {
const [cachedContext, cachedArgs] = cache.get(key);
if (prepare) prepare.call(null, cachedContext, cachedArgs);
return resolve(resolveWithFirstArgument && cachedContext !== undefined ? cachedContext : cachedArgs[1]);
}
if (inFlight.has(key)) {
return inFlight.get(key).then(resolve, reject);
}
const promise = new Promise((res, rej) => {
if (activeCount < throttleSize) {
activeCount++;
execute(context, args.slice(), key, res, rej);
} else {
queue.push([context, args.slice(), key, res, rej]);
}
});
      inFlight.set(key, promise);
      // Chain .then onto the promise returned by .finally so that it is
      // consumed and its rejection can never surface as unhandled.
      promise.finally(() => {
        inFlight.delete(key);
      }).then(resolve, reject);
});
};
getter.forget = (...args) => {
const key = makeKey(args);
inFlight.delete(key);
return cache.delete(key);
};
getter.force = function (...args) {
getter.forget(...args);
return getter.apply(this, args);
};
return getter;
}</code></pre>
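The plain-Map-to-LRU upgrade above leans on the fact that a JavaScript Map iterates its keys in insertion order, and that deleting and re-inserting a key moves it to the back. Here is a standalone sketch of that trick (the `lruWrap` name is mine, not from the post):<p><pre><code> // Minimal LRU over a plain Map: a companion `lru` Map tracks recency
 // through insertion order (delete + set moves a key to the back).
 function lruWrap(rawMap, maxSize) {
   const lru = new Map();
   return {
     get(key) {
       if (!rawMap.has(key)) return undefined;
       lru.delete(key);
       lru.set(key, true); // most recently used goes to the back
       return rawMap.get(key);
     },
     set(key, value) {
       rawMap.set(key, value);
       lru.delete(key);
       lru.set(key, true);
       if (rawMap.size > maxSize) {
         const oldest = lru.keys().next().value; // front = least recently used
         rawMap.delete(oldest);
         lru.delete(oldest);
       }
     }
   };
 }

 const cache = lruWrap(new Map(), 2);
 cache.set('a', 1);
 cache.set('b', 2);
 cache.get('a');              // touch 'a' so 'b' becomes the oldest
 cache.set('c', 3);           // over capacity: evicts 'b'
 console.log(cache.get('b')); // undefined
 console.log(cache.get('a')); // 1</code></pre><p>The same delete-then-set move is why reads in the wrapper refresh an entry's position: without the delete, Map.set on an existing key would leave it where it was first inserted.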