c# - Replacement for the .NET Dictionary -


Given (simplified description):

One of our services holds a lot of instances in memory, about 85% of them unique. We need very fast key-based access to these items; they are queried very often within a single stack/call. This single context is extremely performance-optimized.

So we started putting them into a Dictionary, and performance was OK.

Accessing the items as fast as possible is the most important thing in our case. We ensured that no write operations occur while reads are happening.

The problem:

In the meantime we have hit the limit on the number of items a Dictionary can store:

Die Arraydimensionen haben den unterstützten Bereich überschritten.
   bei System.Collections.Generic.Dictionary`2.Resize(Int32 newSize, Boolean forceNewHashCodes)
   bei System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)

which translates to: "The array dimensions have exceeded the supported range."

Solutions such as memcached would be too slow in this specific case. It is an isolated, specific use case encapsulated in a single service.

So we are looking for a replacement for Dictionary for this specific scenario.

Currently we can't find one that supports this. Are we missing something? Can anyone point us to one?

As an alternative, if none exists, we are thinking about implementing one ourselves.

We thought of two possibilities: build one from scratch, or wrap multiple dictionaries.

Wrapping multiple dictionaries

When an item is looked up, we would take the key's hash code and use its first digits as an index into a list of wrapped dictionaries. Although this seems easy, it smells to me, because it means the hash code is calculated twice (once by us, once by the inner dictionary), and this scenario is really performance-crucial.
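A minimal sketch of the wrapping idea described above, with hypothetical names (`ShardedDictionary`, `ShardFor` are not from the original post). It routes each key to one of several inner dictionaries by a few bits of its hash code; note the hash is indeed computed twice, once here and once inside the inner `Dictionary`:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: a fixed, power-of-two number of inner
// dictionaries, selected by the low bits of the key's hash code.
public class ShardedDictionary<TKey, TValue>
{
    private readonly Dictionary<TKey, TValue>[] _shards;
    private readonly int _mask;

    public ShardedDictionary(int shardCountPowerOfTwo = 16)
    {
        _shards = new Dictionary<TKey, TValue>[shardCountPowerOfTwo];
        for (int i = 0; i < _shards.Length; i++)
            _shards[i] = new Dictionary<TKey, TValue>();
        _mask = _shards.Length - 1; // valid only for a power-of-two length
    }

    private Dictionary<TKey, TValue> ShardFor(TKey key)
        // Clear the sign bit, then keep the low bits as the shard index.
        => _shards[(key.GetHashCode() & int.MaxValue) & _mask];

    public void Add(TKey key, TValue value) => ShardFor(key).Add(key, value);

    public bool TryGetValue(TKey key, out TValue value)
        => ShardFor(key).TryGetValue(key, out value);
}
```

The wrapper adds one extra `GetHashCode` call and one array index per lookup; whether that overhead matters should be measured against the real workload.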

I know that exchanging the base type Dictionary is the absolute last resort, and I want to avoid it. But it looks like there is no way to make the objects more unique, to get Dictionary-level performance out of a database, or to save the performance somewhere else.

I'm aware of "beware of premature optimization", but lower performance would badly hit the business requirements behind this.

Before I finished reading your question, the simple "multiple dictionaries" approach came to my mind, and you know that solution already. I'm assuming you are hitting the maximum number of items a dictionary can hold, not some other limit.

I would go for it. I do not think you should be worried about computing the hash twice. If the keys are somehow long and computing their hash is a time-consuming operation (which I doubt, but can't be sure since you did not describe the keys), you do not need to feed the whole key to your own hash function. Pick whatever part is cheap to process in your own hashing and distribute the items based on that.

The one thing you need to make sure of here is an even spread of items among the multiple dictionaries. How hard that is to achieve depends on what the keys are. If they were random numbers, you could just use the first byte and be fine (unless you need more than 256 dictionaries). If they are not random numbers, you have to think about the distribution within their domain and write your first-level hash function in a way that achieves that even distribution.
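The "first byte of a random key" idea can be sanity-checked empirically. A small sketch (the keys and shard count here are made up for illustration, since the original post does not describe the real keys):

```csharp
using System;

// Hypothetical sketch: for random numeric keys, routing by the
// lowest byte spreads items roughly evenly across up to 256 shards.
class SpreadCheck
{
    static int ShardIndex(int key, int shardCount)
        => (key & 0xFF) % shardCount; // use only the first byte of the key

    static void Main()
    {
        const int shardCount = 16;
        var counts = new int[shardCount];
        var rng = new Random(42);
        for (int i = 0; i < 100_000; i++)
            counts[ShardIndex(rng.Next(), shardCount)]++;

        // With random keys, every shard should hold roughly 1/16 of the items.
        foreach (var c in counts)
            Console.WriteLine(c);
    }
}
```

For non-random keys (e.g. sequential IDs or strings with common prefixes), run the same kind of count over a sample of real keys before committing to a cheap partition function.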
