php - Multiple-filters real-time implementation with MySQL


I'm developing a PHP application for a real-estate portal, and I have a MySQL database with a properties table. The table has 500,000 rows and 20 columns for property features. Assume each feature is an integer.

Examples of features:

  • Number of rooms in the apartment (1-10)
  • Type of building (1-20)
  • Condition of the building (1-10)

I need to implement a web page with a multi-filter real-time navigation panel. The idea is that users can select several features equal to particular values. For example: 1-2 rooms ("1", "2"), a building built in 196x ("6"), state average or excellent ("3", "4", "5"), in a particular city.

The key requirement is the ability for users to see the number of matching properties next to each feature filter, taking the already-selected filters into account.

This is an example of how it should look:

rooms: 1[x], 2[x], 3[ ] (15000 more), 4[ ] (10000 more)
state: bad[ ] (1000 more), average[x], excellent[x]
year:  1950[ ] (19000), 1960[ ] (20000), 1970[ ] (18000)
city:  a[ ] (25000), b[ ] (18000), c[ ] (30000)
price: <100000[ ] (20000), >100000[ ] (30000)

the "[ ]" stands empty checkbox (i.e. filter value not selected), , "[x]" stands checked checkbox (i.e. filter value selected).

When a user selects a particular feature value, let's say city=london, the numbers should change because they are now limited by the previous selections:

rooms: 1[x], 2[x], 3[ ] (5000 more), 4[ ] (5000 more)
state: bad[ ] (1000 more), average[x], excellent[x]
year:  1950[ ] (19000), 1960[ ] (20000), 1970[ ] (18000)
city:  a[x], b[ ] (+4000), c[ ] (+3000)
price: <100000[ ] (5000), >100000[ ] (6000)

I have tried to use the following SQL query for each feature (X):

SELECT featureX, COUNT(*) AS num
FROM properties
WHERE selectedFeature1 = val1
  AND selectedFeature2 = val2
  ...
  AND selectedFeatureN = valN
GROUP BY featureX;

However, the set of these queries takes several seconds, and it needs to work in real-time, i.e. <200 ms on the backend.

I have also tried to keep the whole table in shared memory, but unserialize() of an array of 500,000 records takes about 1 s.

The solution I have found is to develop a standalone application (for example in Node.js) that keeps the data in memory as an array of objects and provides an API to the main PHP application. The app would refresh its data from the MySQL database every minute.

However, before starting to implement it, I want to ask: is there a MySQL-based solution for this kind of task?

If not, is there a purely PHP solution?

Do you have any generic recommendations on how to approach this kind of task?

Not everything can be solved in the database.

You must look at all 500K rows. It is unlikely that you can index more than a small fraction of the possible queries. So... I suggest you keep the entire dataset in RAM and have a fast program (e.g. in C++) run through all of the rows.

The data should be (in MySQL terms) TINYINT UNSIGNED, i.e. unsigned char. The data can then be stored in 20 bytes (1 byte per feature), right? That's 10MB in C++, 30-100MB in MySQL, 400MB in PHP. MySQL could store the data in BINARY(20) to simplify fetching, at the expense of inserts/updates.
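To make the layout concrete, here is a minimal C++ sketch: 20 one-byte features per row, plus a single pass that produces the per-value counts for every feature at once. The bitmask filter representation and the "fails exactly one filter" trick are my own way of doing the faceted counts, not something from your question, and the sketch assumes feature values fit in 0..63:

#include <array>
#include <cstdint>
#include <vector>

// One property = 20 single-byte features (TINYINT UNSIGNED in MySQL,
// uint8_t here), i.e. 20 bytes per row, ~10MB for 500,000 rows.
constexpr int kFeatures = 20;
using Property = std::array<uint8_t, kFeatures>;

// One bitmask per feature: 0 = no filter on that feature; otherwise
// bit v is set when value v is selected. Assumes values are 0..63.
using Filter = std::array<uint64_t, kFeatures>;
using Counts = std::array<std::array<uint32_t, 64>, kFeatures>;

// counts[f][v] = how many rows would match if value v of feature f were
// (also) selected, given all the other selected filters. A row passing
// every filter counts toward every feature; a row failing exactly one
// filter counts only toward that feature (it would match if that filter
// also included the row's value); a row failing 2+ filters counts nowhere.
void countFacets(const std::vector<Property>& rows,
                 const Filter& filter, Counts& counts) {
    for (auto& perFeature : counts) perFeature.fill(0);
    for (const Property& row : rows) {
        int failed = -1, failures = 0;
        for (int f = 0; f < kFeatures; ++f) {
            if (filter[f] != 0 && !(filter[f] & (1ULL << row[f]))) {
                failed = f;
                if (++failures > 1) break;   // 2+ failures: skip the row
            }
        }
        if (failures == 0) {
            for (int f = 0; f < kFeatures; ++f) ++counts[f][row[f]];
        } else if (failures == 1) {
            ++counts[failed][row[failed]];
        }
    }
}

A full scan like this streams 500,000 x 20 bytes linearly through memory with almost no branching, which is what makes the 200 ms budget plausible.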

use mysql "source of truth" periodically load processing engine (you suggested 1 minute , node.js). focus on optimizing counting.

Most languages these days are 'interpretive', hence a fair amount of overhead. C and C++ are two of the few that run at 'machine' level. I'm pretty sure they can do 10M calculations (500K rows x 20 features) in under 200 ms; I doubt an interpretive language can.
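Continuing the sketch above, a small harness to measure a full recount on your own hardware; the timing will of course vary by machine:

#include <chrono>
#include <cstdio>
#include <random>

int main() {
    // 500,000 rows of random feature values in 0..9.
    std::vector<Property> rows(500000);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> dist(0, 9);
    for (Property& p : rows)
        for (uint8_t& v : p) v = static_cast<uint8_t>(dist(rng));

    Filter filter{};                        // all features unfiltered...
    filter[0] = (1ULL << 1) | (1ULL << 2);  // ...except "rooms in {1,2}"

    Counts counts;
    auto t0 = std::chrono::steady_clock::now();
    countFacets(rows, filter, counts);
    auto t1 = std::chrono::steady_clock::now();
    std::printf("full recount: %lld ms\n", static_cast<long long>(
        std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count()));
}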

