Best way to read lines of a text file into a Python dictionary


I have a file of about 15k lines, each line holding 1 key and 1 value. I can modify the file's content if a different formatting makes reading faster. So far I have made the entire file a dict literal and I do an eval() on it. What is the best way to read the file, or is there a better approach I can follow? Please suggest. File mymapfile.txt:

{ 'a':'this', 'b':'that', . . . . 'xyz':'message can have "special" char %s etc ' } 

and on the file I do an eval:

f_read = eval(open('mymapfile.txt', 'r').read()) 

My concern is that the file keeps growing, and the values can contain quotes, special characters, etc., so I need to wrap values in ''' or """. With the dictionary format, if there is a small syntax error anywhere, the eval fails. Is it better to use readlines() without making the file a dict literal and build the dict myself, or is eval faster if I keep the dict in the file? With readlines() I can write one entry per line, split on ':', and need not worry about special characters.
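One middle ground worth noting: `ast.literal_eval` parses the same dict-literal file as `eval()` but only accepts Python literals, so a malicious or malformed value cannot execute code. A minimal sketch (the file name and contents follow the question's example):

```python
import ast

# Write a sample mapping file in the dict-literal format from the question.
with open('mymapfile.txt', 'w') as f:
    f.write("""{'a': 'this', 'b': 'that', 'xyz': 'message can have "special" char %s etc'}""")

# ast.literal_eval only parses literals (dicts, strings, numbers, ...)
# and raises ValueError/SyntaxError on anything else -- no code execution.
with open('mymapfile.txt', 'r') as f:
    mapping = ast.literal_eval(f.read())
```

A syntax error in the file still raises an exception, so this does not solve the fragility concern, only the safety one.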

file for readlines():

a:this
b:that
.
.
xyz:message can have "special" char %s etc

@mahesh24's answer returns a set of the values, not a dict, and the variable name overwrites the builtin dict. I would rather use these 2 lines:

s = {i.strip() for i in open('ss.txt', 'r').readlines()}
d = {i.split(':')[0]: i.split(':')[1] for i in s}

d is the dict with the values read in. With a bit of thinking this could probably be a one-liner. I am pretty sure there is a CSV reader in the Python standard library that would give you more options and robustness. If your data is in some other standard format, using the appropriate standard library would be preferable. The above 2-liner is a quick and dirty way of doing it. You can change ':' to commas or whatever separator your data has.
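The standard-library reader mentioned above is the `csv` module; it accepts a custom delimiter, so the colon-separated file from the question can be loaded directly. A sketch (file name and sample rows are taken from the question's example):

```python
import csv

# Write sample key:value lines in the format from the question.
with open('ss.txt', 'w', newline='') as f:
    f.write('a:this\nb:that\nxyz:message can have "special" char %s etc\n')

# csv.reader with delimiter=':' splits each row into [key, value];
# quotes in the middle of an unquoted field are kept verbatim.
with open('ss.txt', 'r', newline='') as f:
    reader = csv.reader(f, delimiter=':')
    d = {row[0]: row[1] for row in reader}
```

Note that both this and the split(':') approach break if a value itself contains the separator; split(':', 1) or quoting the values in the file would be needed then.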
