Hello
I have had some success with my problem, but there are a few concerns.
What I did:-

1. Installed Raspbian on the Pi (B+).
2. Made it a WiFi hotspot using hostapd and udhcpd.
3. Installed the Squid3 proxy server (apt-get install squid3).
4. Managed the routing through iptables so that all traffic goes through the proxy.
5. Made the proxy transparent.
6. Wrote a Python script to analyse the Squid log (access.log).
7. The log analyzer counts the bytes used by each IP and dumps them into an SQLite db (a rough sketch of that script is shown after this list).
8. WiFi clients who exceed the limit are put into one ACL file, blockedip.
9. From squid.conf I use the blockedip ACL list to restrict their access.
10. After setting up a cron job I am able to block clients after a particular amount of data usage.
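For reference, here is a simplified sketch of the idea behind the analyzer script (not my exact script). It assumes the default "squid" log format, where the client IP is the 3rd field and the reply size in bytes is the 5th field; the paths and the 50 MB limit are placeholders.

    #!/usr/bin/env python
    # Sketch of steps 6-8: sum bytes per client IP, keep totals in SQLite,
    # and rewrite the blockedip ACL file with clients over the limit.
    import sqlite3

    ACCESS_LOG = '/var/log/squid3/access.log'   # placeholder paths
    DB_PATH = '/home/pi/usage.db'
    BLOCKED_ACL = '/etc/squid3/blockedip'
    LIMIT_BYTES = 50 * 1024 * 1024              # placeholder limit

    # Sum the bytes served to each client IP in the current access.log.
    usage = {}
    with open(ACCESS_LOG) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 5:
                continue
            ip, size = fields[2], fields[4]
            usage[ip] = usage.get(ip, 0) + int(size)

    # Add the new counts to the totals kept in SQLite (assumes the log is
    # rotated/truncated after each run so lines are not counted twice).
    db = sqlite3.connect(DB_PATH)
    db.execute('CREATE TABLE IF NOT EXISTS usage (ip TEXT PRIMARY KEY, bytes INTEGER)')
    for ip, new_bytes in usage.items():
        row = db.execute('SELECT bytes FROM usage WHERE ip = ?', (ip,)).fetchone()
        total = (row[0] if row else 0) + new_bytes
        db.execute('INSERT OR REPLACE INTO usage (ip, bytes) VALUES (?, ?)', (ip, total))
    db.commit()

    # Rewrite the blockedip ACL file with every client over the limit.
    over_limit = [r[0] for r in db.execute('SELECT ip FROM usage WHERE bytes > ?', (LIMIT_BYTES,))]
    with open(BLOCKED_ACL, 'w') as acl:
        for ip in over_limit:
            acl.write(ip + '\n')
    db.close()

The cron job then runs this script and reloads Squid so the updated blockedip ACL takes effect.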
My Concerns:-
1. I am not able to bypass the one domain abc.co.in from the proxy; requests to it should not appear in access.log.
(HTTPS requests may not be recorded by Squid anyway.)
I have tried these rules to bypass the URL:
    acl abclan dstdomain .abc.co.in
    http_access allow abclan
    always_direct allow abclan
and this:

    acl abclan dstdomain .abc.co.in
    cache deny abclan
    cache allow all
but neither worked, and access.log still shows abc.co.in.
Note: the site abc.co.in is hosted on the Pi itself with an Apache server, and I have used DNS binding so it can be reached on the LAN.
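One thing I am considering but have not verified on the Pi: http_access and cache control access and caching, not logging, so the domain may have to be excluded on the access_log directive itself (Squid 3 accepts ACLs on that line), roughly like this:

    acl abclan dstdomain .abc.co.in
    # log everything except requests matching abclan (untested on my setup)
    access_log /var/log/squid3/access.log squid !abclan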
2. How can I increase the size of access.log so the logs are kept for a longer time, given that the Pi has little storage?
Log recycling is the answer; try to push these logs to a different computer on the network or to the cloud.
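A minimal sketch of that, assuming the Debian squid3 package paths (the remote host and destination directory below are placeholders, and rsync over SSH needs key-based login for cron):

    # squid.conf: keep up to 5 rotated copies on the Pi
    logfile_rotate 5

    # crontab entry: rotate nightly, then push the newest rotated log off the Pi
    0 2 * * * /usr/sbin/squid3 -k rotate && rsync -az /var/log/squid3/access.log.0 user@backuphost:/srv/squid-logs/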
Sounds cool. Is there any other alternative for pushing the data to a cloud server apart from rsync?