Wednesday, July 31, 2013

Credit Card Data Storage and PCI Compliance

Some credit card data can be stored by an e-commerce site assuming adequate security measures are implemented.


Data Element                                  Storage Permitted  Protection Required  PCI DSS Req. 3.4

Cardholder Data:
  Primary Account Number (PAN)                Yes                Yes                  Yes
  Cardholder Name                             Yes                Yes (1)              No
  Service Code                                Yes                Yes (1)              No
  Expiration Date                             Yes                Yes (1)              No

Sensitive Authentication Data (2):
  Full Magnetic Stripe Data (3)               No                 N/A                  N/A
  CAV2/CVC2/CVV2/CID                          No                 N/A                  N/A
  PIN/PIN Block                               No                 N/A                  N/A

(1) These data elements must be protected if stored in conjunction with the PAN. This protection should be per PCI DSS requirements for general protection of the cardholder data environment. Additionally, other legislation (for example, related to consumer personal data protection, privacy, identity theft, or data security) may require specific protection of this data, or proper disclosure of a company's practices if consumer-related personal data is being collected during the course of business. PCI DSS, however, does not apply if PANs are not stored, processed, or transmitted.

(2) Sensitive authentication data must not be stored after authorization (even if encrypted).

(3) Full track data from the magnetic stripe, magnetic stripe image on the chip, or elsewhere.

Summary

Cardholder data whose storage is permitted must be rendered unreadable (for example, encrypted) per PCI DSS Requirement 3.4.

Furthermore, the entire system hosting the data store must adhere to PCI requirements.
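To make the encryption requirement concrete, here is a minimal Ruby sketch of encrypting a PAN with AES-256-GCM before it ever touches the data store. This is an illustration, not a compliance recipe: the method names (encrypt_pan, decrypt_pan) are my own, and a real deployment would pull the key from a key-management system rather than generating it in process (see PCI DSS key-management requirements).

```ruby
require 'openssl'
require 'base64'

# For illustration only: a real key must come from a key-management
# system, never live alongside the encrypted data.
KEY = OpenSSL::Random.random_bytes(32)

# Encrypt a PAN with AES-256-GCM; returns the IV, auth tag, and
# ciphertext, all Base64-encoded for storage in text columns.
def encrypt_pan(pan)
  cipher = OpenSSL::Cipher.new('aes-256-gcm')
  cipher.encrypt
  cipher.key = KEY
  iv = cipher.random_iv                      # unique IV per record
  ciphertext = cipher.update(pan) + cipher.final
  { iv:   Base64.strict_encode64(iv),
    tag:  Base64.strict_encode64(cipher.auth_tag),
    data: Base64.strict_encode64(ciphertext) }
end

# Reverse the process; GCM's auth tag rejects tampered ciphertext.
def decrypt_pan(record)
  cipher = OpenSSL::Cipher.new('aes-256-gcm')
  cipher.decrypt
  cipher.key = KEY
  cipher.iv = Base64.strict_decode64(record[:iv])
  cipher.auth_tag = Base64.strict_decode64(record[:tag])
  cipher.update(Base64.strict_decode64(record[:data])) + cipher.final
end
```

Note that encrypting the PAN satisfies only Requirement 3.4; key management, access control, and the rest of the standard still apply to the surrounding system.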

If your site stores credit card information, then your site is a target.

Alternatives include storing credit card information at your payment gateway or using a tokenization service to hold the credit card information on your behalf.
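The idea behind tokenization can be sketched in a few lines of Ruby. This is a toy simulation, not a real provider's API: the TokenVault class and its in-memory Hash stand in for the service's secure vault, and the merchant application stores only the opaque token.

```ruby
require 'securerandom'

# Toy model of a tokenization provider: exchanges a PAN for a random
# token and keeps the mapping in its own vault (here, just a Hash).
class TokenVault
  def initialize
    @vault = {}
  end

  # The merchant sends the PAN once and stores only the returned token.
  def tokenize(pan)
    token = "tok_#{SecureRandom.hex(12)}"
    @vault[token] = pan
    token
  end

  # Only the provider can map a token back to the PAN, e.g. when the
  # merchant later asks it to charge the card.
  def detokenize(token)
    @vault.fetch(token)
  end
end
```

Because the token is random, a breach of the merchant's database yields nothing usable; how much this reduces your PCI scope in practice depends on your assessor and your integration.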

References

https://www.pcisecuritystandards.org/documents/pci_dss_v2.pdf
https://www.pcisecuritystandards.org/documents/pci_ssc_quick_guide.pdf
http://en.wikipedia.org/wiki/Tokenization_(data_security)

Thursday, July 11, 2013

Better Coding Techniques Lower RubyMine's Runtime Memory Requirements

For some coding tasks, I use RubyMine.

RubyMine has great features, but is known for being sluggish and hogging RAM.

Sometimes, tweaking RubyMine's memory configuration file will improve its performance.

RubyMine Memory Configuration

Here are the settings that typically work well for me:

-Xms128m
-Xmx1024m
-XX:MaxPermSize=500m
-XX:ReservedCodeCacheSize=128m
-XX:+UseCodeCacheFlushing
-XX:+UseCompressedOops


If you have installed previous versions of RubyMine, be sure that you're configuring the correct idea.vmoptions file.

Search your file system with something like this:


sudo find / -maxdepth 15 -name "idea.vmoptions" -type f 2>/dev/null


If RubyMine starts to run out of memory, it will warn you with a dialog window that modifies the correct .vmoptions file directly.

And as with just about everything, what you should set yours to depends...

Impact of Coding Techniques

Your coding techniques will also affect your requirements.

For example, if your code is littered with Model.each do ... constructs, then RubyMine will appear to be a memory hog.

However, if you simply replace your Model.each do's with Model.find_each do's then you'll likely see that RubyMine (or whatever Rails runtime you're using) is no longer gobbling up so much of your computer's memory.

The reason is that with the Model.each do ... construct, ActiveRecord runs a single query for all the records and loads the entire result set into memory; whereas User.find_each(:batch_size => number_of_records_in_batch) do ... loads only number_of_records_in_batch records at a time.
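The batching strategy find_each uses can be sketched in plain Ruby, with no database required. In this simulation, the Record struct and fake_query method are stand-ins for an ActiveRecord model and the SQL it generates (WHERE id > last_id ORDER BY id LIMIT batch_size); only one batch of records is ever held at a time.

```ruby
# An in-memory "table" of 1000 rows, ordered by id.
Record = Struct.new(:id)
TABLE = (1..1000).map { |i| Record.new(i) }

# Stand-in for ActiveRecord's batched query:
# WHERE id > last_id ORDER BY id LIMIT batch_size
def fake_query(last_id, batch_size)
  TABLE.select { |r| r.id > last_id }.first(batch_size)
end

# Mimics find_each: pull one batch, yield each record, then ask for
# the next batch keyed on the last id seen.
def find_each(batch_size: 100)
  last_id = 0
  loop do
    batch = fake_query(last_id, batch_size)  # at most batch_size rows in memory
    break if batch.empty?
    batch.each { |record| yield record }
    last_id = batch.last.id
  end
end
```

Calling find_each(batch_size: 100) { |r| ... } still visits all 1000 records, but peak memory is bounded by the batch size instead of the table size, which matches the console output shown below.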

Verification

Don't take my word for it; rather, open up the Rails console and try it out yourself:

Using .find_each


>> User.find_each(:batch_size => 100) do |user| end
  User Load (9.3ms)  SELECT "users".* FROM "users" WHERE ("users"."id" >= 0) ORDER BY "users"."id" ASC LIMIT 100
  User Load (5.3ms)  SELECT "users".* FROM "users" WHERE ("users"."id" > 100) ORDER BY "users"."id" ASC LIMIT 100
  User Load (6.2ms)  SELECT "users".* FROM "users" WHERE ("users"."id" > 200) ORDER BY "users"."id" ASC LIMIT 100
  User Load (4.6ms)  SELECT "users".* FROM "users" WHERE ("users"."id" > 300) ORDER BY "users"."id" ASC LIMIT 100
  User Load (4.0ms)  SELECT "users".* FROM "users" WHERE ("users"."id" > 400) ORDER BY "users"."id" ASC LIMIT 100
  User Load (4.1ms)  SELECT "users".* FROM "users" WHERE ("users"."id" > 500) ORDER BY "users"."id" ASC LIMIT 100
  . . .

Using .each


>> User.all.each do |user| end
  User Load (416.9ms)  SELECT "users".* FROM "users"
  EXPLAIN (0.6ms)  EXPLAIN SELECT "users".* FROM "users"
EXPLAIN for: SELECT "users".* FROM "users"
                               QUERY PLAN
------------------------------------------------------------------------
 Seq Scan on users  (cost=0.00..1253.72 rows=7372 width=2726)
(1 row)


Your first clue that using .each could be a performance bottleneck is that Rails automatically ran EXPLAIN on the query because it exceeded the slow-query threshold (auto_explain_threshold_in_seconds).

So, take it easy on RubyMine and write better code.